Yongtian He
Researcher at University of Houston
Publications - 17
Citations - 1496
Yongtian He is an academic researcher at the University of Houston. He has contributed to research on brain–computer interfaces and human gait. He has an h-index of 11 and has co-authored 16 publications receiving 845 citations.
Papers
Journal ArticleDOI
Deep learning for electroencephalogram (EEG) classification tasks: a review.
TL;DR: Practical suggestions on the selection of many hyperparameters are provided in the hope that they will promote or guide the deployment of deep learning to EEG datasets in future research.
Journal ArticleDOI
Brain-machine interfaces for controlling lower-limb powered robotic systems.
Yongtian He, David Eguren, José M. Azorín, Robert G. Grossman, Trieu Phat Luu, Jose L. Contreras-Vidal +6 more
TL;DR: Lower-body powered exoskeletons with automated gait intention detection based on BMIs open new possibilities in assistance and rehabilitation. However, current performance, clinical benefits, and several key challenges indicate that additional research and development is required before these systems can be deployed in the clinic and at home.
Journal ArticleDOI
Powered exoskeletons for bipedal locomotion after spinal cord injury
Jose L. Contreras-Vidal, Nikunj A. Bhagat, Justin A. Brantley, Jesus G. Cruz-Garza, Yongtian He, Quinn Manley, Sho Nakagome, Kevin Nathan, Su H Tan, Fangshi Zhu, Jose L Pons +10 more
TL;DR: A systematic review of the literature shows that a majority of current studies focus on thoracic-level injury and emphasize ambulatory-related primary outcome measures.
Journal ArticleDOI
Risk management and regulations for lower limb medical exoskeletons: a review.
TL;DR: There is a need to raise awareness of probable risks associated with the use of powered exoskeletons and to develop adequate countermeasures, standards, and regulations for these human-machine systems.
Journal ArticleDOI
Gait adaptation to visual kinematic perturbations using a real-time closed-loop brain-computer interface to a virtual reality avatar.
TL;DR: The results demonstrate the feasibility of using a closed-loop BCI to learn to control a walking avatar under normal and altered visuomotor perturbations, which involved cortical adaptations.