Author

Satoshi Sasaki

Bio: Satoshi Sasaki is an academic researcher from Meirin College. The author has contributed to research in topics: Controller (computing) & Rehabilitation engineering. The author has an h-index of 2 and has co-authored 3 publications receiving 6 citations.

Papers
Journal ArticleDOI
TL;DR: The results indicated that if users set the antenna beside their cheek, the remote controller would work well and suggested the possibility that this mouthpiece type remote controller system would be effective for severely disabled people.
Abstract: A variety of operation devices have been developed for severely disabled people, such as those with cervical cord injuries and/or muscular dystrophies. Each device has its own merits and demerits, but there is still a need to develop other types of operation devices. In this study, we tried to develop a mouthpiece type remote controller to operate an electric powered wheelchair. This remote controller is inserted into the user's mouth and operated with the tongue. It has passive RFID transponders and no battery. To evaluate the performance of this system, the MCR (Maximum Communication Range) was measured. The results indicated that if users set the antenna beside their cheek, the remote controller would work well. In this preliminary study, the operativeness of the remote controller was suitable. With this system, we succeeded in operating a commercially available electric powered wheelchair. These results suggest that this mouthpiece type remote controller system could be effective for severely disabled people.

4 citations

Journal ArticleDOI
TL;DR: The possibility that the I-to-AS system would be effective as an assistive tool for severely disabled people is suggested.
Abstract: We have tried to develop an Integrated Tongue Operation Assistive System, "I-to-AS". The "I-to-AS" is an assistive control system for seriously disabled people that operates powered wheelchairs (PWC), Augmentative and Alternative Communication devices, and other environment control devices. In our preliminary study, the "I-to-AS" had only a mouthpiece type remote controller as an input device, so we developed an intraoral mini-joystick as a secondary input device. It has a 3-axis force sensor to detect the direction and magnitude of the operational force, and a 1-axis force sensor to measure the user's biting pressure. For safety, the system works only while normal pressure is applied to this sensor; it will not work while abnormally low or high pressure is applied. To investigate the maneuverability of the intraoral mini-joystick, five able-bodied participants drove a PWC using it. The driving times of operation by tongue and by normal joystick were compared. The driving time by tongue was 38.0 ± 6.1 s, about 40% slower than with the normal joystick, but still short enough, and the PWC was easy to control. These results suggest that this system could be effective as an assistive tool for severely disabled people.
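The biting-pressure interlock described in the abstract above can be sketched as a simple gating function; the pressure band and all names below are illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch of the biting-pressure safety interlock: the joystick
# command passes through only while bite pressure stays inside a "normal"
# band. The thresholds below are illustrative, not from the paper.

NORMAL_MIN_N = 1.0   # assumed lower bound of normal biting force [N]
NORMAL_MAX_N = 15.0  # assumed upper bound of normal biting force [N]

def gated_command(bite_force_n, joystick_xy):
    """Return the joystick command while bite force is in the normal band,
    otherwise a stop command."""
    if NORMAL_MIN_N <= bite_force_n <= NORMAL_MAX_N:
        return joystick_xy
    return (0.0, 0.0)  # abnormally low or high pressure: stop the wheelchair
```

A band rather than a single threshold matches the abstract's description: both abnormally low pressure (mouthpiece dropped or loose) and abnormally high pressure (spasm, clenching) disable the output.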

Cited by
Journal ArticleDOI
TL;DR: The research described herein was undertaken to develop and test a novel tongue interface based on classification of tongue motions from the surface electromyography signals of the suprahyoid muscles detected at the underside of the jaw; the interface requires no sensor installed in the mouth cavity.
Abstract: The research described herein was undertaken to develop and test a novel tongue interface based on classification of tongue motions from the surface electromyography (EMG) signals of the suprahyoid muscles detected at the underside of the jaw. The EMG signals are measured via 22 active surface electrodes mounted onto a special flexible boomerang-shaped base. Because of the sensor's shape and flexibility, it can adapt to the underjaw skin contour. Tongue motion classification was achieved using a support vector machine (SVM) algorithm for pattern recognition, where the root mean square (RMS) features and cepstrum coefficient (CC) features of the EMG signals were analyzed. The effectiveness of the approach was verified with a test for the classification of six tongue motions conducted with a group of five healthy adult volunteer subjects who had normal motor tongue functions. Results showed that the system classified all six tongue motions with a high accuracy of 95.1 ± 1.9%. The proposed method for control of assistive devices was evaluated using a test in which a computer simulation model of an electric wheelchair was controlled using six tongue motions. This interface system, which weighs only 13.6 g and has a simple appearance, requires no installation of any sensor into the mouth cavity. Therefore, it does not hinder user activities such as swallowing, chewing, or talking. The number of tongue motions is sufficient for the control of most assistive devices.

27 citations
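One of the feature sets used in the paper above, the per-channel RMS of an EMG window, can be sketched in a few lines; the array shapes and toy data are assumptions for illustration, and the cepstrum and SVM stages are omitted:

```python
import numpy as np

def rms_features(emg_window):
    """Per-channel root mean square of an EMG window of shape
    (n_channels, n_samples); returns one feature per channel."""
    return np.sqrt(np.mean(emg_window ** 2, axis=1))

# Toy two-channel window: each channel has constant magnitude,
# so its RMS equals that magnitude.
w = np.array([[1.0, -1.0, 1.0, -1.0],
              [2.0, 2.0, -2.0, -2.0]])
features = rms_features(w)
print(features)  # [1. 2.]
```

In the paper, feature vectors like this (together with cepstrum coefficients) are what the SVM classifier is trained on, one vector per analysis window.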

Proceedings ArticleDOI
03 Jul 2013
TL;DR: A real-time method for tongue movement estimation is introduced, based on analysis of the surface electromyography signals from the suprahyoid muscles, whose usual function is to open the mouth and to control the position of the hyoid, the base of the tongue.
Abstract: In this study, we introduce a real-time method for tongue movement estimation based on the analysis of the surface electromyography (EMG) signals from the suprahyoid muscles, whose usual function is to open the mouth and to control the position of the hyoid, the base of the tongue. Nine surface electrodes were affixed to the underside of the jaw, and their signals were processed via a multi-channel EMG system. The features of the EMG signals were extracted using a root mean square (RMS) method. The dimensionality of the feature variables was further reduced from 108 to 10 by applying Principal Component Analysis (PCA). The reduced feature vectors were associated with the tongue movements using an artificial neural network. Results showed that the proposed method allows precise estimation of the tongue movements. For the test data set, the identification rate was greater than 97% and the response time was less than 0.7 s. The proposed method could be implemented to facilitate novel approaches for alternative communication and control of assistive technology for supporting the independent living of people with severe quadriplegia.

15 citations
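The 108-to-10 dimensionality reduction step described in the abstract above can be sketched with a plain SVD-based PCA; the random data here are placeholders, and the neural-network stage is omitted:

```python
import numpy as np

def pca_reduce(X, n_components=10):
    """Project rows of X (one feature vector per window) onto the
    top principal components of the data."""
    Xc = X - X.mean(axis=0)                       # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T               # scores in the reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 108))   # 50 toy windows of 108 EMG features
Z = pca_reduce(X, n_components=10)
print(Z.shape)  # (50, 10)
```

Reducing 108 correlated channel features to 10 components both speeds up the downstream neural network and limits overfitting, which matters for a real-time interface with a response-time budget under 0.7 s.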

Proceedings ArticleDOI
15 Dec 2011
TL;DR: By building a neural network that estimates deglutition, yawning, and mouth opening, and by introducing mask processing that reduces the estimation error for voluntary tongue movement by more than 95%, the study suggests that the signal of voluntary tongue movement alone can be precisely extracted from EMG signals obtained from the underside of the jaw.
Abstract: With attention to voluntary tongue motion, which is capable of communicating the intentions of a person with a disability, we estimated the position and contact force of the tongue simultaneously using EMG signals from the underside of the jaw. We affixed a multi-channel electrode array with nine electrodes to the underside of the jaw. Then, deriving many EMG signals using monopolar leads, we calculated 36 (= 9C2) channel EMG signals between every pair of the nine electrodes. Associating these EMG signals with tongue movement using a neural network, we confirmed our ability to estimate the tongue position and contact force with precision, with a correlation coefficient greater than 0.9 and an RMS error less than 10%. Furthermore, by building a neural network that estimates deglutition, yawning, and mouth opening, which are potential sources of false estimation, and by introducing mask processing that reduces the estimation error for voluntary tongue movement by more than 95%, we suggest that the signal of that movement alone can be precisely extracted from EMG signals obtainable from the underside of the jaw.

11 citations
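The derivation of 36 channels from nine monopolar electrodes in the abstract above is simply every unordered electrode pair (9C2 = 36); a minimal sketch with placeholder toy signals:

```python
from itertools import combinations

def bipolar_channels(monopolar):
    """Form one difference channel for every unordered pair of electrodes.
    monopolar: list of per-electrode sample lists."""
    return {(i, j): [a - b for a, b in zip(monopolar[i], monopolar[j])]
            for i, j in combinations(range(len(monopolar)), 2)}

# Nine toy electrode traces, four samples each (electrode e ramps from e).
signals = [[e + k for k in range(4)] for e in range(9)]
chans = bipolar_channels(signals)
print(len(chans))  # 36 difference channels from 9 electrodes
```

Each difference channel behaves like a bipolar lead between two electrode sites, which is why nine physical electrodes yield 36 distinct EMG signals for the neural network to draw on.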
