Open Access Journal Article

Eye movement based electronic wheel chair for physically challenged persons

TLDR
This paper uses an optical eye-tracking system to control a powered wheelchair: the user's eye movements are translated into commands, and a microprocessor converts them into the signals that drive the wheels and thus the overall movement.
Abstract
A powered wheelchair is a mobility aid for persons with moderate to severe physical disabilities or chronic diseases, as well as for the elderly. To accommodate different disabilities, various interfaces have been developed for powered wheelchair control, such as joystick control, head control and sip-and-puff control. Many people with disabilities, however, are unable to operate a powered wheelchair through these interfaces, and the proposed model is a possible alternative. In this paper, we use an optical eye-tracking system to control a powered wheelchair. The user's eye movements are translated into a screen position by the optical eye-tracking system. When the user looks in the appropriate direction, the computer input system sends a command to the software based on the angle of rotation of the pupil: when the user moves the eyes up, the wheelchair moves forward; looking left moves it left and looking right moves it right; in all other cases the wheelchair stops. Once the image has been processed, control passes to the second part, the microprocessor. The microprocessor takes a USB output from the laptop and converts it into signals that are sent to the wheelchair wheels for movement. Pressure and object-detection sensors are also connected to the microprocessor to provide the feedback needed for proper operation of the wheelchair system. The final part of the project is the wheelchair itself. The rear wheels provide forward motion, and the front two wheels are used for steering left and right. All four wheels are connected to the microprocessor, which sends the signals that control the wheels and thus the overall movement.
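As a concrete illustration of the control flow described above, the following Python sketch shows one way the gaze-direction-to-command mapping and the USB link to the microprocessor could look. It is not the authors' implementation: the screen thresholds, the serial port name and baud rate, and the one-byte command protocol are all assumptions made for this example.

    # Illustrative sketch (not the paper's code): map a gaze point reported by an
    # optical eye tracker to a wheelchair command and forward it over a USB serial
    # link to the microprocessor. Thresholds, port name, and the one-byte command
    # protocol are assumptions made for this example.

    import serial  # pyserial; the paper only states that a USB link is used

    # Command bytes the microprocessor firmware is assumed to understand.
    COMMANDS = {"FORWARD": b"F", "LEFT": b"L", "RIGHT": b"R", "STOP": b"S"}


    def classify_gaze(x, y, screen_w=1920, screen_h=1080, margin=0.25):
        """Translate a gaze point (in screen pixels) into a movement command.

        Looking toward the top of the screen maps to FORWARD, toward the left or
        right edges maps to LEFT/RIGHT, and anywhere else maps to STOP, mirroring
        the up/left/right/stop behaviour described in the abstract.
        """
        if y < screen_h * margin:
            return "FORWARD"
        if x < screen_w * margin:
            return "LEFT"
        if x > screen_w * (1.0 - margin):
            return "RIGHT"
        return "STOP"


    def send_command(port, command):
        """Write the single-byte command to the microprocessor over USB serial."""
        port.write(COMMANDS[command])


    if __name__ == "__main__":
        # Port name and baud rate are placeholders for the actual USB device.
        with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
            # In the real system these points would stream from the eye tracker.
            for gaze_x, gaze_y in [(960, 100), (100, 540), (1800, 540), (960, 700)]:
                send_command(port, classify_gaze(gaze_x, gaze_y))

In the actual system the gaze point would stream from the eye-tracking software, and the microprocessor firmware would combine the received command with the pressure and object-detection sensor readings before driving the wheels.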


Citations
Journal Article

Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies

TL;DR: The results of this study indicate that ML and IoT are important aspects in evolving eye tracking applications owing to their ability to learn from existing data, make better decisions, be flexible, and eliminate the need to manually re-calibrate the tracker during the eye tracking process.
Journal Article

Simulation of Eye Tracking Control based Electric Wheelchair Construction by Image Segmentation Algorithm

TL;DR: This article reports improved accuracy and reduced response delay in the proposed eye-tracking EWC, which achieves an accuracy of about 90% and the lowest response time compared with existing methods.
Journal Article

Navigation-synchronized multimodal control wheelchair from brain to alternative assistive technologies for persons with severe disabilities

TL;DR: The results revealed that the prototype BCW could be operated in either of the proposed modes, and the proposed navigation system had a flexible design that could be interfaced with other assistive technologies.
Proceedings Article

Gaze-based, Context-aware Robotic System for Assisted Reaching and Grasping

TL;DR: In this paper, a multi-modal system consisting of different sensing, decision-making and actuating modalities is presented to assist people with movement disabilities in reaching for objects, grasping them, and using them to interact with other objects.
Journal Article

An Intelligent and Low-cost Eye-tracking System for Motorized Wheelchair Control

TL;DR: A system that aids people with motor disabilities by restoring their ability to move effectively and effortlessly, without having to rely on others, using an eye-controlled electric wheelchair, and that provides low-cost tools for organizations assisting wheelchair users.
References
Journal Article

A tutorial on hidden Markov models and selected applications in speech recognition

TL;DR: In this paper, the authors provide an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and give practical details on methods of implementation of the theory along with a description of selected applications of HMMs to distinct problems in speech recognition.
Proceedings Article

How iris recognition works

TL;DR: Algorithms developed by the author for recognizing persons by their iris patterns have now been tested in many field and laboratory trials, producing no false matches in several million comparison tests.
Proceedings Article

uWave: Accelerometer-based personalized gesture recognition and its applications

TL;DR: This work evaluates uWave using a large gesture library with over 4000 samples collected from eight users over an extended period of time, for a vocabulary of eight gesture patterns identified by Nokia research, and shows that uWave achieves 98.6% accuracy, competitive with statistical methods that require significantly more training samples.
Proceedings Article

Evaluation of eye gaze interaction

TL;DR: Two experiments are presented that compare an interaction technique for object selection based on where a person is looking with the most commonly used selection method, the mouse, and find that the eye gaze interaction technique is faster than selection with a mouse.
Proceedings Article

Gesture recognition with a Wii controller

TL;DR: The design and evaluation of a sensor-based gesture recognition system is presented; it allows users to train arbitrary gestures, which can then be recalled for interacting with systems such as photo browsing on a home TV.