Proceedings ArticleDOI

Autonomous camera based eye controlled wheelchair system using raspberry-pi

19 Mar 2015-pp 1-6
TL;DR: A novel eye-controlled, independent, and cost-effective wheelchair system is implemented that allows a disabled person to control the wheelchair without assistance from other people.
Abstract: A novel technique is implemented for an eye-controlled, independent, and cost-effective wheelchair system. The purpose of the eye-movement-controlled electric wheelchair is to eliminate the assistance a disabled person would otherwise require, giving them the opportunity for an independent, accessible life. The implemented system allows the disabled person to control the wheelchair without help from other people. In this system, control of the wheelchair is carried out based on eye movements. A camera mounted in front of the user captures an image of one eye (either left or right) and tracks the position of the eye pupil using image processing techniques. According to the position of the eye, the wheelchair motors are directed to move left, right, or forward. In addition, for safety, an ultrasonic sensor mounted at the front of the wheelchair detects obstacles and automatically stops the wheelchair's movement. To keep the system cost effective, a Raspberry Pi board runs the system without a display unit.
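As a rough illustration of the pipeline the abstract describes (pupil localisation in the eye image, mapping the pupil position to a left/right/forward command, and an obstacle-triggered stop), the following Python/OpenCV sketch is one possible minimal implementation. The dark-blob pupil detector, the 0.35/0.65 position thresholds, and the obstacle flag are illustrative assumptions, not the authors' actual code.

```python
# Minimal sketch: map the horizontal pupil position in an eye image to a
# wheelchair direction command. Thresholds and the crude dark-blob pupil
# detector are illustrative assumptions, not the paper's exact pipeline.
import cv2

def pupil_centre(eye_gray):
    """Return (x, y) of the largest dark blob, taken here as the pupil."""
    blur = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, mask = cv2.threshold(blur, 40, 255, cv2.THRESH_BINARY_INV)  # pupil is dark
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def direction_command(eye_gray, obstacle_detected):
    """Translate pupil position into LEFT / RIGHT / FORWARD, or STOP on obstacle."""
    if obstacle_detected:          # ultrasonic sensor reading, checked first
        return "STOP"
    centre = pupil_centre(eye_gray)
    if centre is None:
        return "STOP"              # eye closed or pupil not found
    x_ratio = centre[0] / eye_gray.shape[1]
    if x_ratio < 0.35:
        return "LEFT"
    if x_ratio > 0.65:
        return "RIGHT"
    return "FORWARD"
```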
Citations
Journal ArticleDOI
08 Apr 2021
TL;DR: This research article presents an eye-tracking electric wheelchair (EWC) that achieves better accuracy and minimizes response delay, reaching an accuracy of about 90% with the lowest response time among the compared existing methods.
Abstract: In this fast-paced world, it is very challenging for elderly and disabled people to move independently to the places they wish to go at any convenient time. Fortunately, some of them have good eyesight and are physically strong enough to take care of themselves. Nevertheless, an electric wheelchair (EWC) can provide them a better lifestyle with commendable confidence. At the same time, hand-, head-, and voice-recognition-based EWCs face many limitations, whereas an eye-tracking-based EWC brings greater smartness to their lifestyle. This research article discusses achieving better accuracy and minimizing the delay in response time in the proposed system. The proposed eye-tracking EWC differs from other existing systems through well-validated controller parameters, and it introduces edge detection to identify the position of the eye pupil in the face. The proposed method includes a PID controller to control the DC motor, which in turn controls the rotation of the wheels of the EWC. This research article focuses mainly on cost-effectiveness and improvement in system performance. The display system is mounted in front of the seated EWC user. The camera captures the eye pupil position, which determines the direction of EWC movement by controlling the DC motor with the help of the PID controller. When derivative (D) control is used in the proposed system, the system response is considerably faster, reducing the delay between the user's action and the system's reaction. The pupil position is determined by a Canny edge detector, which provides good results compared with other edge detection approaches. Object detection in front of the EWC is an added advantage of the proposed system. The article integrates all these activities and measures the overall system performance. The proposed model achieves an accuracy of about 90%, and its response time is the lowest compared with the existing methods.
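The abstract above attributes the fast response to a PID controller, and especially its derivative term, driving the DC motors. The sketch below shows a generic discrete PID loop of that kind; the gains, sample time, and normalised speed values are placeholder assumptions rather than the article's tuned parameters.

```python
# Illustrative discrete PID controller of the kind the article describes for
# driving the EWC's DC motors; gains and the motor interface are assumptions.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt   # D term speeds up the response
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a wheel speed toward the speed demanded by the gaze direction.
pid = PID(kp=1.2, ki=0.5, kd=0.05, dt=0.02)
command = pid.update(setpoint=0.8, measurement=0.55)  # normalised speeds (assumed units)
```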

46 citations


Cites background from "Autonomous camera based eye control..."

  • ...This EWC can move in 4 directions with a limited direction of the rotation of the wheel in the chair [21]....


Journal ArticleDOI
TL;DR: A system that aids people with motor disabilities by restoring their ability to move effectively and effortlessly, without having to rely on others, using an eye-controlled electric wheelchair; it also provides low-cost tools for organizations assisting wheelchair users.
Abstract: In the 34 developed and 156 developing countries, there are about 132 million disabled people who need a wheelchair, constituting 1.86% of the world population. Moreover, there are millions of people suffering from diseases related to motor disabilities, which cause an inability to produce controlled movement in any of the limbs or even the head. The paper proposes a system to aid people with motor disabilities by restoring their ability to move effectively and effortlessly, without having to rely on others, utilizing an eye-controlled electric wheelchair. The system input was images of the user's eye that were processed to estimate the gaze direction, and the wheelchair was moved accordingly. To accomplish such a feat, four user-specific methods were developed, implemented, and tested, all of which were based on a benchmark database created by the authors. The first three techniques were automatic, employed correlation, and were variants of template matching, while the last one uses convolutional neural networks (CNNs). Different metrics to quantitatively evaluate the performance of each algorithm in terms of accuracy and latency were computed, and an overall comparison is presented. The CNN exhibited the best performance (i.e. 99.3% classification accuracy), and thus it was the model of choice for the gaze estimator, which commands the wheelchair motion. The system was evaluated carefully on eight subjects, achieving 99% accuracy in changing illumination conditions outdoors and indoors. This required modifying a motorized wheelchair to adapt it to the predictions output by the gaze estimation algorithm. The wheelchair control can bypass any decision made by the gaze estimator and immediately halt its motion with the help of an array of proximity sensors, if the measured distance goes below a well-defined safety margin.
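Three of the four gaze-estimation methods described above are correlation-based variants of template matching. The sketch below illustrates that general idea with OpenCV's normalised cross-correlation; the template filenames, label set, and score threshold are assumptions for illustration, not the authors' benchmark setup. In practice, each user-specific template would presumably be cropped from calibration images of that particular user.

```python
# Sketch of correlation-based template matching for gaze classification,
# in the spirit of the first three methods described above. The template
# files and score threshold are placeholders, not the authors' data.
import cv2

TEMPLATES = {
    "LEFT": cv2.imread("left_gaze_template.png", cv2.IMREAD_GRAYSCALE),
    "RIGHT": cv2.imread("right_gaze_template.png", cv2.IMREAD_GRAYSCALE),
    "FORWARD": cv2.imread("forward_gaze_template.png", cv2.IMREAD_GRAYSCALE),
}

def classify_gaze(eye_gray, min_score=0.6):
    """Return the gaze label whose template correlates best with the eye image."""
    best_label, best_score = None, min_score
    for label, template in TEMPLATES.items():
        if template is None:        # template image missing on disk
            continue
        result = cv2.matchTemplate(eye_gray, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(result)
        if score > best_score:
            best_label, best_score = label, score
    return best_label   # None means "no confident match": keep the chair halted
```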

25 citations


Cites methods from "Autonomous camera based eye control..."

  • ...In addition, safety parameters of the wheelchair’s movement, such as ultrasound or IR sensors for obstacle detection, were not discussed. Another wheelchair control system has been proposed in [35], where positions of the eye pupil were tracked by employing image processing techniques using a Raspberry-Pi board and a motor drive to steer the chair to left, right, or forward directions....


Proceedings ArticleDOI
01 Dec 2017
TL;DR: A wheelchair system that can be controlled entirely with eye movements and blinks, using deep convolutional neural networks for classification, and demonstrating a significant performance improvement over traditional image processing algorithms.
Abstract: Traditional wheelchair control is very difficult for people suffering from quadriplegia, who are hence mostly restricted to their beds. Other alternatives include electroencephalography (EEG)-based and electrooculography (EOG)-based automatic wheelchairs, which use electrodes to measure neuronal activity in the brain and eye, respectively. These are expensive and uncomfortable, and are almost impossible to procure for someone from a less-developed economy. We present a wheelchair system that can be completely controlled with eye movements and blinks, using deep convolutional neural networks for classification. We have developed a working prototype, based on only a small video camera and a microprocessor, that shows upwards of 99% accuracy. We also demonstrate a significant improvement in performance over traditional image processing algorithms for the same task. This will allow such patients to be more independent in their day-to-day lives and significantly improve their quality of life at an affordable cost.
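A compact convolutional network along these lines could perform the eye-movement/blink classification the abstract describes. The architecture below is only an illustrative PyTorch sketch; the input size, layer widths, and five-class label set are assumptions, not the authors' published network.

```python
# A small CNN of the kind described above for classifying eye images into
# movement/blink commands. Layer sizes and the five-class label set are
# assumptions for illustration, not the authors' architecture.
import torch
import torch.nn as nn

class EyeCommandNet(nn.Module):
    """Classifies a 64x64 grayscale eye crop into LEFT/RIGHT/FORWARD/BLINK/NEUTRAL."""
    def __init__(self, num_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage: logits = EyeCommandNet()(torch.randn(1, 1, 64, 64))
```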

21 citations


Cites methods from "Autonomous camera based eye control..."

  • ...The most popular image processing method for tracking eyeball uses Hough transform [2], [3]....


  • ...A different approach is to capture eye images with a headset as in [2]....


Journal ArticleDOI
15 Jul 2020-Sensors
TL;DR: In this paper, the authors proposed a system to aid people with motor disabilities by restoring their ability to move effectively and effortlessly without having to rely on others utilizing an eye-controlled electric wheelchair.
Abstract: In the 34 developed and 156 developing countries, there are ~132 million disabled people who need a wheelchair, constituting 1.86% of the world population. Moreover, there are millions of people suffering from diseases related to motor disabilities, which cause inability to produce controlled movement in any of the limbs or even head. This paper proposes a system to aid people with motor disabilities by restoring their ability to move effectively and effortlessly without having to rely on others utilizing an eye-controlled electric wheelchair. The system input is images of the user's eye that are processed to estimate the gaze direction, and the wheelchair is moved accordingly. To accomplish such a feat, four user-specific methods were developed, implemented, and tested; all of which were based on a benchmark database created by the authors. The first three techniques were automatic, employ correlation, and were variants of template matching, whereas the last one uses convolutional neural networks (CNNs). Different metrics to quantitatively evaluate the performance of each algorithm in terms of accuracy and latency were computed and an overall comparison is presented. CNN exhibited the best performance (i.e., 99.3% classification accuracy), and thus it was the model of choice for the gaze estimator, which commands the wheelchair motion. The system was evaluated carefully on eight subjects achieving 99% accuracy in changing illumination conditions outdoors and indoors. This required modifying a motorized wheelchair to adapt it to the predictions output by the gaze estimation algorithm. The wheelchair control can bypass any decision made by the gaze estimator and immediately halt its motion with the help of an array of proximity sensors, if the measured distance goes below a well-defined safety margin. This work not only empowers any immobile wheelchair user, but also provides low-cost tools for organizations assisting wheelchair users.

10 citations

Book ChapterDOI
01 Jan 2018
TL;DR: The developed model uses an eye-tracking technique based on the circular Hough transform algorithm to control the movement of the wheelchair, and is also designed to detect obstacles using ultrasonic sensors.
Abstract: In today’s world, the number of people suffering from various disabilities is rising, most concerningly quadriplegics (people who are unable to move around on their own). To enhance their confidence and independence, we have developed an effective alternative solution. The developed model uses an eye-tracking technique based on the circular Hough transform algorithm to control the movement of the wheelchair. The camera is mounted in line with the patient's eye and captures continuous snapshots, which are processed with image processing techniques in real time and, in turn, control the direction of movement. Along with controlling the motion of the wheelchair, the model is also designed to detect obstacles using ultrasonic sensors.
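Since this chapter relies on the circular Hough transform to locate the pupil (the same technique noted in the excerpts above), a minimal OpenCV sketch of that step is given below. All parameter values are illustrative and would need tuning for the actual camera and eye geometry.

```python
# Minimal circular-Hough-transform pupil localisation; parameters are
# illustrative assumptions, not values from the chapter.
import cv2
import numpy as np

def find_pupil_hough(eye_gray):
    """Return (x, y, r) of the strongest circle, assumed to be the pupil/iris."""
    blurred = cv2.medianBlur(eye_gray, 5)
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
        param1=60,   # Canny high threshold used internally
        param2=25,   # accumulator threshold: lower finds more (weaker) circles
        minRadius=8, maxRadius=40,
    )
    if circles is None:
        return None
    x, y, r = np.around(circles[0, 0]).astype(int)
    return x, y, r
```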

9 citations

References
Journal ArticleDOI
TL;DR: The smart chair, a smart wheelchair with intelligent controllers, lets people with physical disabilities overcome these difficulties; by outfitting the wheelchair with cameras, a laser range finder, and onboard processing, it gives the user an adaptable, intelligent control system.
Abstract: Nearly five million individuals in the US have limited arm and hand movement, making it difficult or impossible for them to use computers and products with embedded computers, such as wheelchairs, household appliances, office electronic equipment, and robotic aids. Although some current wheelchair systems have embedded computers, they have very little computer control and require precise, low-level control inputs from the user; interfaces are similar to those found in passenger cars. The rider must continuously specify the chair's direction and, in some cases, velocity using a joystick-like device. Unfortunately, many users who could benefit from powered wheelchairs lack these fine motor skills. For instance, those with cerebral palsy might not be able to guide a chair through a narrow opening, such as a doorway, without repeatedly colliding into the sides. These types of physically challenging environments can be frustrating and require a lot of user effort. At the University of Pennsylvania's general robotics, automation, sensing, and perception lab, we developed the smart chair, a smart wheelchair with intelligent controllers that lets people with physical disabilities overcome these difficulties. By outfitting the wheelchair with cameras, a laser range finder, and onboard processing, we give the user an adaptable, intelligent control system. A computer-controlled wheelchair's shared control framework allows users complete control of the chair while ensuring their safety.

123 citations


"Autonomous camera based eye control..." refers background in this paper

  • ...Sometimes a totally paralysed person may find it very difficult to use that type of system....


Journal ArticleDOI
TL;DR: An electric wheelchair controlled by gaze direction and eye blinking is proposed, and an emergency stop is generated when the electric wheelchair user does not focus their gaze consistently in any direction for a specified time.
Abstract: We propose an electric wheelchair controlled by gaze direction and eye blinking. A camera is set up in front of the wheelchair user to capture image information. The sequentially captured images are interpreted to obtain the gaze direction and eye-blinking properties. The gaze direction is expressed by the horizontal angle of the gaze, and this is derived from the triangle formed by the centers of the eyes and the nose. The gaze direction and eye blinking are used to provide direction and timing commands, respectively. The direction command relates to the direction of movement of the electric wheelchair, and the timing command relates to the time when the wheelchair should move. The timing command with an eye-blinking mechanism is designed to generate ready, backward movement, and stop commands for the electric wheelchair. Furthermore, to move at a certain velocity, the electric wheelchair also receives a velocity command as well as the direction and timing commands. A disturbance observer-based control system is used to control the direction and velocity. For safety purposes, an emergency stop is generated when the electric wheelchair user does not focus their gaze consistently in any direction for a specified time. A number of simulations and experiments were conducted with the electric wheelchair in a laboratory environment.
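The abstract derives the horizontal gaze angle from the triangle formed by the eye centres and the nose. The snippet below sketches one plausible way to compute such an angle from those three landmark points; the coordinate convention and the use of the eye-midpoint-to-nose vector are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch of the geometric idea described above: derive a horizontal gaze angle
# from the triangle formed by the two eye centres and the nose. Point formats
# and the angle convention are illustrative assumptions.
import math

def horizontal_gaze_angle(left_eye, right_eye, nose):
    """Angle (degrees) of the nose relative to the midpoint between the eyes;
    roughly 0 when the face/gaze is centred, negative left, positive right."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    dx = nose[0] - mid_x
    dy = nose[1] - mid_y
    return math.degrees(math.atan2(dx, dy))   # deviation from the vertical axis

# Example: horizontal_gaze_angle((120, 90), (180, 92), (155, 140)) ~= 5.8 degrees
```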

104 citations

Journal ArticleDOI
TL;DR: An Internet-based teaching and experiment system for control engineering (ITESCE) that provides students with online course material, a simulator, an online control experiment using an arm robot, and the ability to store and search simulation and experimental results.
Abstract: e-Learning engineering courses with online experiments are attracting a great deal of attention because of the flexibility they provide in both teaching and learning. This paper has described an Internet-based teaching and experiment system for control engineering (ITESCE) that provides students with online course material, a simulator, an online control experiment using an arm robot, and the ability to store and search simulation and experimental results. To implement the functions required by the course and to facilitate connection to the Internet, the ITESCE is based on a standard browser/server architecture with three layers and employs multithreading, Java applets, and Java database connectivity. Background control subsystems handle the real-time control of experiments, and a network server handles communication with clients and with background control subsystems. A database stores simulation and experimental results. The course covers a variety of control methods, and students can try them out through online simulations and experiments. To enhance realism, a Web camera takes a video of an experiment and streams it to a student's PC in real time.

60 citations


"Autonomous camera based eye control..." refers background in this paper

  • ...According to the eye pupil movements, the distance will vary....


01 Jan 2010
TL;DR: An application of an eye-tracking method to wheelchairs using the coherence algorithm, which tracks eye movement to the left or right; the eye-blinking feature is also used by the algorithm to control the starting and stopping of the wheelchair.
Abstract: This paper proposes a new algorithm, the coherence algorithm, for eye-movement tracking. Researchers continue to restore some of the functions lost by handicapped people using powered electrical and robotic wheelchairs. This paper presents an application of the eye-tracking method to such wheelchairs. The coherence algorithm tracks the movement of the eye towards the left or right. The eye-blinking feature is also used by the algorithm to control the starting and stopping of the wheelchair.
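The coherence algorithm itself is not reproduced here, but the abstract's idea of using blinks to start and stop the wheelchair can be sketched generically: if the dark pupil blob disappears for a few consecutive frames, treat it as a deliberate blink and toggle the motion state. The thresholds and frame counts below are illustrative assumptions.

```python
# Not the paper's coherence algorithm; a generic sketch of blink-based
# start/stop control based on the pupil blob disappearing for a few frames.
import cv2

BLINK_FRAMES = 3          # frames the pupil must be absent to count as a blink

def pupil_visible(eye_gray, dark_thresh=40, min_area=30):
    _, mask = cv2.threshold(eye_gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) >= min_area for c in contours)

class BlinkToggle:
    """Flips a moving/stopped flag each time a blink is detected."""
    def __init__(self):
        self.closed_frames = 0
        self.moving = False

    def update(self, eye_gray):
        if pupil_visible(eye_gray):
            if self.closed_frames >= BLINK_FRAMES:
                self.moving = not self.moving     # blink completed: toggle state
            self.closed_frames = 0
        else:
            self.closed_frames += 1
        return self.moving
```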

55 citations


"Autonomous camera based eye control..." refers methods in this paper

  • ...To automatically find and track the eye pupil, many computer vision and image processing routines are used, such as object detection, motion detection, image colour conversion, edge detection, pattern matching, etc....


Journal Article
TL;DR: A hybrid method for eye localization in facial images that combines techniques utilising colour, edge, and illumination cues to improve accuracy and remove false eye regions.
Abstract: This paper proposes a hybrid method for eye localization in facial images. The novelty is in combining techniques that utilise colour, edge and illumination cues to improve accuracy. The method is based on the observation that eye regions have dark colour, a high density of edges, and low illumination compared to other parts of the face. The first step in the method is to extract connected regions from facial images using colour, edge density, and illumination cues separately. Some of the regions are then removed by applying rules that are based on the general geometry and shape of eyes. The remaining connected regions obtained through these three cues are then combined in a systematic way to enhance the identification of the candidate regions for the eyes. The geometry- and shape-based rules are then applied again to further remove false eye regions. The proposed method was tested using images from the PICS facial images database. The proposed method has 93.7% and 87% accuracies for initial blob extraction and final eye detection, respectively.
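A rough sketch of the cue-combination step described above: build one binary mask per cue (dark colour, high edge density, low illumination) and intersect them to keep candidate eye regions, before any geometry or shape rules are applied. The thresholds and window sizes are illustrative assumptions, not the paper's tuned values.

```python
# Illustrative cue combination for eye-region candidates: dark colour,
# high edge density, and low local illumination masks are intersected.
import cv2
import numpy as np

def eye_candidate_mask(face_gray):
    # Cue 1: dark pixels (eyes are darker than most of the face).
    dark = cv2.threshold(face_gray, 70, 255, cv2.THRESH_BINARY_INV)[1]

    # Cue 2: high edge density (eye regions are highly textured).
    edges = cv2.Canny(face_gray, 50, 150)
    edge_density = cv2.blur(edges.astype(np.float32), (15, 15))
    dense = (edge_density > 20).astype(np.uint8) * 255

    # Cue 3: low local illumination (mean brightness in a neighbourhood).
    illum = cv2.blur(face_gray.astype(np.float32), (15, 15))
    dim = (illum < 90).astype(np.uint8) * 255

    # Keep only regions supported by all three cues; geometry/shape rules
    # would then prune the remaining connected components.
    return cv2.bitwise_and(cv2.bitwise_and(dark, dense), dim)
```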

37 citations


"Autonomous camera based eye control..." refers methods in this paper

  • ...One of them is the Haar-cascade-like feature detection algorithm, used to detect single or multiple faces and to detect both eyes [3]....

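The excerpt above mentions Haar-cascade-based face and eye detection; the standard OpenCV usage of the bundled cascades is shown below as a minimal example, not as the paper's own implementation.

```python
# Standard OpenCV Haar-cascade face and eye detection, shown as a minimal
# example of the technique the excerpt refers to.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eyes(frame_bgr):
    """Return eye bounding boxes (x, y, w, h) found inside detected faces."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eyes.append((fx + ex, fy + ey, ew, eh))
    return eyes
```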