Proceedings ArticleDOI

Eye-controlled mouse cursor for physically disabled individual

TL;DR: A novel algorithm for controlling the movement of a computer screen cursor using the iris movement is presented, which enables physically disabled individuals to control the computer cursor movement to the left, right, up and down.
Abstract: This paper presents a novel algorithm for controlling the movement of a computer screen cursor using the iris movement. By accurately detecting the position of the iris in the eye and mapping that to a specific position on the computer screen, the algorithm enables physically disabled individuals to control the computer cursor movement to the left, right, up and down. The algorithm also enables the person to open and close folders or files or applications through a clicking mechanism.
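The core mapping step described in the abstract, from a detected iris position to a screen coordinate, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the iris-detection stage is assumed to have already produced an iris centre and an eye bounding box, and the function name and linear mapping are our own.

```python
# Hedged sketch: linearly map a detected iris centre within the eye
# bounding box to an absolute screen coordinate. Iris detection itself
# (e.g. via a cascade detector) is assumed and not shown.

def iris_to_screen(iris_x, iris_y, eye_box, screen_w, screen_h):
    """Map the iris centre inside the eye box (x, y, w, h) to a screen pixel."""
    ex, ey, ew, eh = eye_box
    # Normalised position of the iris within the eye region (0..1).
    nx = (iris_x - ex) / ew
    ny = (iris_y - ey) / eh
    # Clamp so the cursor stays on screen even with noisy detections.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))

# An iris at the centre of a 40x20 eye box maps near the screen centre.
print(iris_to_screen(120, 60, (100, 50, 40, 20), 1920, 1080))
```

In practice the cursor would then be moved with an OS-level API; smoothing the mapped position over a few frames reduces jitter from detection noise.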
Citations
Proceedings ArticleDOI
01 Jan 2019
TL;DR: A system model is introduced to control a multimedia player with eye gaze using an economical webcam, providing higher precision and robustness; the goal is to control the play, pause, forward and backward functions of the player.
Abstract: Gaze-based interaction between humans and computers has opened a potential domain of effortless supervision. The eye is one of the most important organs for perceiving information from our surroundings, and it can be the most conspicuous way to interact with computers. Therefore, in this research, a system model has been introduced to control a multimedia player with eye gaze using an economical webcam while providing higher precision and robustness. The goal is to control the play, pause, forward and backward functions of a multimedia player. The click event is triggered by detecting the user's eye blink.
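The blink-triggered click mentioned above can be sketched as a threshold over per-frame eye-openness values (for instance an eye-aspect-ratio computed from facial landmarks). This is an illustrative sketch, not the paper's implementation: the landmark extraction is assumed, and the function name, threshold, and frame count are our own choices.

```python
# Hedged sketch: fire a "click" when the eye stays closed (openness
# below a threshold) for a minimum number of consecutive frames and
# then reopens. Per-frame openness values are assumed as input.

def detect_blinks(openness, threshold=0.2, min_frames=2):
    """Return frame indices at which a valid blink (click) completes."""
    clicks, closed = [], 0
    for i, v in enumerate(openness):
        if v < threshold:
            closed += 1
        else:
            if closed >= min_frames:  # eye reopened after a valid blink
                clicks.append(i)
            closed = 0
    return clicks

# One valid two-frame blink: frames 3-4 are closed, frame 5 reopens.
print(detect_blinks([0.3, 0.31, 0.29, 0.1, 0.08, 0.3, 0.32]))
```

Requiring several consecutive closed frames is what distinguishes an intentional blink from momentary detection noise.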

4 citations


Cites methods from "Eye-controlled mouse cursor for phy..."

  • ...Rahman et al. (2018) [12] also used a similar way to track the user’s gaze by comparing the iris centre and eye corner after detecting the face and eye region with the MATLAB vdf tool....


Journal Article
TL;DR: The proposed system is an innovative approach to capturing three-dimensional head rotation through an ordinary web camera that captures the image in two dimensions.
Abstract: Various results have been proposed in the past decades to capture the user's head motions through a camera and control the navigation of the mouse pointer, enabling people with movement disabilities to interact with computers. Movement of facial features is tracked to estimate the movement of the mouse cursor on the computer screen. Synchronizing the rate of head movement with the mouse cursor movement is identified as the challenge, as the head movement is three-dimensional while the sequence of images captured by the web camera is two-dimensional. The proposed system is an innovative approach to capturing three-dimensional head rotation through an ordinary web camera that captures the image in two dimensions.

1 citation


Additional excerpts

  • ...[32] have attempted tracking the eye gazes....


Proceedings ArticleDOI
05 Jul 2019
TL;DR: Sixth sense technology, in which the user need not touch the screen or hold a mouse but can control the cursor by moving fingers in the air, using a webcam as hardware and MATLAB as software.
Abstract: Sixth sense can be defined as a wearable gestural interface that augments our physical world with digital information and lets us communicate with this information using natural hand gestures. Humans can thus interface and interact with machines (HMI) without any mechanical means. As we know, human-machine interaction is today a very important aspect of user friendliness. Most gadgets today use touch screen technology, but this has become commonplace. [3] Until now everybody has used a mouse to operate computers, but surveys have found that continuous mouse use can cause various health issues such as carpal tunnel syndrome and repetitive motion injuries. Also, due to the short wire, the area of reach is limited. People with crippled hands or fingers are not able to use computers as they cannot hold a mouse or control a touch pad. Considering these problems we put forth the idea of “FINGER CURSOR: SIXTH SENSE TECHNOLOGY”, where you don't need to touch the screen or hold the mouse but can control the cursor by moving your fingers in the air. Here we use a webcam as hardware and MATLAB as software. We implement our project using RGB (red, green, blue) caps. The computer interprets these caps and we can move the cursor accordingly. We capture real-time video, which is converted into frames and processed by extracting its RGB components, gray-scale and binary representations, and then evaluating the centroid. The mouse movements are carried out using the Robot class of Java. We also add features such as screenshot and right-click operations.
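The centroid step described in the abstract can be sketched as follows. This is an illustrative re-implementation in Python rather than the authors' MATLAB code: frame capture and the Java Robot mouse-move call are assumed, and the colour thresholds and function name are arbitrary choices of ours.

```python
# Hedged sketch: isolate strongly red pixels (a red finger cap) in a
# frame given as a nested list of (r, g, b) tuples, and return the
# centroid that would drive the cursor. Capture and cursor movement
# are assumed and not shown; thresholds are illustrative.

def red_cap_centroid(frame, r_min=200, g_max=80, b_max=80):
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if r >= r_min and g <= g_max and b <= b_max:  # "red enough"
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # cap not visible in this frame
    return sum(xs) / len(xs), sum(ys) / len(ys)

# A 3x3 frame with two red pixels at (1, 0) and (1, 2).
frame = [
    [(0, 0, 0), (255, 10, 10), (0, 0, 0)],
    [(0, 0, 0), (0, 0, 0),     (0, 0, 0)],
    [(0, 0, 0), (255, 20, 5),  (0, 0, 0)],
]
print(red_cap_centroid(frame))
```

Tracking a second cap colour in the same way is what typically enables the extra gestures (right click, screenshot) mentioned in the abstract.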

1 citation

References
Journal ArticleDOI
TL;DR: The first working prototype of a brain-controlled wheelchair that can navigate inside a typical office or hospital environment and uses a P300 EEG signal and a motion guidance strategy to navigate in a building safely and efficiently without complex sensors or sensor processing is built.
Abstract: Amyotrophic lateral sclerosis, or ALS, is a degenerative disease of the motor neurons that eventually leads to complete paralysis. We are developing a wheelchair system that can help ALS patients, and others who can't use physical interfaces such as joysticks or gaze tracking, regain some autonomy. The system must be usable in hospitals and homes with minimal infrastructure modification. It must be safe and relatively low cost and must provide optimal interaction between the user and the wheelchair within the constraints of the brain-computer interface. To this end, we have built the first working prototype of a brain-controlled wheelchair that can navigate inside a typical office or hospital environment. This article describes the BCW, our control strategy, and the system's performance in a typical building environment. This brain-controlled wheelchair prototype uses a P300 EEG signal and a motion guidance strategy to navigate in a building safely and efficiently without complex sensors or sensor processing.

265 citations


"Eye-controlled mouse cursor for phy..." refers background in this paper

  • ...Many researchers have tried to develop methods to help the disabled to interact with computers by using signals such as electroencephalography (EEG) from the brain, facial muscles signals (EMG) and electro-oculogram (EOG) [1-3]....


Proceedings ArticleDOI
01 Jan 1993
TL;DR: The authors describe an inexpensive eye movement controlled user interface for 2-D and 3-D interaction based on electro-oculography (EOG) rather than the very expensive reflectance based methods.
Abstract: The authors describe an inexpensive eye movement controlled user interface for 2-D and 3-D interaction. It is based on electro-oculography (EOG) rather than the very expensive reflectance based methods. The authors have built the hardware and software to demonstrate the viability of EOG for human-computer communication. The experiments indicate that EOG provides the basis for an adequate input interaction device. Being very inexpensive, the system is applicable for many virtual reality systems and video games as well as for the handicapped.

124 citations


"Eye-controlled mouse cursor for phy..." refers methods in this paper

  • ...Other methods include limbus, pupil and eye/eyelid tracking [4-5], contact lens method, corneal, pupil reflection relationship [6] and head movement measurement [7]....


Journal ArticleDOI
TL;DR: Using an eye tracker, whose data cannot be corrupted by any electrophysiological signals, an accurate method for correcting eye movement artifacts in EEG is developed and shown to be consistently superior to three established correction methods, often by a large margin.
Abstract: We present a new method to correct eye movement artifacts in electroencephalogram (EEG) data. By using an eye tracker, whose data cannot be corrupted by any electrophysiological signals, an accurate method for correction is developed. The eye-tracker data is used in a Kalman filter to estimate which part of the EEG is of ocular origin. The main assumptions for optimal correction are summed and their validity is proven. The eye-tracker-based correction method is objectively evaluated on simulated data of four different types of eye movements and visually evaluated on experimental data. Results are compared to three established correction methods: regression, principal component analysis, and second-order blind identification. A comparison of signal to noise ratio after correction by these methods is given in Table II and shows that our method is consistently superior to the other three methods, often by a large margin. The use of a reference signal without electrophysiological influences, as provided by an eye tracker, is essential to achieve optimal eye movement artifact removal.
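The idea of exploiting an artifact-free reference signal in a Kalman filter can be sketched in a toy one-channel form. This is emphatically not the authors' full model: we assume a simple scalar propagation coefficient b in eeg = brain + b · gaze, estimate it recursively with a scalar Kalman filter, and subtract the estimated ocular part. All parameter values are illustrative.

```python
# Hedged toy sketch: estimate the coefficient b coupling the clean
# eye-tracker gaze signal into the EEG, treating b as a slowly drifting
# state (process noise q) observed through z = b*g + noise (variance r),
# then remove the estimated ocular contribution from each sample.

def correct_eeg(eeg, gaze, q=1e-5, r=1.0):
    b, p = 0.0, 1.0            # coefficient estimate and its variance
    corrected = []
    for z, g in zip(eeg, gaze):
        p += q                  # predict: coefficient drifts slowly
        if g != 0.0:
            k = p * g / (g * g * p + r)   # Kalman gain for z = b*g + noise
            b += k * (z - b * g)          # update coefficient estimate
            p *= (1.0 - k * g)            # update estimate variance
        corrected.append(z - b * g)       # remove estimated ocular part
    return corrected
```

With a purely ocular input (eeg exactly proportional to gaze), the corrected signal shrinks toward zero as the coefficient estimate converges, which is the behaviour the correction relies on.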

46 citations


"Eye-controlled mouse cursor for phy..." refers background in this paper

  • ...Many researchers have tried to develop methods to help the disabled to interact with computers by using signals such as electroencephalography (EEG) from the brain, facial muscles signals (EMG) and electro-oculogram (EOG) [1-3]....


Proceedings ArticleDOI
15 Sep 1997
TL;DR: A real-time camera-based system designed for gaze tracking focused on human-computer communication using a CCD camera placed between the keyboard and the screen that detects the user presence, locates his position and then tracks his face, nose and both eyes.
Abstract: We present a real-time camera-based system designed for gaze tracking, focused on human-computer communication. We aim to equip computer systems with a tool that can provide visual information about the user. This tool must satisfy interaction constraints and must not be intrusive. Therefore, we use a CCD camera placed between the keyboard and the screen. The system detects the user's presence, locates his position and then tracks his face, nose and both eyes. The detection is performed by combining image processing techniques and pattern recognition methods.

20 citations

Proceedings ArticleDOI
30 Aug 1992
TL;DR: A near-real-time computer system to track the eyes and the nose of a subject and compute the direction of the face from three natural feature points extracted from the face is presented.
Abstract: Computer operation via face orientation provides a means of communication for people with severe physical disability. The authors present a near-real-time computer system to track the eyes and the nose of a subject and compute the direction of the face. As presented by K. Ohmura et al. (1988), the direction can be computed from three natural feature points extracted from the face. The computational approach taken in this system is motivated by the practical requirements of near-real-time performance and accuracy. The approach is based on the reflection characteristic of the cornea. Any bright light source forms a twinkle in the eye that is fixed as long as the light source is fixed. Face direction detection is treated as a 2D tracking problem. The eye twinkles are extracted from the image and used to locate the nose. A final experiment shows that the face direction obtained from the three feature points is precise enough to control the mouse cursor to make selections from a menu with large light buttons.
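The 2-D tracking idea above, deriving a face direction from three feature points, can be sketched in a simplified form. This is our own illustration, not the paper's geometry: we take the two eye "twinkle" positions and the nose position, and use the nose's offset from the eye midpoint, normalised by the inter-eye distance, as a rough direction signal for cursor control. Feature extraction is assumed.

```python
# Hedged sketch: approximate face direction from three 2-D feature
# points. The offset of the nose from the midpoint between the eyes,
# scaled by the inter-eye distance, is scale-invariant and grows as
# the head turns or tilts.

def face_direction(left_eye, right_eye, nose):
    mx = (left_eye[0] + right_eye[0]) / 2
    my = (left_eye[1] + right_eye[1]) / 2
    eye_dist = abs(right_eye[0] - left_eye[0]) or 1  # avoid divide-by-zero
    return (nose[0] - mx) / eye_dist, (nose[1] - my) / eye_dist

# Frontal face: the nose sits directly below the eye midpoint.
print(face_direction((40, 50), (80, 50), (60, 70)))
```

Thresholding the two components against a dead zone is one simple way to turn this signal into discrete left/right/up/down menu selections like those described in the experiment.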

18 citations


"Eye-controlled mouse cursor for phy..." refers background in this paper

  • ...Other high end techniques [8] that are based on infrared tracking of the eye movements to control computers were exceptionally expensive and were not affordable for those who need them....
