Proceedings ArticleDOI

A Natural Hand Gesture System for Intelligent Human-Computer Interaction and Medical Assistance

06 Nov 2012 - pp 382-385
TL;DR: The proposed intelligent HCI system runs in real time and offers a natural, efficient interface that lets people with limb disabilities communicate with robots.
Abstract: This paper presents a novel hand gesture system for intelligent human-computer interaction (HCI) and its applications in medical assistance, e.g. intelligent wheelchair control. The hand gesture vocabulary in the system consists of five key hand postures and three compound states, and its design strategy covers minimal hand motion, distraction detection and user-friendly design. The experimental results show that the designed lexicon is intuitive, ergonomic, and easy to remember and perform. The system is tested in both indoor and outdoor environments and shows robustness to lighting changes and user errors. The proposed intelligent HCI system runs in real time and offers a natural and efficient interface for people with limb disabilities to communicate with robots.
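The abstract does not spell out the posture-to-command mapping, so the following is a minimal sketch of how five postures and a few compound states could drive wheelchair commands; every identifier (Posture, COMMAND_MAP, COMPOUND_MAP, interpret) is a hypothetical illustration, not the authors' implementation.

```python
from enum import Enum

# Hypothetical posture labels; the paper defines five key postures but does not
# name them here, so these identifiers are purely illustrative.
class Posture(Enum):
    OPEN_PALM = 0
    FIST = 1
    POINT_LEFT = 2
    POINT_RIGHT = 3
    THUMB_UP = 4

# Illustrative single-posture commands for an intelligent wheelchair.
COMMAND_MAP = {
    Posture.OPEN_PALM: "STOP",
    Posture.FIST: "FORWARD",
    Posture.POINT_LEFT: "TURN_LEFT",
    Posture.POINT_RIGHT: "TURN_RIGHT",
    Posture.THUMB_UP: "BACKWARD",
}

# A "compound state" is treated here as a short posture sequence; again hypothetical.
COMPOUND_MAP = {
    (Posture.OPEN_PALM, Posture.FIST): "START_SESSION",
    (Posture.FIST, Posture.OPEN_PALM): "END_SESSION",
    (Posture.THUMB_UP, Posture.THUMB_UP): "CONFIRM",
}

def interpret(history):
    """Map the most recent postures to a command, preferring compound states."""
    if len(history) >= 2 and tuple(history[-2:]) in COMPOUND_MAP:
        return COMPOUND_MAP[tuple(history[-2:])]
    return COMMAND_MAP.get(history[-1], "IDLE") if history else "IDLE"

print(interpret([Posture.OPEN_PALM, Posture.FIST]))  # -> START_SESSION
```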
Citations
Journal ArticleDOI
TL;DR: A review of the literature on hand gesture techniques that introduces their merits and limitations under different circumstances and tabulates the performance of these methods, focusing on computer vision techniques.
Abstract: Hand gestures are a form of nonverbal communication that can be used in several fields such as communication between deaf-mute people, robot control, human–computer interaction (HCI), home automation and medical applications. Research papers based on hand gestures have adopted many different techniques, including those based on instrumented sensor technology and computer vision. In other words, the hand sign can be classified under many headings, such as posture and gesture, dynamic and static, or a hybrid of the two. This paper reviews the literature on hand gesture techniques and introduces their merits and limitations under different circumstances. In addition, it tabulates the performance of these methods, focusing on computer vision techniques, in terms of their similarities and differences, the hand segmentation technique used, the classification algorithms and their drawbacks, the number and types of gestures, the dataset used, the detection range (distance) and the type of camera used. The paper is a thorough general overview of hand gesture methods with a brief discussion of some possible applications.

232 citations


Cites background or methods from "A Natural Hand Gesture System for I..."

  • ...[43] web camera, 320 × 240 pixels, red-channel threshold segmentation method, hand postures, combines information from multiple cues of motion, color and shape, 100%, 5 hand postures, HCI wheelchair control – –...

    [...]

  • ...[43] presented a hand gesture method to assist wheelchair users indoors and outdoors, using red-channel thresholding against a fixed background to cope with illumination changes (a minimal thresholding sketch follows this list)....

    [...]

  • ...In addition, hand gestures can be used for assistive purposes such as wheelchair control [43]....

    [...]
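The excerpts above mention red-channel thresholding against a fixed background for hand segmentation. Below is a minimal sketch of that general idea using OpenCV; the threshold value, morphological clean-up and camera index are assumptions for illustration, not the cited paper's exact pipeline.

```python
import cv2
import numpy as np

def segment_hand_red_channel(frame_bgr, thresh=150):
    """Rough hand segmentation by thresholding the red channel (illustrative)."""
    red = frame_bgr[:, :, 2]                     # OpenCV stores frames as BGR
    _, mask = cv2.threshold(red, thresh, 255, cv2.THRESH_BINARY)
    # Morphological opening removes small speckles from the binary mask.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)        # e.g. a 320 x 240 web camera, as in the excerpt
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("hand_mask.png", segment_hand_red_channel(frame))
    cap.release()
```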

Journal ArticleDOI
TL;DR: An original "task-oriented" way of categorizing the state of the art in assistive technology (AT) is introduced; it splits the final assistive goals into tasks, which are then used as pointers to the works in the literature in which each task has been used as a component.

183 citations

Journal ArticleDOI
TL;DR: This work presents a novel real-time method for hand gesture recognition in which the hand region is extracted from the background with the background subtraction method and the palm and fingers are segmented so as to detect and recognize the fingers.
Abstract: Hand gesture recognition is very significant for human-computer interaction. In this work, we present a novel real-time method for hand gesture recognition. In our framework, the hand region is extracted from the background with a background subtraction method. Then, the palm and fingers are segmented so as to detect and recognize the fingers. Finally, a rule classifier is applied to predict the labels of hand gestures. The experiments on a data set of 1300 images show that our method performs well and is highly efficient. Moreover, our method shows better performance than a state-of-the-art method on another data set of hand gestures.
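As a rough illustration of the pipeline described above (background subtraction, palm/finger segmentation, rule-based classification), here is a minimal OpenCV sketch; the MOG2 subtractor and convexity-defect finger counting are common stand-ins and not necessarily the authors' exact method.

```python
import cv2

# Background subtraction (MOG2 is a common choice; the paper may use a simpler model).
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def count_fingers(mask):
    """Approximate finger count from convexity defects of the largest contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is None or len(hull) < 4:
        return 0
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep defects roughly correspond to the gaps between extended fingers.
    deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
    return deep + 1 if deep else 0

def classify(frame_bgr):
    """Toy rule classifier mapping finger counts to gesture labels (illustrative)."""
    mask = subtractor.apply(frame_bgr)
    n = count_fingers(mask)
    if n >= 4:
        return "open palm"
    if n == 0:
        return "fist"
    return f"{n} finger(s)"
```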

99 citations


Cites background from "A Natural Hand Gesture System for I..."

  • ...[20] improves the medical service through hand gesture recognition....

    [...]

  • ...Hand gesture recognition has great value in many applications such as sign language recognition [12–15], augmented reality (virtual reality) [16–19], sign language interpreters for the disabled [20], and robot control [21, 22]....

    [...]

Journal ArticleDOI
TL;DR: It is experimentally demonstrated that the proposed architecture outperforms the state of the art in real-time gesture recognition and is robust across different image representations and scales.

72 citations

Proceedings ArticleDOI
16 Apr 2015
TL;DR: An intelligent wheelchair controlled through a smartphone is developed, using voice and gesture input to control the wheelchair's motion for physically challenged persons.
Abstract: Wheelchairs are used by people who cannot walk due to physical illness, injury or other disability. Recent developments promise wide scope for smart wheelchairs. This paper describes an intelligent wheelchair that uses a smartphone to control the wheelchair's motion based on voice and gesture input for physically challenged persons. The phone's built-in voice and gesture functions are used to control the wheelchair, and the smartphone is also used to read out SMS, e-mail and news. Eight sensors are used: two are IR sensors, and the remainder cover temperature, smoke detection and light detection. The system allows the user to interact robustly with the wheelchair at different levels of control and sensing. It is divided into three main units: voice recognition through Android, gesture recognition through Android, and motor control through signal conditioning. The system is based on pairing an Android phone with an AVR microcontroller and sensors.
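The abstract describes the Android phone sending voice/gesture-derived commands to an AVR microcontroller that drives the motors, but the protocol itself is not specified. The snippet below is a hedged sketch of one plausible single-character command scheme; the command characters, port name and baud rate are assumptions, and pyserial stands in for the phone-to-controller link.

```python
import serial  # pyserial, used here as a stand-in for the Bluetooth/serial link

# Hypothetical one-byte motion commands; the paper does not specify its protocol.
COMMANDS = {"forward": b"F", "backward": b"B", "left": b"L", "right": b"R", "stop": b"S"}

def send_command(port, word):
    """Translate a recognized voice/gesture word into a one-byte motor command."""
    code = COMMANDS.get(word.lower(), COMMANDS["stop"])  # fail safe: unknown input stops
    port.write(code)

if __name__ == "__main__":
    # Port name and baud rate are illustrative; adjust to the actual adapter.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        send_command(port, "forward")
        send_command(port, "stop")
```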

34 citations

References
Journal ArticleDOI
TL;DR: Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing.

641 citations

Journal ArticleDOI
TL;DR: The proposed mean shift embedded particle filter (MSEPF) improves the sampling efficiency considerably and produces reliable tracking while effectively handling rapid motion and distraction with roughly 85% fewer particles.
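The mean shift embedded particle filter (MSEPF) augments a standard particle filter with a mean-shift step that nudges each propagated particle toward a nearby mode of the observation likelihood before weighting and resampling, which is what allows reliable tracking with far fewer particles. The numpy sketch below illustrates that idea on a precomputed 2-D likelihood map; the window radius, motion noise and toy likelihood are assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_shift_step(pos, likelihood, radius=8):
    """Move a particle to the likelihood-weighted centroid of its local window."""
    h, w = likelihood.shape
    x0, x1 = int(max(pos[0] - radius, 0)), int(min(pos[0] + radius + 1, w))
    y0, y1 = int(max(pos[1] - radius, 0)), int(min(pos[1] + radius + 1, h))
    window = likelihood[y0:y1, x0:x1]
    total = window.sum()
    if total <= 0:
        return pos
    ys, xs = np.mgrid[y0:y1, x0:x1]
    return np.array([(xs * window).sum() / total, (ys * window).sum() / total])

def msepf_step(particles, likelihood, motion_std=5.0):
    """One MSEPF iteration: propagate, mean-shift each particle, weight, resample."""
    h, w = likelihood.shape
    particles = particles + rng.normal(0, motion_std, particles.shape)   # diffusion
    particles = np.clip(particles, 0, [w - 1, h - 1])                    # keep in bounds
    particles = np.array([mean_shift_step(p, likelihood) for p in particles])
    xs = particles[:, 0].astype(int)
    ys = particles[:, 1].astype(int)
    weights = likelihood[ys, xs] + 1e-12
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)     # resample
    return particles[idx], particles[idx].mean(axis=0)                   # estimate

# Toy example: a Gaussian "target" likelihood centred at (60, 40).
yy, xx = np.mgrid[0:100, 0:100]
lik = np.exp(-(((xx - 60) ** 2) + ((yy - 40) ** 2)) / (2 * 10.0 ** 2))
parts = rng.uniform(0, 100, size=(30, 2))          # only a handful of particles
for _ in range(5):
    parts, estimate = msepf_step(parts, lik)
print(estimate)                                    # converges toward (60, 40)
```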

256 citations

Journal ArticleDOI
TL;DR: In this paper, the problem of manipulation with delay is considered, and an experiment to determine the effect of delay on completion time for a simple manipulative task is reported; it appears that under certain circumstances the time required with delay can be predicted from performance measures which are independent of delay.
Abstract: The nature of remote manipulation is briefly discussed and several manipulator devices are described. The problem of manipulation with delay is considered, and an experiment to determine the effect of delay on completion time for a simple manipulative task is reported. It appears that under certain circumstances the time required with delay can be predicted from performance measures which are independent of delay.

246 citations

Journal ArticleDOI
TL;DR: This paper presents a novel hands-free control system for intelligent wheelchairs (IWs) based on visual recognition of head gestures, which is extremely useful for users with restricted limb movements caused by conditions such as Parkinson's disease or quadriplegia.
Abstract: Purpose – This paper presents a novel hands-free control system for intelligent wheelchairs (IWs) based on visual recognition of head gestures. Design/methodology/approach – A robust head gesture-based interface (HGI) is designed for head gesture recognition of the RoboChair user. The recognised gestures are used to generate motion control commands to the low-level DSP motion controller so that it can control the motion of the RoboChair according to the user's intention. The Adaboost face detection algorithm and the Camshift object tracking algorithm are combined in our system to achieve accurate face detection, tracking and gesture recognition in real time. It is intended to be used as a human-friendly interface for elderly and disabled people to operate our intelligent wheelchair using their head gestures rather than their hands. Findings – This is an extremely useful system for users who have restricted limb movements caused by conditions such as Parkinson's disease or quadriplegia. Practical implicatio...
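The HGI described above combines Adaboost (Haar cascade) face detection with Camshift colour-histogram tracking. A minimal OpenCV sketch of that standard combination is shown below; the cascade file, hue histogram settings and frame count are the usual defaults and assumptions, and the mapping from the tracked box to wheelchair commands is omitted.

```python
import cv2

# Haar cascade trained with Adaboost; the XML file ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
faces = cascade.detectMultiScale(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 1.3, 5)
if len(faces) == 0:
    raise SystemExit("no face detected in the first frame")
x, y, w, h = faces[0]                      # use the first detected face as the ROI

# Build a hue histogram of the face ROI for back-projection.
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

for _ in range(300):                       # track the head for a fixed number of frames
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    # CamShift adapts the window size and orientation to the tracked head;
    # the rotated box could then be interpreted as a head gesture command.
    rot_box, track_window = cv2.CamShift(backproj, track_window, criteria)
cap.release()
```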

241 citations