Book Chapter

Gesture Modelling and Recognition by Integrating Declarative Models and Pattern Recognition Algorithms

TL;DR
DEICTIC is introduced, a compositional and declarative gesture description model which uses basic Hidden Markov Models (HMMs) to recognize meaningful pre-defined primitives (gesture sub-parts), and uses a composition of basic HMMs to recognize complex gestures.
Abstract
Gesture recognition approaches based on computer vision and machine learning mainly focus on recognition accuracy and robustness. Research on user interface development focuses instead on the orthogonal problem of providing guidance for performing and discovering interactive gestures, through compositional approaches that provide information on gesture sub-parts. We make a first step toward combining the advantages of both approaches. We introduce DEICTIC, a compositional and declarative gesture description model which uses basic Hidden Markov Models (HMMs) to recognize meaningful pre-defined primitives (gesture sub-parts), and uses a composition of basic HMMs to recognize complex gestures. Preliminary empirical results show that DEICTIC exhibits a similar recognition performance as “monolithic” HMMs used in state-of-the-art vision-based approaches, retaining at the same time the advantages of declarative approaches.
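
The composition idea described in the abstract can be illustrated with a minimal sketch: define one small HMM per primitive and build a composite model for a complex gesture by chaining the primitives. The code below is not the authors' implementation; the discrete observation symbols, the two-state left-to-right primitives, the `sequence` composition operator, and the `p_switch` hand-over probability are all simplifying assumptions made for illustration.

```python
# Illustrative sketch of composing primitive HMMs into a gesture model.
# Assumptions: discrete observation symbols, toy parameters, simplified
# "sequence" composition; not the DEICTIC implementation.
import numpy as np

class DiscreteHMM:
    """Discrete-observation HMM with start probabilities, transitions and emissions."""
    def __init__(self, start, trans, emit):
        self.start = np.asarray(start, dtype=float)   # shape (N,)
        self.trans = np.asarray(trans, dtype=float)   # shape (N, N)
        self.emit = np.asarray(emit, dtype=float)     # shape (N, M)

    def log_likelihood(self, obs):
        """Scaled forward algorithm -> log P(obs | model)."""
        alpha = self.start * self.emit[:, obs[0]]
        log_p = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ self.trans) * self.emit[:, o]
            log_p += np.log(alpha.sum())
            alpha /= alpha.sum()
        return log_p

def sequence(a, b, p_switch=0.1):
    """Compose two primitive HMMs into an 'a then b' HMM by building a block
    transition matrix in which a's states can hand over to b's start states."""
    na, nb = len(a.start), len(b.start)
    start = np.concatenate([a.start, np.zeros(nb)])
    trans = np.zeros((na + nb, na + nb))
    trans[:na, :na] = a.trans * (1 - p_switch)
    trans[:na, na:] = p_switch * np.outer(np.ones(na), b.start)
    trans[na:, na:] = b.trans
    emit = np.vstack([a.emit, b.emit])
    return DiscreteHMM(start, trans, emit)

# Two toy primitives: a "rightward" stroke (mostly emits symbol 0)
# and a "downward" stroke (mostly emits symbol 1).
right = DiscreteHMM([1, 0], [[0.8, 0.2], [0, 1]], [[0.9, 0.1], [0.9, 0.1]])
down  = DiscreteHMM([1, 0], [[0.8, 0.2], [0, 1]], [[0.1, 0.9], [0.1, 0.9]])

# Composite gesture "right then down", scored against an observation sequence.
gesture = sequence(right, down)
print(gesture.log_likelihood([0, 0, 0, 1, 1, 1]))
```

In this toy setting the composite model scores a sequence that starts with symbol 0 and ends with symbol 1 higher than the reverse, which is the essence of recognising a complex gesture as an ordered composition of its sub-parts.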


Citations
Journal Article

DEICTIC: A compositional and declarative gesture description based on Hidden Markov Models

TL;DR: DEICTIC is a compositional and declarative description for stroke gestures that uses basic Hidden Markov Models to recognise meaningful predefined primitives (gesture sub-parts) and composes them to recognise complex gestures, reaching an accuracy comparable with state-of-the-art approaches.
Journal Article

DG3: Exploiting Gesture Declarative Models for Sample Generation and Online Recognition

TL;DR: DG3, an end-to-end method for exploiting gesture interaction in user interfaces, is introduced; the method outperforms existing approaches for online recognition and reaches accuracy comparable with offline methods after a few gesture segments.

A Declarative and Classifier Gesture Recognition Method for Creating an Effective Feedback and Feedforward System

TL;DR: The main goal of this Ph.D. work is to bridge the gap between machine learning and declarative, compositional approaches to gesture recognition in interactive applications.
Proceedings Article

Integrating declarative models and HMMs for online gesture recognition

TL;DR: Work-in-progress research is discussed on connecting the algorithms used to accurately recognise user movements with the guidance provided to users while executing gestures, in order to increase their effectiveness.
References
Journal Article

The Viterbi algorithm

TL;DR: This paper gives a tutorial exposition of the Viterbi algorithm and of how it is implemented and analyzed; increasing use of the algorithm in a widening variety of areas is foreseen.
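
Since the reference above is a tutorial on the Viterbi algorithm, a minimal sketch of the standard dynamic-programming recursion (in log space) may help; the two-state toy parameters below are arbitrary assumptions, not taken from the paper.

```python
# Minimal Viterbi decoding for a discrete-observation HMM, in log space.
# Toy parameters are assumptions for illustration only.
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Return the most likely hidden-state path and its log score."""
    n_states = log_start.shape[0]
    T = len(obs)
    delta = np.full((T, n_states), -np.inf)   # best log score ending in state j at time t
    psi = np.zeros((T, n_states), dtype=int)  # backpointers
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans      # scores[i, j]: from i to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_emit[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):             # trace backpointers from the end
        path.append(int(psi[t][path[-1]]))
    return path[::-1], float(delta[-1].max())

# Toy two-state example (assumed parameters).
log_start = np.log([0.6, 0.4])
log_trans = np.log([[0.7, 0.3], [0.4, 0.6]])
log_emit  = np.log([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1], log_start, log_trans, log_emit))
```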
Journal Article

Visual interpretation of hand gestures for human-computer interaction: a review

Journal Article

Gesture Recognition: A Survey

TL;DR: A survey on gesture recognition with particular emphasis on hand gestures and facial expressions is provided, and applications involving hidden Markov models, particle filtering and condensation, finite-state machines, optical flow, skin color, and connectionist models are discussed in detail.
Journal Article

Vision based hand gesture recognition for human computer interaction: a survey

TL;DR: An analysis of comparative surveys in the field of gesture-based HCI is provided, along with an analysis of existing literature on gesture recognition systems for human-computer interaction, categorised under different key parameters.
Journal Article

Natural user interfaces are not natural

TL;DR: The authors believe we will look back on 2010 as the year we expanded beyond the mouse and keyboard and started incorporating more natural forms of interaction such as touch, speech, gestures, handwriting, and vision: what computer scientists call the "NUI" or natural user interface.