Open Access Book Chapter (DOI)

Toward Natural Gesture/Speech Control of a Large Display

TLDR
A structured approach for studying patterns of multimodal language in the context of 2D-display control is presented; co-occurrence analysis of gesture/speech production suggests a syntactic organization of gestures at the lexical level.
Abstract
In recent years, owing to advances in computer vision research, free-hand gestures have been explored as a means of human-computer interaction (HCI). Together with improved speech processing technology, this is an important step toward natural multimodal HCI. However, the inclusion of non-predefined continuous gestures into a multimodal framework is a challenging problem. In this paper, we propose a structured approach for studying patterns of multimodal language in the context of 2D-display control. We consider a systematic analysis of gestures, from observable kinematical primitives to their semantics, as pertinent to a linguistic structure. The proposed semantic classification of co-verbal gestures distinguishes six categories based on their spatio-temporal deixis. We discuss the evolution of a computational framework for gesture and speech integration, which was used to develop an interactive testbed (iMAP). The testbed enabled elicitation of adequate, non-sequential, multimodal patterns in a narrative mode of HCI. The user studies conducted illustrate the significance of accounting for the temporal alignment of gesture and speech parts in semantic mapping. Furthermore, co-occurrence analysis of gesture/speech production suggests a syntactic organization of gestures at the lexical level.
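The co-occurrence analysis described above can be pictured with a minimal sketch: given gesture strokes labeled with a semantic category and timestamped speech tokens, count how often each (category, word) pair falls within a fixed temporal alignment window. The category names, window size, and data structures below are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of gesture/speech co-occurrence counting.
# Categories, window size, and sample data are hypothetical placeholders.
from collections import Counter
from dataclasses import dataclass


@dataclass
class GestureStroke:
    category: str   # e.g. one of the six deictic categories
    start: float    # stroke onset, seconds
    end: float      # stroke offset, seconds


@dataclass
class SpeechWord:
    token: str
    time: float     # word onset, seconds


def cooccurrence_counts(strokes, words, window=0.5):
    """Count (gesture category, word) pairs whose word onset falls
    within +/- `window` seconds of the gesture stroke interval."""
    counts = Counter()
    for stroke in strokes:
        for word in words:
            if stroke.start - window <= word.time <= stroke.end + window:
                counts[(stroke.category, word.token)] += 1
    return counts


if __name__ == "__main__":
    strokes = [GestureStroke("point", 1.2, 1.8), GestureStroke("contour", 3.0, 4.1)]
    words = [SpeechWord("this", 1.5), SpeechWord("here", 1.9), SpeechWord("road", 3.4)]
    print(cooccurrence_counts(strokes, words))
```

Tabulating such counts per semantic category is one straightforward way to examine whether certain gesture types preferentially align with particular lexical items.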


Citations
Patent

Architecture for controlling a computer using hand gestures

TL;DR: In this article, a perceptual user interface system includes a tracking component that detects object characteristics of at least one of a plurality of objects within a scene, and tracks the respective object.
Journal Article (DOI)

Speech-gesture driven multimodal interfaces for crisis management

TL;DR: The importance of multimodal interfaces in various aspects of crisis management is established and many issues in realizing successful speech-gesture driven, dialogue-enabled interfaces for crisis management are explored.
Journal Article (DOI)

Rule-based approach to recognizing human body poses and gestures in real time

TL;DR: The main novelty of this paper is a complete description of the GDL script language, its validation on a large dataset (1,600 recorded movement sequences) and the presentation of its possible application.
Proceedings Article (DOI)

A real-time framework for natural multimodal interaction with large screen displays

TL;DR: This paper presents a framework for designing a natural multimodal human computer interaction (HCI) system and found that the system performed according to its specifications in 95% of the cases and that users showed ad-hoc proficiency, indicating natural acceptance of such systems.
Journal Article (DOI)

Rethinking gesture phases: Articulatory features of gestural movement?

Jana Bressem, +1 more · 01 Apr 2011
TL;DR: It is shown that gesture phases exhibit a particular distribution of articulatory features, distinguishing one phase from another, and that changes in the execution of phases in linear successions can be described by means of these features.
References
Journal Article (DOI)

Visual interpretation of hand gestures for human-computer interaction: a review

TL;DR: This review surveys vision-based approaches to modeling, analyzing, and recognizing hand gestures for human-computer interaction.
Proceedings Article (DOI)

“Put-that-there”: Voice and gesture at the graphics interface

TL;DR: The work described herein involves the user commanding simple shapes about a large-screen graphics display surface, and because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression.
Book

Gaze and mutual gaze

TL;DR: The role of gaze in human social interaction was investigated experimentally by Argyle and Cook, who set up a research group at Oxford with Ted Crossman and Adam Kendon to study non-verbal communication, with gaze as an important aspect of this behaviour.
Book Chapter (DOI)

How Language Structures Space

TL;DR: This chapter is concerned with the structure that is ascribed to space and the objects within it by linguistic “fine structure,” that subdivision of language which provides a fundamental conceptual framework.
Book Chapter (DOI)

Hand and Mind

David McNeill