Journal ArticleDOI

Automatic Emotion Recognition Based on Body Movement Analysis: A Survey

TLDR
This survey describes emerging techniques and modalities for emotion recognition based on body movement, reviews recent advances in automatic emotion recognition, covers application areas and notation systems, and explains the importance of movement segmentation.
Abstract
Humans are emotional beings, and their feelings influence how they perform and interact with computers. One of the most expressive modalities for humans is body posture and movement, which researchers have recently started exploiting for emotion recognition. This survey describes emerging techniques and modalities related to emotion recognition based on body movement, as well as recent advances in automatic emotion recognition. It also describes application areas and notation systems and explains the importance of movement segmentation. It then discusses unsolved problems and provides promising directions for future research. The Web extra (a PDF file) contains tables with additional information related to the article.
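As a loose illustration of the movement-segmentation step the abstract highlights, the sketch below splits a 1-D position trace into movement segments wherever frame-to-frame velocity stays above a threshold. The function name, signal, and threshold are all hypothetical assumptions for illustration, not the survey's actual method.

```python
# Hypothetical minimal sketch: segmenting a 1-D movement signal by dropping
# near-zero-velocity frames, a common heuristic for movement segmentation.
def segment(positions, vel_threshold=0.05):
    """Split a position sequence into (start, end) index pairs of movement."""
    segments, current = [], []
    for i in range(1, len(positions)):
        vel = abs(positions[i] - positions[i - 1])  # frame-to-frame velocity
        if vel >= vel_threshold:
            current.append(i)
        elif current:
            segments.append((current[0], current[-1]))
            current = []
    if current:
        segments.append((current[0], current[-1]))
    return segments

print(segment([0.0, 0.0, 0.1, 0.3, 0.31, 0.31, 0.5, 0.7]))
```

Real systems would work on multi-joint 3-D trajectories and smooth the velocity signal first, but the thresholding idea is the same.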


Citations
Journal ArticleDOI

Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review

TL;DR: The emotion recognition methods based on multi-channel EEG signals as well as multi-modal physiological signals are reviewed and the correlation between different brain areas and emotions is discussed.
Journal ArticleDOI

Social touch in human–computer interaction

TL;DR: It is argued that ICT mediated or generated social touch can (a) intensify the perceived social presence of remote communication partners and (b) enable computer systems to more effectively convey affective information.
Book ChapterDOI

Survey on AI-Based Multimodal Methods for Emotion Detection

TL;DR: This paper investigates automated emotion representation, recognition, and prediction, surveying the state of the art and main directions for further research, with a focus on emotion analysis and multimodal emotion detection.
Journal ArticleDOI

Emotion Recognition From Body Movement

TL;DR: A novel two-layer feature selection framework for emotion classification from a comprehensive list of body movement features is introduced, which achieved a very high emotion recognition rate, outperforming state-of-the-art methods.
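To make the "two-layer" idea concrete, here is a hypothetical sketch of a two-stage selection: a univariate filter followed by a redundancy-aware greedy pass. The synthetic data, correlation thresholds, and both stages are illustrative assumptions, not the paper's actual framework or body-movement features.

```python
import numpy as np

# Hypothetical two-stage ("two-layer") feature selection sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
y = (X[:, 0] + X[:, 3] > 0).astype(float)  # labels depend on features 0 and 3

# Layer 1: univariate filter -- keep the 4 features most correlated with y.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
layer1 = list(np.argsort(corr)[-4:])

# Layer 2: greedy pass -- keep a feature only if it is not highly redundant
# (|correlation| < 0.9) with features already selected.
selected = []
for j in sorted(layer1, key=lambda j: -corr[j]):
    if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < 0.9 for k in selected):
        selected.append(j)
print(sorted(selected))
```

The informative features (0 and 3) survive the filter because they drive the label; the greedy layer then prunes mutually redundant survivors.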
Journal ArticleDOI

Human–Robot Facial Expression Reciprocal Interaction Platform: Case Studies on Children with Autism

TL;DR: A robotic platform has been developed for reciprocal interaction, consisting of two main phases, namely Non-structured and Structured interaction modes; its effect and acceptability have been investigated with autistic children between 3 and 7 years old.
References
Journal ArticleDOI

Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in Temperament

TL;DR: In this article, evidence relating the PAD Temperament Model to 59 individual-difference measures was reviewed, and formulas were offered for using Pleasure (P), Arousal (A), and Dominance (D) temperament scores to compute and predict a variety of personality scores (e.g., anxiety, depression, panic, somatization, empathy, affiliation, achievement, extroversion, arousal seeking, loneliness, neuroticism, suicide proneness, binge eating, substance abuse, emotional stability, dependency, aggressiveness, and fidgeting).
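As a minimal sketch of how a dimensional model like PAD can be used for recognition, the snippet below labels a (P, A, D) point by its nearest emotion prototype. The prototype coordinates, emotion names, and function are hypothetical illustrations, not Mehrabian's published values or formulas.

```python
from math import dist

# Hypothetical PAD prototypes on a [-1, 1] scale (illustrative only).
PROTOTYPES = {
    "exuberant": (0.8, 0.6, 0.5),
    "anxious": (-0.6, 0.7, -0.4),
    "bored": (-0.5, -0.6, -0.3),
    "relaxed": (0.6, -0.4, 0.4),
}

def classify_pad(p, a, d):
    """Return the prototype emotion nearest (Euclidean) to a (P, A, D) point."""
    return min(PROTOTYPES, key=lambda name: dist(PROTOTYPES[name], (p, a, d)))

print(classify_pad(0.7, 0.5, 0.4))  # falls nearest the "exuberant" prototype
```

A continuous model like this lets a recognizer output graded affective states rather than a fixed category set.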
Journal ArticleDOI

A survey of vision-based methods for action representation, segmentation and recognition

TL;DR: This survey focuses on approaches that aim at the classification of full-body motions, such as kicking, punching, and waving, and categorizes them according to how they represent the spatial and temporal structure of actions.
Dissertation

Visual Recognition of American Sign Language Using Hidden Markov Models.

Thad Starner
TL;DR: Using hidden Markov models (HMMs), an unobtrusive single-view camera system is developed that can recognize hand gestures, namely a subset of American Sign Language (ASL), achieving high recognition rates for full-sentence ASL using only visual cues.
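To show the core HMM machinery such systems rely on, here is a toy two-state model decoded with the Viterbi algorithm. The states, observation alphabet, and every probability are illustrative assumptions, not parameters from Starner's ASL recognizer.

```python
import numpy as np

# Toy two-state HMM over a binary "gesture feature" observation (0 or 1).
states = ["rest", "sign"]
start = np.array([0.8, 0.2])
trans = np.array([[0.7, 0.3],   # P(next | rest)
                  [0.4, 0.6]])  # P(next | sign)
emit = np.array([[0.9, 0.1],    # P(obs | rest)
                 [0.2, 0.8]])   # P(obs | sign)

def viterbi(obs):
    """Most likely hidden-state sequence for a list of observation indices."""
    n, T = len(states), len(obs)
    logp = np.log(start) + np.log(emit[:, obs[0]])
    back = np.zeros((T, n), dtype=int)
    for t in range(1, T):
        scores = logp[:, None] + np.log(trans)  # scores[prev, next]
        back[t] = scores.argmax(axis=0)         # best prev for each next state
        logp = scores.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(logp.argmax())]
    for t in range(T - 1, 0, -1):               # follow backpointers
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([0, 1, 1, 0]))
```

A full recognizer trains one HMM per sign on tracked hand features and picks the model (or model sequence) with the highest likelihood, but decoding rests on this same dynamic program.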
Journal ArticleDOI

Attributing emotion to static body postures: recognition accuracy, confusions, and viewpoint dependence

TL;DR: In this article, a total of 176 computer-generated mannequin figures were produced from descriptions of postural expressions of emotion in order to investigate the attribution of emotion to static body postures.