Book ChapterDOI
Collection and Annotation of a Corpus of Human-Human Multimodal Interactions: Emotion and Others Anthropomorphic Characteristics
Aurélie Zara, Valérie Maffiolo, Jean-Claude Martin, Laurence Devillers
pp. 464–475
TL;DR: The EmoTaboo protocol for the collection of multimodal emotional behaviours occurring during human-human interactions in a game context is presented, and a new annotation methodology based on a hierarchical taxonomy of emotion-related words is introduced.
Abstract: In order to design affective interactive systems, experimental grounding is required for studying expressions of emotion during interaction. In this paper, we present the EmoTaboo protocol for the collection of multimodal emotional behaviours occurring during human-human interactions in a game context. First annotations revealed that the collected data contain various multimodal expressions of emotions and other mental states. In order to reduce the influence of language via a predetermined set of labels and to take into account differences between coders in their capacity to verbalize their perception, we introduce a new annotation methodology based on 1) a hierarchical taxonomy of emotion-related words, and 2) the design of the annotation interface. Future directions include the implementation of such an annotation tool and its evaluation for the annotation of multimodal interactive and emotional behaviours. We will also extend our first annotation scheme to several other characteristics interdependent with emotions.
Citations
Journal ArticleDOI
IEMOCAP: interactive emotional dyadic motion capture database
Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth S. Narayanan
TL;DR: A new corpus named the "interactive emotional dyadic motion capture database" (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory at the University of Southern California (USC), provides detailed information about actors' facial expressions and hand movements during scripted and spontaneous spoken communication scenarios.
Book ChapterDOI
The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data
Ellen Douglas-Cowie, Roddy Cowie, Ian Sneddon, Cate Cox, Orla Lowry, Margaret McRorie, Jean-Claude Martin, Laurence Devillers, Sarkis Abrilian, Anton Batliner, Noam Amir, Kostas Karpouzis
TL;DR: The HUMAINE Database provides naturalistic clips recording that kind of material in multiple modalities, along with labelling techniques suited to describing it.
Journal ArticleDOI
A 3-D Audio-Visual Corpus of Affective Communication
TL;DR: This work presents a new audio-visual corpus covering possibly the two most important modalities humans use to communicate their emotional states, namely speech and facial expression, the latter in the form of dense dynamic 3-D face geometries.
Journal ArticleDOI
The MPI emotional body expressions database for narrative scenarios.
TL;DR: A new database consisting of a large set of natural emotional body expressions typical of monologues is presented, featuring the actor's intended emotion expression for each motion sequence, and made available as the searchable MPI Emotional Body Expression Database.
Journal ArticleDOI
Analyses of a Multimodal Spontaneous Facial Expression Database
TL;DR: These analyses of a multimodal spontaneous facial expression database of natural visible and infrared facial expressions demonstrate the effectiveness of the emotion-inducing experimental design, the gender difference in emotional responses, and the coexistence of multiple emotions/expressions.
References
Book
Affective Computing
TL;DR: Key issues in affective computing, "computing that relates to, arises from, or influences emotions", are presented, and new applications are described for computer-assisted learning, perceptual information retrieval, arts and entertainment, and human health and interaction.
Journal ArticleDOI
Emotion recognition in human-computer interaction
Roddy Cowie, Ellen Douglas-Cowie, Nicolas Tsapatsoulis, George N. Votsis, Stefanos Kollias, W. Fellenz, John G. Taylor
TL;DR: Basic issues in signal processing and analysis techniques for consolidating psychological and linguistic analyses of emotion are examined, motivated by the PHYSTA project, which aims to develop a hybrid system capable of using information from faces and voices to recognize people's emotions.
Book
Handbook of affective sciences.
TL;DR: This handbook brings together, for the first time, the various strands of inquiry and latest research in the scientific study of the relationship between the mechanisms of the brain and the psychology of the mind.
Book
Embodied conversational agents
TL;DR: Embodied conversational agents are computer-generated cartoonlike characters that demonstrate many of the same properties as humans in face-to-face conversation, including the ability to produce and respond to verbal and nonverbal communication.
Related Papers (5)
Emotional speech: towards a new generation of databases
Using Actor Portrayals to Systematically Study Multimodal Emotion Expression: The GEMEP Corpus
Tanja Bänziger, Klaus R. Scherer +1 more