Open access · Journal Article · DOI: 10.1037/EMO0000835

The role of movement kinematics in facial emotion expression production and recognition.

04 Mar 2021 · Emotion (American Psychological Association) · Vol. 21, Iss. 5, pp. 1041-1061
Abstract: The kinematics of people's body movements provide useful cues about emotional states: for example, angry movements are typically fast and sad movements slow. Unlike the body movement literature, studies of facial expressions have focused on spatial, rather than kinematic, cues. This series of experiments demonstrates that speed is an important facial emotion expression cue. In Experiments 1a-1c we developed (N = 47) and validated (N = 27) an emotion-induction procedure, and recorded (N = 42) posed and spontaneous facial expressions of happy, angry, and sad emotional states. Our novel analysis pipeline quantified the speed of changes in distance between key facial landmarks. We observed that happy expressions were fastest, sad were slowest, and angry expressions were intermediate. In Experiment 2 (N = 67) we replicated our results for posed expressions and introduced a novel paradigm to index communicative emotional expressions. Across Experiments 1 and 2, we demonstrate differences between posed, spontaneous, and communicative expression contexts. Whereas mouth and eyebrow movements reliably distinguished emotions for posed and communicative expressions, only eyebrow movements were reliable for spontaneous expressions. In Experiments 3 and 4 we manipulated facial expression speed and demonstrated a quantifiable change in emotion recognition accuracy. That is, in a discovery (N = 29) and a replication (N = 41) sample, we showed that speeding up facial expressions promotes anger and happiness judgments, and slowing down expressions encourages sad judgments. This influence of kinematics on emotion recognition is dissociable from the influence of spatial cues. These studies demonstrate that the kinematics of facial movements make an added, independent contribution to emotion recognition. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
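The abstract describes the landmark-speed pipeline only at a high level. As a minimal sketch of the core idea (not the authors' code), assuming landmark trajectories are available as a frames x landmarks x 2 array from a face tracker, the speed cue could be quantified like this in Python:

import numpy as np

def landmark_distance_speed(landmarks, idx_a, idx_b, fps):
    # landmarks: (n_frames, n_landmarks, 2) pixel coordinates (assumed layout).
    # idx_a, idx_b: the landmark pair to track, e.g. the two lip corners.
    # fps: video frame rate, converting per-frame change to units per second.
    d = np.linalg.norm(landmarks[:, idx_a] - landmarks[:, idx_b], axis=1)
    speed = np.abs(np.diff(d)) * fps  # frame-to-frame change in distance
    return speed.mean()  # mean absolute speed over the clip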


Topics: Emotional expression (63%), Facial expression (59%), Body movement (54%)
Citations



Open access · Journal Article · DOI: 10.1007/S10803-021-05083-9
Abstract: To date, studies have not established whether autistic and non-autistic individuals differ in emotion recognition from facial motion cues when matched in terms of alexithymia. Here, autistic and non-autistic adults (N = 60) matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task, which employed dynamic point light displays of emotional facial expressions manipulated in terms of speed and spatial exaggeration. Autistic participants exhibited significantly lower accuracy for angry, but not happy or sad, facial motion with unmanipulated speed and spatial exaggeration. Autistic, and not alexithymic, traits were predictive of accuracy for angry facial motion with unmanipulated speed and spatial exaggeration. Alexithymic traits, in contrast, were predictive of the magnitude of both correct and incorrect emotion ratings.
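The abstract does not specify how the point-light stimuli were manipulated. One standard way to apply spatial exaggeration is to scale each frame's landmark displacement from a neutral configuration; a sketch under that assumption (function name hypothetical):

import numpy as np

def exaggerate(landmarks, neutral, factor):
    # landmarks: (n_frames, n_points, 2) point-light trajectory.
    # neutral: (n_points, 2) landmark positions of the neutral face.
    # factor: 1.0 leaves motion unchanged; >1 exaggerates, <1 attenuates.
    return neutral + factor * (landmarks - neutral)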


Topics: Alexithymia (55%), Facial expression (54%)

3 Citations


Open access · Journal Article · DOI: 10.1016/J.COGNITION.2021.104710
22 Mar 2021 · Cognition
Abstract: Recognition of emotional facial expressions is considered to be atypical in autism. This difficulty is thought to be due to the way that facial expressions are visually explored. Evidence for atypical visual exploration of emotional faces in autism is, however, equivocal. We propose that, where observed, atypical visual exploration of emotional facial expressions is due to alexithymia, a distinct but frequently co-occurring condition. In this eye-tracking study we tested the alexithymia hypothesis using a number of recent methodological advances to study eye gaze during several emotion processing tasks (emotion recognition, intensity judgements, free gaze), in 25 adults with, and 45 without, autism. A multilevel polynomial modelling strategy was used to describe the spatiotemporal dynamics of eye gaze to emotional facial expressions. Converging evidence from traditional and novel analysis methods revealed that atypical gaze to the eyes is best predicted by alexithymia in both autistic and non-autistic individuals. Information theoretic analyses also revealed differential effects of task on gaze patterns as a function of alexithymia, but not autism. These findings highlight factors underlying atypical emotion processing in autistic individuals, with wide-ranging implications for emotion research.
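The exact model specification is not given in the abstract. A hedged sketch of a multilevel polynomial (growth-curve style) model of gaze over time, using statsmodels with a random intercept per participant and hypothetical column names:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant x time bin, with
# gaze_eyes (proportion of gaze to the eye region), time (scaled bin centre),
# alexithymia (e.g., a TAS-20 score), and pid (participant identifier).
df = pd.read_csv("gaze_bins.csv")
model = smf.mixedlm(
    "gaze_eyes ~ (time + I(time**2) + I(time**3)) * alexithymia",
    data=df,
    groups=df["pid"],  # random intercept per participant (a simplification;
)                      # the paper may also have used random slopes)
result = model.fit()
print(result.summary())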


Topics: Alexithymia (59%), Facial expression (56%), Autism (56%)

3 Citations


Open access · Journal Article · DOI: 10.1111/PSYP.13945
01 Jan 2022 · Psychophysiology
Abstract: Using still pictures of emotional facial expressions as experimental stimuli, reduced amygdala responses or impaired recognition of basic emotions have repeatedly been found in people with psychopathic traits. The amygdala also plays an important role in short-latency facial mimicry responses. Since dynamic emotional facial expressions may have higher ecological validity than still pictures, we compared short-latency facial mimicry responses to dynamic and static emotional expressions between adolescents with psychopathic traits and normal controls. Facial EMG responses to videos or still pictures of emotional expressions (happiness, anger, sadness, fear) were measured. Responses to 500-ms dynamic expressions in videos, as well as to the subsequent 1500-ms phase of maximal (i.e., static) expression, were compared between male adolescents with disruptive behavior disorders and high (n = 14) or low (n = 17) callous-unemotional (CU) traits, and normal control subjects (n = 32). Responses to still pictures were also compared between groups. EMG responses to dynamic expressions were generally significantly smaller in the high-CU group than in the other two groups, which in general did not differ from each other. These group differences emerged gradually during the 500-ms stimulus presentation period but were generally already apparent a few hundred milliseconds after stimulus onset. Group differences were absent during the 1500-ms phase of maximal expression and during exposure to still pictures. Subnormal short-latency mimicry responses to dynamic emotional facial expressions in the high-CU group might have negative consequences for understanding others' emotional facial expressions in daily life, where facial interactions are primarily dynamic.
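The abstract implies window-based comparisons of rectified EMG. A small illustrative helper (names and sampling rate are assumptions, not the authors' pipeline):

import numpy as np

def window_mean(emg, fs, t0, t1):
    # Mean rectified EMG in the window [t0, t1) seconds after stimulus onset.
    # emg: 1-D baseline-corrected trace for one trial, sampled at fs Hz.
    return np.abs(emg[int(t0 * fs):int(t1 * fs)]).mean()

# e.g., the 500-ms dynamic phase and the subsequent 1500-ms static phase:
# dynamic = window_mean(trial, fs=1000, t0=0.0, t1=0.5)
# static = window_mean(trial, fs=1000, t0=0.5, t1=2.0)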


Topics: Emotional expression (66%), Facial expression (60%), Facial electromyography (57%)

Posted Content · DOI: 10.1002/AUR.2642
30 Nov 2021 · Autism Research
Abstract: Recent findings suggest that dynamic angry expressions must move at a higher speed before autistic individuals can successfully identify them. It is therefore plausible that autistic individuals do not have a 'deficit' in angry expression recognition, but rather that their internal representation of these expressions is characterised by very high-speed movement. In this study, matched groups of autistic and non-autistic adults completed a novel emotion-based task which employed dynamic displays of happy, angry and sad point light facial (PLF) expressions. On each trial, participants moved a slider to manipulate the speed of a PLF stimulus until it moved at a speed that, in their 'mind's eye', was typical of happy, angry or sad expressions. Participants were shown three types of PLF: full-face, eye region only, and mouth region only; the latter two were included to test whether differences in facial information sampling underpinned any dissimilarities in speed attributions. Across both groups, participants attributed the highest speeds to angry, then happy, then sad, facial motion. Participants increased the speed of angry and happy expressions by 41% and 27% respectively, and decreased the speed of sad expressions by 18%. This suggests that participants have 'caricatured' internal representations of emotion, in which emotion-related kinematic cues are over-emphasised. There were no differences between autistic and non-autistic individuals in the speeds attributed to full-face or partial-face angry, happy and sad expressions. Consequently, we find no evidence that autistic adults possess atypically fast internal representations of anger.
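The slider presumably time-warps the point-light trajectory. A sketch of one way to replay a trajectory at an arbitrary speed by resampling it in time (function name hypothetical):

import numpy as np
from scipy.interpolate import interp1d

def resample_speed(landmarks, factor):
    # landmarks: (n_frames, n_points, 2) trajectory; factor > 1 speeds up
    # playback when the output frames are shown at the original frame rate.
    n = landmarks.shape[0]
    t_old = np.linspace(0.0, 1.0, n)
    t_new = np.linspace(0.0, 1.0, max(2, int(round(n / factor))))
    return interp1d(t_old, landmarks, axis=0)(t_new)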


Topics: Facial expression (53%)

Book Chapter · DOI: 10.1007/978-3-030-89906-6_14
Khurshid Ahmad, Shirui Wang, Carl Vogel, +3 more (2 institutions)
28 Nov 2021
Abstract: Dealing with non-verbal communication will be a key breakthrough for future technologies, as much of the effort of 21st-century technology has gone into dealing with numbers and verbal communication. The automatic recognition of facial expressions is of theoretical and commercial interest, and to this end there must exist video databases that incorporate the idiosyncrasies of human existence: ethnicity, gender and age. We compare the performance of three major emotion recognition software systems on real-life videos of politicians from across the world. Our sample of 45 videos (total length 2 h 26 min, comprising 219,150 frames) is composed of male and female politicians ranging in age from 40 to 78, with well-defined differences in gender and nationality/ethnicity. Our sample of images is partially posed and partially spontaneous, reflecting the demeanour of politicians when they engage in speech making. Our target systems, Microsoft Azure Cognitive Services Face API, Affectiva AFFDEX and Emotient FACET, have usually been trained on posed expressions, with limited testing on spontaneous images, so in effect we are operating at the edge of these systems' performance. There are similarities in the performance of these systems on some emotions, especially joy, but there are differences in the recognition of others, such as anger. There are also gender differences, as well as differences based on age and race. This is an important issue: as more and more video data becomes available, video analytics that can deal with aspects of cognition, like emotion, accurately and across cultural/gender/ethnic divides will be a major component of future technologies.
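The abstract does not say how agreement between the three systems was scored. One plausible way to compare per-frame emotion labels from two systems is Cohen's kappa; the CSV files and column name below are hypothetical:

import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-frame label exports, one row per video frame.
a = pd.read_csv("affdex_frames.csv")["emotion"]
b = pd.read_csv("facet_frames.csv")["emotion"]
print(cohen_kappa_score(a, b))  # chance-corrected inter-system agreement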



References



Open access · Journal Article · DOI: 10.3758/BF03193146
Abstract: G*Power (Erdfelder, Faul, & Buchner, 1996) was designed as a general stand-alone power analysis program for statistical tests commonly used in social and behavioral research. G*Power 3 is a major extension of, and improvement over, the previous versions. It runs on widely used computer platforms (i.e., Windows XP, Windows Vista, and Mac OS X 10.4) and covers many different statistical tests of the t, F, and χ2 test families. In addition, it includes power analyses for z tests and some exact tests. G*Power 3 provides improved effect size calculators and graphic options, supports both distribution-based and design-based input modes, and offers all types of power analyses in which users might be interested. Like its predecessors, G*Power 3 is free.
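The same computations G*Power performs are also available in open-source libraries. For instance, the per-group sample size for an independent-samples t test (d = 0.5, alpha = .05, power = .80) can be reproduced with statsmodels:

from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5, alpha=0.05, power=0.8, alternative="two-sided"
)
print(round(n_per_group))  # ~64 participants per group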


Topics: Windows Vista (55%)

30,063 Citations


Dataset · DOI: 10.1037/T27734-000
14 Jan 2019

3,470 Citations


Journal Article · DOI: 10.1080/02699939508408966
James J. Gross, Robert W. Levenson (1 institution)
Abstract: Researchers interested in emotion have long struggled with the problem of how to elicit emotional responses in the laboratory. In this article, we summarise five years of work to develop a set of films that reliably elicit each of eight emotional states (amusement, anger, contentment, disgust, fear, neutral, sadness, and surprise). After evaluating over 250 films, we showed selected film clips to an ethnically diverse sample of 494 English-speaking subjects. We then chose the two best films for each of the eight target emotions based on the intensity and discreteness of subjects' responses to each film. We found that our set of 16 films successfully elicited amusement, anger, contentment, disgust, sadness, surprise, a relatively neutral state, and, to a lesser extent, fear. We compare this set of films with another set recently described by Philippot (1993), and indicate that detailed instructions for creating our set of film stimuli will be provided on request.


Topics: Emotion classification (59%), Sadness (58%), Disgust (53%)

2,196 Citations


Open access · Journal Article · DOI: 10.1037//0022-3514.70.3.614
Rainer Banse, Klaus R. Scherer (1 institution)
Abstract: Professional actors' portrayals of 14 emotions varying in intensity and valence were presented to judges. The results on decoding replicate earlier findings on the ability of judges to infer vocally expressed emotions with much-better-than-chance accuracy, including consistently found differences in the recognizability of different emotions. A total of 224 portrayals were subjected to digital acoustic analysis to obtain profiles of vocal parameters for different emotions. The data suggest that vocal parameters not only index the degree of intensity typical for different emotions but also differentiate valence or quality aspects. The data are also used to test theoretical predictions on vocal patterning based on the component process model of emotion (K.R. Scherer, 1986). Although most hypotheses are supported, some need to be revised on the basis of the empirical evidence. Discriminant analysis and jackknifing show remarkably high hit rates and patterns of confusion that closely mirror those found for listener-judges.
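Discriminant analysis with jackknifing, as applied here to the 224 portrayals, corresponds to leave-one-out cross-validated classification. A sketch with scikit-learn, using random placeholder data in place of the study's acoustic profiles:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Placeholder stand-in for the study's data: 224 portrayals, 10 vocal
# parameters, 14 emotion categories (real features would come from
# acoustic analysis).
rng = np.random.default_rng(0)
X = rng.normal(size=(224, 10))
y = rng.integers(0, 14, size=224)

# Jackknifed hit rate: train on all portrayals but one, test on the held-out one.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(f"hit rate: {scores.mean():.1%}")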


Topics: Valence (psychology) (56%)

1,722 Citations


Journal Article · DOI: 10.1080/02699930903274322
Abstract: Using emotional film clips is one of the most popular and effective methods of emotion elicitation. The main goal of the present study was to develop and test the effectiveness of a new and comprehensive set of emotional film excerpts. Fifty film experts were asked to remember specific film scenes that elicited fear, anger, sadness, disgust, amusement, tenderness, as well as emotionally neutral scenes. For each emotion, the 10 most frequently mentioned scenes were selected and cut into film clips. Next, 364 participants viewed the film clips in individual laboratory sessions and rated each film on multiple dimensions. Results showed that the film clips were effective with regard to several criteria, such as emotional discreteness, arousal, and positive and negative affect. Finally, ranking scores were computed for 24 classification criteria: subjective arousal, positive and negative affect (derived from the PANAS; Watson & Tellegen, 1988), positive and negative affect scores derived from the Differential Emotions Scale (DES; Izard et al., 1974), six emotional discreteness scores (for anger, disgust, sadness, fear, amusement and tenderness), and 15 "mixed feelings" scores assessing the effectiveness of each film excerpt at producing blends of specific emotions. In addition, a number of emotionally neutral film clips were also validated. The database and editing instructions for constructing the film clips have been made freely available on a website.


Topics: Sadness (54%), Valence (psychology) (54%), Disgust (51%)

625 Citations


Performance Metrics
No. of citations received by the paper in previous years:

Year  Citations
2022  2
2021  4