Facial feedback affects valence judgments of dynamic and static emotional expressions
Sylwia Hyniewska, Wataru Sato, et al.
TLDR
Measuring emotion judgments in terms of valence and arousal dimensions while comparing dynamic vs. static presentations of facial expressions suggests that facial feedback mechanisms contribute to the judgment of the valence of emotional facial expressions.
Abstract
The ability to judge others' emotions is required for the establishment and maintenance of smooth interactions in a community. Several lines of evidence suggest that the attribution of meaning to a face is influenced by the facial actions produced by an observer during the observation of a face. However, empirical studies testing causal relationships between observers' facial actions and emotion judgments have reported mixed findings. This issue was investigated by measuring emotion judgments in terms of valence and arousal dimensions while comparing dynamic vs. static presentations of facial expressions. We presented pictures and videos of facial expressions of anger and happiness. Participants (N = 36) were asked to differentiate between the gender of faces by activating the corrugator supercilii muscle (brow lowering) or the zygomaticus major muscle (cheek raising). They were also asked to evaluate the internal states of the stimuli using the affect grid while maintaining the facial action until they finished responding. The cheek-raising condition increased the attributed valence scores compared with the brow-lowering condition. This effect of facial actions was observed for static as well as for dynamic facial expressions. These data suggest that facial feedback mechanisms contribute to the judgment of the valence of emotional facial expressions.
Citations
Journal Article
Fashioning the Face: Sensorimotor Simulation Contributes to Facial Expression Recognition
TL;DR: This work integrates recent evidence in favor of a role for sensorimotor simulation in emotion recognition and connects this account to a domain-general understanding of how sensory information from multiple modalities is integrated to generate perceptual predictions in the brain.
Journal Article
Social Cognition through the Lens of Cognitive and Clinical Neuroscience.
TL;DR: The evidence summarized here strongly suggests that the development of remediation procedures for social cognitive skills will represent a future field of translational research in clinical neuroscience.
Journal Article
From face to face: the contribution of facial mimicry to cognitive and emotional empathy.
TL;DR: This work supports the view that mimicry occurs depending on the social context, serves as a tool to affiliate, and is involved in cognitive as well as emotional empathy.
Proceedings Article
Can you tell the robot by the voice?: an exploratory study on the role of voice in the perception of robots
Conor McGinn, Ilaria Torre, et al.
TL;DR: It is suggested that voice design should be considered more thoroughly when planning spoken human-robot interactions, because people associate voices with robot pictures even when the content of the spoken utterances is unintelligible.
Journal Article
Sensorimotor simulation and emotion processing: Impairing facial action increases semantic retrieval demands
TL;DR: The selective impact of facial motor interference on the brain response to lower face expressions supports sensorimotor models of emotion understanding.
References
Journal Article
An argument for basic emotions
TL;DR: This work has shown that not only the intensity of an emotion but also its direction may vary greatly both in the amygdala and in the brain during the course of emotion regulation.
Journal Article
Core Affect and the Psychological Construction of Emotion
TL;DR: At the heart of emotion, mood, and any other emotionally charged event are states experienced as simply feeling good or bad, energized or enervated, which influence reflexes, perception, cognition, and behavior.
Journal Article
Imitation of facial and manual gestures by human neonates.
TL;DR: Infants between 12 and 21 days of age can imitate both facial and manual gestures; this behavior cannot be explained in terms of either conditioning or innate releasing mechanisms.
Journal Article
Affect grid: A single-item scale of pleasure and arousal
TL;DR: The Affect Grid is a single-item scale designed as a quick means of assessing affect along the dimensions of pleasure-displeasure and arousal-sleepiness, suitable for any study that requires judgments about affect of either a descriptive or a subjective kind.
Journal Article
Unconscious Facial Reactions to Emotional Facial Expressions
TL;DR: The results show that both positive and negative emotional reactions can be unconsciously evoked, and particularly that important aspects of emotional face-to-face communication can occur on an unconscious level.