
Showing papers on "Facial expression published in 2008"


Journal ArticleDOI
05 Nov 2008
TL;DR: A new corpus, the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory at the University of Southern California (USC), provides detailed information about actors' facial expressions and hand movements during scripted and spontaneous spoken communication scenarios.
Abstract: Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus named the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic sessions with markers on the face, head, and hands, which provide detailed information about their facial expressions and hand movements during scripted and spontaneous spoken communication scenarios. The actors performed selected emotional scripts and also improvised hypothetical scenarios designed to elicit specific types of emotions (happiness, anger, sadness, frustration and neutral state). The corpus contains approximately 12 h of data. The detailed motion capture information, the interactive setting to elicit authentic emotions, and the size of the database make this corpus a valuable addition to the existing databases in the community for the study and modeling of multimodal and expressive human communication.

2,359 citations


Book ChapterDOI
23 Dec 2008
TL;DR: A new 3D face database that includes a rich set of expressions, systematic variation of poses and different types of occlusions is presented, which can be a very valuable resource for development and evaluation of algorithms on face recognition under adverse conditions and facial expression analysis as well as for facial expression synthesis.
Abstract: A new 3D face database that includes a rich set of expressions, systematic variation of poses and different types of occlusions is presented in this paper. This database is unique from three aspects: i) the facial expressions are composed of a judiciously selected subset of Action Units as well as the six basic emotions, and many actors/actresses are incorporated to obtain more realistic expression data; ii) a rich set of head pose variations is available; and iii) different types of face occlusions are included. Hence, this new database can be a very valuable resource for development and evaluation of algorithms on face recognition under adverse conditions and facial expression analysis as well as for facial expression synthesis.

819 citations


Journal ArticleDOI
TL;DR: The results confirm that the amygdala responds to both positive and negative stimuli, with a preference for faces depicting emotional expressions, and provide strong support for a functional dissociation between left and right amygdala in terms of temporal dynamics.

798 citations


Journal ArticleDOI
TL;DR: Results show a robust link between antisocial behavior and specific deficits in recognizing fearful expressions and suggest dysfunction among antisocial individuals in specified neural substrates, namely the amygdala, involved in processing fearful facial affect.

769 citations


Journal ArticleDOI
TL;DR: These findings support the contention that callous and unemotional personality traits are associated with reduced amygdala response to distress-based social cues.
Abstract: Objective: Extensive work implicates abnormal amygdala activation in emotional facial expression processing in adults with callous-unemotional traits. However, no research has examined amygdala response to emotional facial expressions in adolescents with disruptive behavior and callous-unemotional traits. Moreover, despite high comorbidity of callous-unemotional traits and attention deficit hyperactivity disorder (ADHD), no research has attempted to distinguish neural correlates of pediatric callous-unemotional traits and ADHD. Method: Participants were 36 children and adolescents (ages 10–17 years); 12 had callous-unemotional traits and either conduct disorder or oppositional defiant disorder, 12 had ADHD, and 12 were healthy comparison subjects. Functional MRI was used to assess amygdala activation patterns during processing of fearful facial expressions. Patterns in the callous-unemotional traits group were compared with those in the ADHD and comparison groups. Results: In youths with callous-unemotion...

750 citations


Journal ArticleDOI
TL;DR: In this paper, a validation study of 490 pictures of human facial expressions from the Karolinska Directed Emotional Faces (KDEF) database was conducted, in which pictures were evaluated on emotional content and rated on intensity and arousal scales.
Abstract: Although affective facial pictures are widely used in emotion research, standardised affective stimuli sets are rather scarce, and the existing sets have several limitations. We therefore conducted a validation study of 490 pictures of human facial expressions from the Karolinska Directed Emotional Faces database (KDEF). Pictures were evaluated on emotional content and were rated on an intensity and arousal scale. Results indicate that the database contains a valid set of affective facial pictures. Hit rates, intensity, and arousal of the 20 best KDEF pictures for each basic emotion are provided in an appendix.

667 citations
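
The hit rates reported for the KDEF validation are simply the proportion of raters who chose the intended emotion for each picture. A minimal pandas sketch of that computation on a hypothetical long-format ratings table (the column names, picture IDs, and numbers below are illustrative, not the published KDEF data):

```python
import pandas as pd

# Hypothetical ratings: one row per rater x picture (illustrative values only).
ratings = pd.DataFrame({
    "picture":          ["F01_HA", "F01_HA", "F01_HA", "M07_AN", "M07_AN", "M07_AN"],
    "intended_emotion": ["happy",  "happy",  "happy",  "angry",  "angry",  "angry"],
    "chosen_emotion":   ["happy",  "happy",  "sad",    "angry",  "disgust", "angry"],
    "intensity":        [7, 8, 5, 6, 4, 7],
    "arousal":          [5, 6, 4, 7, 6, 7],
})

# Hit rate per picture: proportion of raters who chose the intended emotion.
ratings["hit"] = ratings["chosen_emotion"] == ratings["intended_emotion"]
per_picture = (ratings.groupby(["picture", "intended_emotion"])
               .agg(hit_rate=("hit", "mean"),
                    intensity=("intensity", "mean"),
                    arousal=("arousal", "mean"))
               .reset_index())

# "Best" pictures per emotion: the highest hit rates (top 20 in the full set).
best = (per_picture.sort_values(["intended_emotion", "hit_rate"], ascending=[True, False])
        .groupby("intended_emotion").head(20))
print(best)
```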


Proceedings ArticleDOI
Lijun Yin, Xiaochen Chen, Yi Sun, T. Worm, Michael Reale
01 Sep 2008
TL;DR: This paper presents a newly created high-resolution 3D dynamic facial expression database, which is made available to the scientific research community and has been validated through the authors' facial expression recognition experiment using an HMM based 3D spatio-temporal facial descriptor.
Abstract: Face information processing relies on the quality of the data resource. From the data modality point of view, a face database can be 2D or 3D, and static or dynamic. From the task point of view, the data can be used for research of computer based automatic face recognition, face expression recognition, face detection, or cognitive and psychological investigation. With the advancement of 3D imaging technologies, 3D dynamic facial sequences (called 4D data) have been used for face information analysis. In this paper, we focus on the modality of 3D dynamic data for the task of facial expression recognition. We present a newly created high-resolution 3D dynamic facial expression database, which is made available to the scientific research community. The database contains 606 3D facial expression sequences captured from 101 subjects of various ethnic backgrounds. The database has been validated through our facial expression recognition experiment using an HMM based 3D spatio-temporal facial descriptor. It is expected that such a database shall be used to facilitate the facial expression analysis from a static 3D space to a dynamic 3D space, with a goal of scrutinizing facial behavior at a higher level of detail in a real 3D spatio-temporal domain.

537 citations
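
The validation experiment above is described only at a high level. The sketch below illustrates the general pattern of HMM-based sequence classification (one model per expression, classification by maximum log-likelihood) using the hmmlearn package and synthetic per-frame feature vectors; it is not the authors' 3D spatio-temporal descriptor.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
EXPRESSIONS = ["anger", "happiness", "sadness"]

def fake_sequence(label, n_frames=40, dim=6):
    """Stand-in for a per-frame facial feature sequence (illustrative only)."""
    return rng.normal(loc=EXPRESSIONS.index(label), scale=1.0, size=(n_frames, dim))

# Train one HMM per expression on that expression's training sequences.
models = {}
for label in EXPRESSIONS:
    seqs = [fake_sequence(label) for _ in range(20)]
    X, lengths = np.vstack(seqs), [len(s) for s in seqs]
    m = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
    m.fit(X, lengths)
    models[label] = m

def classify(seq):
    # Pick the expression whose HMM assigns the highest log-likelihood.
    return max(models, key=lambda lbl: models[lbl].score(seq))

print(classify(fake_sequence("happiness")))
```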


Journal ArticleDOI
TL;DR: By monitoring eye movements, it is demonstrated that characteristic fixation patterns previously thought to be determined solely by the facial expression are systematically modulated by emotional context already at very early stages of visual processing, even by the first time the face is fixated.
Abstract: Current theories of emotion perception posit that basic facial expressions signal categorically discrete emotions or affective dimensions of valence and arousal. In both cases, the information is thought to be directly "read out" from the face in a way that is largely immune to context. In contrast, the three studies reported here demonstrated that identical facial configurations convey strikingly different emotions and dimensional values depending on the affective context in which they are embedded. This effect is modulated by the similarity between the target facial expression and the facial expression typically associated with the context. Moreover, by monitoring eye movements, we demonstrated that characteristic fixation patterns previously thought to be determined solely by the facial expression are systematically modulated by emotional context already at very early stages of visual processing, even by the first time the face is fixated. Our results indicate that the perception of basic facial expressions is not context invariant and can be categorically altered by context at early perceptual levels.

523 citations


Journal ArticleDOI
TL;DR: Convergent results provide support for the Darwinian hypothesis that facial expressions are not arbitrary configurations for social communication, but rather, expressions may have originated in altering the sensory interface with the physical world.
Abstract: It has been proposed that facial expression production originates in sensory regulation. Here we demonstrate that facial expressions of fear are configured to enhance sensory acquisition. A statistical model of expression appearance revealed that fear and disgust expressions have opposite shape and surface reflectance features. We hypothesized that this reflects a fundamental antagonism serving to augment versus diminish sensory exposure. In keeping with this hypothesis, when subjects posed expressions of fear, they had a subjectively larger visual field, faster eye movements during target localization and an increase in nasal volume and air velocity during inspiration. The opposite pattern was found for disgust. Fear may therefore work to enhance perception, whereas disgust dampens it. These convergent results provide support for the Darwinian hypothesis that facial expressions are not arbitrary configurations for social communication, but rather, expressions may have originated in altering the sensory interface with the physical world.

470 citations


Journal ArticleDOI
TL;DR: Debates about some features of this processing now suggest that while the amygdala can process fearful facial expressions in the absence of conscious perception, and while there is some degree of preattentive processing, this depends on the context and is not necessarily more rapid than cortical processing routes.

437 citations


Journal ArticleDOI
TL;DR: In general, happy faces were identified more accurately, earlier, and faster than other faces, whereas judgments of fearful faces were the least accurate, the latest, and the slowest.
Abstract: Participants judged which of seven facial expressions (neutrality, happiness, anger, sadness, surprise, fear, and disgust) were displayed by a set of 280 faces corresponding to 20 female and 20 male models of the Karolinska Directed Emotional Faces database (Lundqvist, Flykt, & Öhman, 1998). Each face was presented under free-viewing conditions (to 63 participants) and also for 25, 50, 100, 250, and 500 msec (to 160 participants), to examine identification thresholds. Measures of identification accuracy, types of errors, and reaction times were obtained for each expression. In general, happy faces were identified more accurately, earlier, and faster than other faces, whereas judgments of fearful faces were the least accurate, the latest, and the slowest. Norms for each face and expression regarding level of identification accuracy, errors, and reaction times may be downloaded from www.psychonomic.org/archive/.

Journal ArticleDOI
TL;DR: The findings suggest that the sexually dimorphic facial width-to-height ratio may be an ‘honest signal’ of propensity for aggressive behaviour.
Abstract: Facial characteristics are an important basis for judgements about gender, emotion, personality, motivational states and behavioural dispositions. Based on a recent finding of a sexual dimorphism in facial metrics that is independent of body size, we conducted three studies to examine the extent to which individual differences in the facial width-to-height ratio were associated with trait dominance (using a questionnaire) and aggression during a behavioural task and in a naturalistic setting (varsity and professional ice hockey). In study 1, men had a larger facial width-to-height ratio, higher scores of trait dominance, and were more reactively aggressive compared with women. Individual differences in the facial width-to-height ratio predicted reactive aggression in men, but not in women (predicted 15% of variance). In studies 2 (male varsity hockey players) and 3 (male professional hockey players), individual differences in the facial width-to-height ratio were positively related to aggressive behaviour as measured by the number of penalty minutes per game obtained over a season (predicted 29 and 9% of the variance, respectively). Together, these findings suggest that the sexually dimorphic facial width-to-height ratio may be an ‘honest signal’ of propensity for aggressive behaviour.
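
The "percentage of variance predicted" quoted above is the squared correlation (R²) from a simple linear regression of the aggression measure on the facial width-to-height ratio. A minimal sketch of that computation on synthetic numbers (the values are made up; only the procedure is illustrated):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic data: facial width-to-height ratio and penalty minutes per game.
fwhr = rng.normal(1.85, 0.12, size=30)
penalty_minutes = 2.0 + 6.0 * (fwhr - fwhr.mean()) + rng.normal(0, 0.7, size=30)

res = stats.linregress(fwhr, penalty_minutes)
r_squared = res.rvalue ** 2   # proportion of variance accounted for by fWHR
print(f"slope = {res.slope:.2f}, R^2 = {r_squared:.2f}, p = {res.pvalue:.4f}")
```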

Journal ArticleDOI
TL;DR: Results suggest that the level of facial mimicry varies as a function of group membership and that mimicry levels are influenced by the kind of emotion displayed by the expresser.

Journal ArticleDOI
TL;DR: It is concluded that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection of emotional facial expressions.
Abstract: In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
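
The saliency modelling is only summarised above. As a rough stand-in for the model used in the study, the sketch below computes a simple centre-surround (difference-of-Gaussians) saliency map and compares mean saliency inside hand-defined eye and mouth regions; the image, ROIs, and parameters are all illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_saliency(img, sigma_center=2, sigma_surround=8):
    """Crude centre-surround (difference-of-Gaussians) saliency map."""
    img = img.astype(float)
    sal = np.abs(gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround))
    return sal / (sal.max() + 1e-9)

face = np.random.rand(256, 256)               # replace with a grayscale face image
eye_roi = (slice(80, 120), slice(60, 196))    # (rows, cols), hypothetical
mouth_roi = (slice(170, 210), slice(90, 166))

sal = dog_saliency(face)
print("mean saliency, eyes :", sal[eye_roi].mean())
print("mean saliency, mouth:", sal[mouth_roi].mean())
```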

Journal ArticleDOI
15 Oct 2008-Pain
TL;DR: The findings support the concept of a core pain expression with desirable psychometric properties and are consistent with the suggestion of individual differences in pain expressiveness.
Abstract: The present study examined psychometric properties of facial expressions of pain. A diverse sample of 129 people suffering from shoulder pain underwent a battery of active and passive range-of-motion tests to their affected and unaffected limbs. The same tests were repeated on a second occasion. Participants rated the maximum pain induced by each test on three self-report scales. Facial actions were measured with the Facial Action Coding System. Several facial actions discriminated painful from non-painful movements; however, brow-lowering, orbit tightening, levator contraction and eye closing appeared to constitute a distinct, unitary action. An index of pain expression based on these actions demonstrated test-retest reliability and concurrent validity with self-reports of pain. The findings support the concept of a core pain expression with desirable psychometric properties. They are also consistent with the suggestion of individual differences in pain expressiveness. Reasons for varying reports of relations between pain expression and self-reports in previous studies are discussed.
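
An index of the kind described, built from brow-lowering, orbit tightening, levator contraction and eye closing, can be formed by summing the corresponding FACS action-unit intensity codes and correlating the result with self-reported pain (concurrent validity). A minimal sketch with invented intensity codes; the column names and the simple sum are illustrative assumptions, not the authors' exact scoring rule.

```python
import pandas as pd

# Hypothetical per-movement data: FACS intensity codes (0-5) for the four core
# pain-related actions, plus a self-reported pain rating for the same movement.
df = pd.DataFrame({
    "brow_lower":    [0, 2, 4, 1, 3],
    "orbit_tighten": [0, 1, 4, 2, 3],
    "levator":       [0, 1, 3, 0, 2],
    "eye_close":     [0, 0, 1, 0, 1],
    "self_report":   [0, 3, 8, 2, 6],
})

core_actions = ["brow_lower", "orbit_tighten", "levator", "eye_close"]

# Composite pain expression index: sum of the core action intensities.
df["pain_expression_index"] = df[core_actions].sum(axis=1)

# Concurrent validity: correlation between the index and self-reported pain.
print(df["pain_expression_index"].corr(df["self_report"]))
```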

Journal ArticleDOI
TL;DR: Although untrained observers performed only slightly above chance at detecting deception, inconsistent emotional leakage occurred in 100% of participants at least once and lasted longer than the current definition of a microexpression suggests.
Abstract: The widespread supposition that aspects of facial communication are uncontrollable and can betray a deceiver's true emotion has received little empirical attention. We examined the presence of inconsistent emotional expressions and "microexpressions" (1/25-1/5 of a second) in genuine and deceptive facial expressions. Participants viewed disgusting, sad, frightening, happy, and neutral images, responding to each with a genuine or deceptive (simulated, neutralized, or masked) expression. Each 1/30-s frame (104,550 frames in 697 expressions) was analyzed for the presence and duration of universal expressions, microexpressions, and blink rate. Relative to genuine emotions, masked emotions were associated with more inconsistent expressions and an elevated blink rate; neutralized emotions showed a decreased blink rate. Negative emotions were more difficult to falsify than happiness. Although untrained observers performed only slightly above chance at detecting deception, inconsistent emotional leakage occurred in 100% of participants at least once and lasted longer than the current definition of a microexpression suggests. Microexpressions were exhibited by 21.95% of participants in 2% of all expressions, and in the upper or lower face only.
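
At the 1/30 s frame rate used above, the 1/25-1/5 s microexpression window corresponds to expression runs of roughly 2-6 consecutive frames (about 67-200 ms); a single frame (~33 ms) falls just below it. A minimal sketch of flagging such short runs in a per-frame expression code; the labels and data are invented.

```python
from itertools import groupby

FPS = 30
MIN_S, MAX_S = 1 / 25, 1 / 5   # microexpression duration window in seconds

def find_microexpressions(frame_codes, neutral="neutral"):
    """Return (label, start_frame, n_frames) for short non-neutral runs."""
    found, idx = [], 0
    for label, run in groupby(frame_codes):
        n = len(list(run))
        if label != neutral and MIN_S <= n / FPS <= MAX_S:
            found.append((label, idx, n))
        idx += n
    return found

# Illustrative per-frame codes: a 3-frame flash of disgust inside a neutral stream.
codes = ["neutral"] * 20 + ["disgust"] * 3 + ["neutral"] * 20
print(find_microexpressions(codes))   # -> [('disgust', 20, 3)]
```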

Journal ArticleDOI
TL;DR: In this article, the authors found that dopamine-associated reward-processing regions of the brain were activated when mothers viewed their own 5- to 10-month-old infant's face compared with an unknown infant's face.
Abstract: OBJECTIVES. Our goal was to determine how a mother's brain responds to her own infant's facial expressions, comparing happy, neutral, and sad face affect. METHODS. In an event-related functional MRI study, 28 first-time mothers were shown novel face images of their own 5- to 10-month-old infant and a matched unknown infant. Sixty unique stimuli from 6 categories (own-happy, own-neutral, own-sad, unknown-happy, unknown-neutral, and unknown-sad) were presented randomly for 2 seconds each, with a variable 2- to 6-second interstimulus interval. RESULTS. Key dopamine-associated reward-processing regions of the brain were activated when mothers viewed their own infant's face compared with an unknown infant's face. These included the ventral tegmental area/substantia nigra regions, the striatum, and frontal lobe regions involved in (1) emotion processing (medial prefrontal, anterior cingulate, and insula cortex), (2) cognition (dorsolateral prefrontal cortex), and (3) motor/behavioral outputs (primary motor area). Happy, but not neutral or sad own-infant faces, activated nigrostriatal brain regions interconnected by dopaminergic neurons, including the substantia nigra and dorsal putamen. A region-of-interest analysis revealed that activation in these regions was related to positive infant affect (happy > neutral > sad) for each own–unknown infant-face contrast. CONCLUSIONS. When first-time mothers see their own infant's face, an extensive brain network seems to be activated, wherein affective and cognitive information may be integrated and directed toward motor/behavioral outputs. Dopaminergic reward-related brain regions are activated specifically in response to happy, but not sad, infant faces. Understanding how a mother responds uniquely to her own infant, when smiling or crying, may be the first step in understanding the neural basis of mother–infant attachment.

Journal ArticleDOI
TL;DR: Prior research on amygdala function and development reveals a need for more work examining developmental changes in the amygdala's response to fearful faces and in amygdala functional connectivity during face processing.
Abstract: Several lines of evidence implicate the amygdala in face-emotion processing, particularly for fearful facial expressions. Related findings suggest that face-emotion processing engages the amygdala within an interconnected circuitry that can be studied using a functional-connectivity approach. Past work also underscores important functional changes in the amygdala during development. Taken together, prior research on amygdala function and development reveals a need for more work examining developmental changes in the amygdala's response to fearful faces and in amygdala functional connectivity during face processing. The present study used event-related functional magnetic resonance imaging to compare 31 adolescents (9-17 years old) and 30 adults (21-40 years old) on activation to fearful faces in the amygdala and other regions implicated in face processing. Moreover, these data were used to compare patterns of amygdala functional connectivity in adolescents and adults. During passive viewing, adolescents demonstrated greater amygdala and fusiform activation to fearful faces than did adults. Functional connectivity analysis revealed stronger connectivity between the amygdala and the hippocampus in adults than in adolescents. Within each group, variability in age did not correlate with amygdala response, and sex-related developmental differences in amygdala response were not found. Eye movement data collected outside of the magnetic resonance imaging scanner using the same task suggested that developmental differences in amygdala activation were not attributable to differences in eye-gaze patterns. Amygdala hyperactivation in response to fearful faces may explain increased vulnerability to affective disorders in adolescence; stronger amygdala-hippocampus connectivity in adults than adolescents may reflect maturation in learning or habituation to facial expressions.
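
The functional-connectivity comparison reported above reduces to computing, for each subject, the correlation between two regional time series (amygdala and hippocampus) and comparing those correlations across groups. A minimal sketch on synthetic time series; ROI extraction and preprocessing are not shown, and the group sizes and coupling values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N_VOL = 200   # number of fMRI volumes per subject

def subject_connectivity(coupling):
    """Correlation between two synthetic ROI time series with a given coupling."""
    amygdala = rng.normal(size=N_VOL)
    hippocampus = coupling * amygdala + np.sqrt(1 - coupling**2) * rng.normal(size=N_VOL)
    return np.corrcoef(amygdala, hippocampus)[0, 1]

adults      = [subject_connectivity(0.6) for _ in range(30)]
adolescents = [subject_connectivity(0.3) for _ in range(31)]

# Fisher z-transform the correlations before the group comparison.
t, p = stats.ttest_ind(np.arctanh(adults), np.arctanh(adolescents))
print(f"adults vs. adolescents: t = {t:.2f}, p = {p:.4f}")
```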

Journal ArticleDOI
TL;DR: In this paper, the authors targeted the right occipital face area (rOFA) and the face region of right somatosensory cortex (rSC) with repetitive transcranial magnetic stimulation (rTMS) while participants discriminated facial expressions.
Abstract: Theories of embodied cognition propose that recognizing facial expressions requires visual processing followed by simulation of the somatovisceral responses associated with the perceived expression. To test this proposal, we targeted the right occipital face area (rOFA) and the face region of right somatosensory cortex (rSC) with repetitive transcranial magnetic stimulation (rTMS) while participants discriminated facial expressions. rTMS selectively impaired discrimination of facial expressions at both sites but had no effect on a matched face identity task. Site specificity within the rSC was demonstrated by targeting rTMS at the face and finger regions while participants performed the expression discrimination task. rTMS targeted at the face region impaired task performance relative to rTMS targeted at the finger region. To establish the temporal course of visual and somatosensory contributions to expression processing, double-pulse TMS was delivered at different times to rOFA and rSC during expression discrimination. Accuracy dropped when pulses were delivered at 60-100 ms at rOFA and at 100-140 and 130-170 ms at rSC. These sequential impairments at rOFA and rSC support embodied accounts of expression recognition as well as hierarchical models of face processing. The results also demonstrate that nonvisual cortical areas contribute during early stages of expression processing.

Journal ArticleDOI
TL;DR: Significant classification of patients in an acute depressive episode was achieved with whole brain pattern analysis of fMRI data, and the prediction of treatment response showed a trend toward significance due to the reduced power of the subsample.
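
"Whole brain pattern analysis" in this context generally means training a multivariate classifier on voxel-wise activation patterns and estimating accuracy with cross-validation. A minimal scikit-learn sketch on synthetic features, labelled patient versus control; it is not the authors' pipeline and the numbers are invented.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for voxel-wise activation maps: 40 subjects x 500 "voxels".
# Labels: 1 = acute depressive episode, 0 = healthy control.
X = rng.normal(size=(40, 500))
y = np.repeat([0, 1], 20)
X[y == 1, :50] += 0.8          # inject a weak group difference

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```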

Journal ArticleDOI
TL;DR: Data provide further support for elevated amygdala activity in depression and suggest that anterior cingulate activity may be a predictor of treatment response to both pharmacotherapy and CBT.

Journal ArticleDOI
TL;DR: It is argued that face evaluation is an extension of functionally adaptive systems for understanding the communicative meaning of emotional expressions; this account predicts a nonlinear response in the amygdala to face trustworthiness, confirmed in functional magnetic resonance imaging studies, and dissociations between processing of facial identity and face evaluation, confirmed in studies with developmental prosopagnosics.
Abstract: People routinely make various trait judgments from facial appearance, and such judgments affect important social outcomes. These judgments are highly correlated with each other, reflecting the fact that valence evaluation permeates trait judgments from faces. Trustworthiness judgments best approximate this evaluation, consistent with evidence about the involvement of the amygdala in the implicit evaluation of face trustworthiness. Based on computer modeling and behavioral experiments, I argue that face evaluation is an extension of functionally adaptive systems for understanding the communicative meaning of emotional expressions. Specifically, in the absence of diagnostic emotional cues, trustworthiness judgments are an attempt to infer behavioral intentions signaling approach/avoidance behaviors. Correspondingly, these judgments are derived from facial features that resemble emotional expressions signaling such behaviors: happiness and anger for the positive and negative ends of the trustworthiness continuum, respectively. The emotion overgeneralization hypothesis can explain highly efficient but not necessarily accurate trait judgments from faces, a pattern that appears puzzling from an evolutionary point of view and also generates novel predictions about brain responses to faces. Specifically, this hypothesis predicts a nonlinear response in the amygdala to face trustworthiness, confirmed in functional magnetic resonance imaging (fMRI) studies, and dissociations between processing of facial identity and face evaluation, confirmed in studies with developmental prosopagnosics. I conclude with some methodological implications for the study of face evaluation, focusing on the advantages of formally modeling representation of faces on social dimensions.
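
The predicted nonlinear amygdala response to face trustworthiness can be tested by comparing a linear fit against a quadratic (U-shaped) fit of response versus the trustworthiness dimension. A minimal numpy sketch of that model-comparison logic on made-up values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up data: trustworthiness scores and a response strongest at both extremes.
trust = np.linspace(-3, 3, 60)
response = 0.4 * trust**2 + rng.normal(0, 0.5, size=trust.size)

def r2(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

lin = np.polyval(np.polyfit(trust, response, 1), trust)
quad = np.polyval(np.polyfit(trust, response, 2), trust)
print("linear fit R^2:   ", round(r2(response, lin), 3))
print("quadratic fit R^2:", round(r2(response, quad), 3))
```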

Journal ArticleDOI
TL;DR: The authors argue that the evidence is consistent with claims that (a) preattentive search processes are sensitive to and influenced by facial expressions of emotion, (b) attention guidance is influenced by a dynamic interplay of emotional and perceptual factors, and (c) visual search for emotional faces is influenced by the emotional state of the observer to some extent.
Abstract: The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the observer? The authors argue that the evidence is consistent with claims that (a) preattentive search processes are sensitive to and influenced by facial expressions of emotion, (b) attention guidance is influenced by a dynamic interplay of emotional and perceptual factors, and (c) visual search for emotional faces is influenced by the emotional state of the observer to some extent. The authors also argue that the way in which contextual factors interact to determine search performance needs to be explored further to draw sound conclusions about the precise influence of emotional expressions on search efficiency. Methodological considerations (e.g., set size, distractor background, task set) and ecological limitations of the visual search task are discussed. Finally, specific recommendations are made for future research directions.

Journal ArticleDOI
TL;DR: Attention to other people's eyes is reduced in young people with high psychopathic traits, thus accounting for their problems with fear recognition, and is consistent with amygdala dysfunction failing to promote attention to emotional salience in the environment.
Abstract: Objective: Damage to the amygdala produces deficits in the ability to recognize fear due to attentional neglect of other people's eyes. Interestingly, children with high psychopathic traits also show problems recognizing fear; however, the reasons for this are not known. This study tested whether psychopathic traits are associated with reduced attention to the eye region of other people's faces. Method: Adolescent males (N = 100; age mean 12.4 years, SD 2.2) were stratified by psychopathic traits and assessed using a Tobii eye tracker to measure primacy, number, and duration of fixations to the eye and mouth regions of emotional faces presented via the UNSW Facial Emotion Task. Results: High psychopathic traits predicted poorer fear recognition (1.21 versus 1.35) and reduced attention to the eye region of the faces, and attention to the eyes was correlated with fear recognition (r = .50). Conclusions: Attention to other people's eyes is reduced in young people with high psychopathic traits, thus accounting for their problems with fear recognition, and is consistent with amygdala dysfunction failing to promote attention to emotional salience in the environment.
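
The fixation measures listed in the Method (primacy, number, and duration of fixations to the eye and mouth regions) can be derived from raw fixation records once regions of interest are defined. A minimal pandas sketch with invented fixation coordinates and rectangular ROIs; all field names and pixel values are illustrative.

```python
import pandas as pd

# Hypothetical fixation records for one face trial: x/y in pixels, duration in ms.
fix = pd.DataFrame({
    "x":   [310, 322, 300, 315, 305],
    "y":   [180, 185, 340, 178, 345],
    "dur": [220, 180, 260, 150, 200],
})

ROIS = {                                  # (x_min, x_max, y_min, y_max), hypothetical
    "eyes":  (250, 390, 150, 220),
    "mouth": (270, 370, 310, 380),
}

def label_roi(row):
    for name, (x0, x1, y0, y1) in ROIS.items():
        if x0 <= row.x <= x1 and y0 <= row.y <= y1:
            return name
    return "other"

fix["roi"] = fix.apply(label_roi, axis=1)

summary = fix.groupby("roi")["dur"].agg(n_fixations="count", total_dur_ms="sum")
print(summary)
print("first fixation on:", fix.iloc[0]["roi"])   # primacy of the first fixation
```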

Journal ArticleDOI
TL;DR: Oxytocin has distinct effects on memory performance for facial identity and may contribute to the modulation of social behaviour.

Journal ArticleDOI
TL;DR: This study investigated whether BPD patients are more sensitive but less accurate in basic emotion recognition, and whether they show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions; it found that they are accurate in perceiving facial emotions and are probably more sensitive to familiar facial expressions.
Abstract: Patients with Borderline Personality Disorder (BPD) have been described as emotionally hyperresponsive, especially to anger and fear in social contexts. The aim was to investigate whether BPD patients are more sensitive but less accurate in terms of basic emotion recognition, and show a bias towards perceiving anger and fear when evaluating ambiguous facial expressions. Twenty-five women with BPD were compared with healthy controls on two different facial emotion recognition tasks. The first task allowed the assessment of the subjective detection threshold as well as the number of evaluation errors on six basic emotions. The second task assessed a response bias to blends of basic emotions. BPD patients showed no general deficit on the affect recognition task, but did show enhanced learning over the course of the experiment. For ambiguous emotional stimuli, we found a bias towards the perception of anger in the BPD patients but not towards fear. BPD patients are accurate in perceiving facial emotions, and are probably more sensitive to familiar facial expressions. They show a bias towards perceiving anger, when socio-affective cues are ambiguous. Interpersonal training should focus on the differentiation of ambiguous emotion in order to reduce a biased appraisal of others.

Journal ArticleDOI
01 Mar 2008-Pain
TL;DR: This study strongly supports the claim that the facial expression of pain is distinct from the expression of basic emotions and provides unique material to explore the psychological and neurobiological processes underlying the perception of pain expression, its impact on the observer, and its role in the regulation of social behaviour.
Abstract: Facial expressions of pain and emotions provide powerful social signals, which impart information about a person's state. Unfortunately, research on pain and emotion expression has been conducted largely in parallel with few bridges allowing for direct comparison of the expressive displays and their impact on observers. Moreover, although facial expressions are highly dynamic, previous research has relied mainly on static photographs. Here we directly compare the recognition and discrimination of dynamic facial expressions of pain and basic emotions by naive observers. One-second film clips were recorded in eight actors displaying neutral facial expressions and expressions of pain and the basic emotions of anger, disgust, fear, happiness, sadness and surprise. Results based on the Facial Action Coding System (FACS) confirmed the distinct (and prototypical) configuration of pain and basic emotion expressions reported in previous studies. Volunteers' evaluations of those dynamic expressions on intensity, arousal and valence demonstrate the high sensitivity and specificity of the observers' judgement. Additional rating data further suggest that, for comparable expression intensity, pain is perceived as more arousing and more unpleasant. This study strongly supports the claim that the facial expression of pain is distinct from the expression of basic emotions. This set of dynamic facial expressions provides unique material to explore the psychological and neurobiological processes underlying the perception of pain expression, its impact on the observer, and its role in the regulation of social behaviour.

Journal ArticleDOI
06 Aug 2008-PLOS ONE
TL;DR: FMRI results demonstrate that brain responses to face expressions are not driven by facial features alone but determined by the personal significance of expressions in current social context, and provide new support to psychological models that have postulated two separate affective dimensions to explain these individual differences.
Abstract: Adult attachment style refers to individual personality traits that strongly influence emotional bonds and reactions to social partners. Behavioral research has shown that adult attachment style reflects profound differences in sensitivity to social signals of support or conflict, but the neural substrates underlying such differences remain unsettled. Using functional magnetic resonance imaging (fMRI), we examined how the three classic prototypes of attachment style (secure, avoidant, anxious) modulate brain responses to facial expressions conveying either positive or negative feedback about task performance (either supportive or hostile) in a social game context. Activation of striatum and ventral tegmental area was enhanced to positive feedback signaled by a smiling face, but this was reduced in participants with avoidant attachment, indicating relative impassiveness to social reward. Conversely, a left amygdala response was evoked by angry faces associated with negative feedback, and correlated positively with anxious attachment, suggesting an increased sensitivity to social punishment. Secure attachment showed mirror effects in striatum and amygdala, but no other specific correlate. These results reveal a critical role for brain systems implicated in reward and threat processing in the biological underpinnings of adult attachment style, and provide new support to psychological models that have postulated two separate affective dimensions to explain these individual differences, centered on the ventral striatum and amygdala circuits, respectively. These findings also demonstrate that brain responses to face expressions are not driven by facial features alone but determined by the personal significance of expressions in current social context. By linking fundamental psychosocial dimensions of adult attachment with brain function, our results do not only corroborate their biological bases but also help understand their impact on behavior.

Journal ArticleDOI
TL;DR: Results were largely consistent with expectations in that psychopathy was negatively correlated with overall recognition of facial affect, recognition of sad facial affect, and recognition of less intense displays of affect; an unexpected negative correlation with recognition of happy facial affect was also found.

Journal ArticleDOI
TL;DR: Results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing, and emotion effects were found for the N170.
Abstract: In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information on this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes and expressed in left-occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.