Showing papers on "Facial expression" published in 1999


Journal ArticleDOI
01 May 1999-Brain
TL;DR: Functional neuroimaging results provide evidence for dissociable, but interlocking, systems for the processing of distinct categories of negative facial expression.
Abstract: Previous neuroimaging and neuropsychological studies have investigated the neural substrates which mediate responses to fearful, disgusted and happy expressions. No previous studies have investigated the neural substrates which mediate responses to sad and angry expressions. Using functional neuroimaging, we tested two hypotheses. First, we tested whether the amygdala has a neural response to sad and/or angry facial expressions. Secondly, we tested whether the orbitofrontal cortex has a specific neural response to angry facial expressions. Volunteer subjects were scanned, using PET, while they performed a sex discrimination task involving static grey-scale images of faces expressing varying degrees of sadness and anger. We found that increasing intensity of sad facial expression was associated with enhanced activity in the left amygdala and right temporal pole. In addition, we found that increasing intensity of angry facial expression was associated with enhanced activity in the orbitofrontal and anterior cingulate cortex. We found no support for the suggestion that angry expressions generate a signal in the amygdala. The results provide evidence for dissociable, but interlocking, systems for the processing of distinct categories of negative facial expression.

1,222 citations


Journal ArticleDOI
TL;DR: The recognition of emotional facial expressions in nine subjects with bilateral amygdala damage is reported, using a sensitive and quantitative assessment, to show that the amygdala plays an important role in triggering knowledge related to threat and danger signaled by facial expressions.

736 citations


Journal ArticleDOI
TL;DR: The authors investigated whether individuals preferentially allocate attention to the spatial location of threatening faces presented outside awareness, and found that the tendency to orient attention towards masked threat faces was greater in high than low trait anxious individuals.
Abstract: Three studies investigated whether individuals preferentially allocate attention to the spatial location of threatening faces presented outside awareness. Pairs of face stimuli were briefly displayed and masked in a modified version of the dot-probe task. Each face pair consisted of an emotional (threat or happy) and neutral face. The hypothesis that preattentive processing of threat results in attention being oriented towards its location was supported in Experiments 1 and 3. In both studies, this effect was most apparent in the left visual field, suggestive of right hemisphere involvement. However, in Experiment 2 where awareness of the faces was less restricted (i.e. marginal threshold conditions), preattentive capture of attention by threat was not evident. There was evidence from Experiment 3 that the tendency to orient attention towards masked threat faces was greater in high than low trait anxious individuals.

466 citations
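The dot-probe logic these experiments rely on reduces to a simple reaction-time contrast: faster responses to probes replacing threat faces than to probes replacing neutral faces indicate attention drawn to the threat location. A minimal sketch of that bias index, using hypothetical trial records:

```python
# Standard dot-probe attentional-bias index; the trial data are illustrative.
import statistics

trials = [
    {"probe_at_threat": True,  "rt_ms": 510},
    {"probe_at_threat": False, "rt_ms": 545},
    # ... one record per masked-face trial
]

congruent   = [t["rt_ms"] for t in trials if t["probe_at_threat"]]
incongruent = [t["rt_ms"] for t in trials if not t["probe_at_threat"]]

# Positive score = attention oriented towards the threat face's location.
bias_ms = statistics.mean(incongruent) - statistics.mean(congruent)
print(f"attentional bias: {bias_ms:.1f} ms")
```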


Journal ArticleDOI
TL;DR: Results indicated that the categorization task changes the spatial scales preferentially used and perceived for rapid recognition, suggesting that categorization can be closely bound to perception.

453 citations


Journal ArticleDOI
TL;DR: Results showed a significantly worse performance in autistic individuals than in both normal and Down subjects on both facial expression of emotion subtasks, although on the identity and emotional situation subtasks there were no significant differences between groups.
Abstract: Ten autistic individuals (mean age: 12;7 years, SD 3.8, range 5;10–16;0), 10 Down individuals (12;3 years, SD 3.0, range 7;1–16;0), and a control group of 10 children with normal development (mean age: 6;3 years, SD 1.6, range 4;0–9;4), matched for verbal mental age, were tested on a delayed-matching task and on a sorting-by-preference task. The first task required subjects to match faces on the basis of the emotion being expressed or on the basis of identity. Unlike in the typical simultaneous matching procedure, the target picture was presented only briefly (750 msec) and was not visible when the sample pictures were shown to the subject, thus reducing the possible use of perceptual, piecemeal processing strategies based on the typical features of the emotional facial expression. In the second task, subjects were required to rate the valence of an isolated stimulus, such as a facial expression of emotion or an emotional situation in which no people were represented. The aim of the second task was to compare the autistic and nonautistic children's tendency to judge the pleasantness of a face using facial expression of emotion as a meaningful index. Results showed a significantly worse performance in autistic individuals than in both normal and Down subjects on both facial expression of emotion subtasks, although on the identity and emotional situation subtasks there were no significant differences between groups.

444 citations


Journal ArticleDOI
TL;DR: In this article, the authors applied computer image analysis to the problem of automatically detecting facial actions in sequences of images and compared three approaches: holistic spatial analysis, explicit measurement of features such as wrinkles, and estimation of motion flow fields.
Abstract: Facial expressions provide an important behavioral measure for the study of emotion, cognitive processes, and social interaction. The Facial Action Coding System (Ekman & Friesen, 1978) is an objective method for quantifying facial movement in terms of component actions. We applied computer image analysis to the problem of automatically detecting facial actions in sequences of images. Three approaches were compared: holistic spatial analysis, explicit measurement of features such as wrinkles, and estimation of motion flow fields. The three methods were combined in a hybrid system that classified six upper facial actions with 91% accuracy. The hybrid system outperformed human nonexperts on this task and performed as well as highly trained experts. An automated system would make facial expression measurement more widely accessible as a research tool in behavioral science and investigations of the neural substrates of emotion.

435 citations
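The hybrid system described above (holistic spatial analysis + explicit feature measurement + motion-flow estimation) can be caricatured as feature-level fusion feeding a single discriminant classifier. A minimal sketch under stated assumptions: the feature arrays are random placeholders, and fusion by concatenation is one plausible reading, not the authors' implementation (they may have fused the three methods' outputs instead):

```python
# Hypothetical sketch of a hybrid facial-action classifier: fuse three
# feature streams and train one discriminant model. Not the authors' code.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_clips, n_actions = 300, 6
holistic = rng.normal(size=(n_clips, 50))  # e.g. PCA of difference images
wrinkles = rng.normal(size=(n_clips, 10))  # e.g. wrinkle/furrow measurements
flow     = rng.normal(size=(n_clips, 40))  # e.g. pooled motion-flow fields
labels   = rng.integers(0, n_actions, size=n_clips)  # six upper-face actions

X = np.hstack([holistic, wrinkles, flow])  # feature-level fusion
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, labels, cv=5).mean())  # ~chance on random data
```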


Journal ArticleDOI
TL;DR: The authors conjectured that a blindsight subject (GY) might recognize facial expressions presented in his blind field, and the present study provides direct evidence for this claim.
Abstract: Functional neuroimaging experiments have shown that recognition of emotional expressions does not depend on awareness of visual stimuli and that unseen fear stimuli can activate the amygdala via a colliculopulvinar pathway. Perception of emotional expressions in the absence of awareness in normal subjects has some similarities with the unconscious recognition of visual stimuli which is well documented in patients with striate cortex lesions (blindsight). Presumably in these patients residual vision engages alternative extra-striate routes such as the superior colliculus and pulvinar. Against this background, we conjectured that a blindsight subject (GY) might recognize facial expressions presented in his blind field. The present study now provides direct evidence for this claim.

419 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the relation between nonverbal decoding skills and relationship well-being and found that errors in decoding facial expressions and tones of voice were associated with less relationship well-being and greater depression.
Abstract: The purpose of the present study was to investigate the relation between nonverbal decoding skills and relationship well-being. Sixty college students were administered tests of their abilities to identify the affective meanings in facial expressions and tones of voice. The students also completed self-report measures of relationship well-being and depression. Correlational analyses indicated that errors in decoding facial expressions and tones of voice were associated with less relationship well-being and greater depression. Hierarchical regression revealed that nonverbal decoding accuracy was significantly related to relationship well-being even after controlling for depression.

360 citations
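The hierarchical regression reported above has a standard two-step form: enter depression first, then test whether decoding accuracy explains additional variance in well-being. A hedged sketch with simulated data; the variable names and effect sizes are assumptions, not the study's:

```python
# Two-step hierarchical regression: does decoding accuracy predict
# relationship well-being after controlling for depression?
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60                                   # the study's sample size
depression = rng.normal(size=n)
decoding_accuracy = rng.normal(size=n)   # hypothetical test scores
wellbeing = 0.5 * decoding_accuracy - 0.4 * depression + rng.normal(size=n)

step1 = sm.OLS(wellbeing, sm.add_constant(depression)).fit()
step2 = sm.OLS(wellbeing, sm.add_constant(
    np.column_stack([depression, decoding_accuracy]))).fit()
print("R^2 change:", step2.rsquared - step1.rsquared)
print("p-value for decoding accuracy:", step2.pvalues[-1])
```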


Journal ArticleDOI
TL;DR: The first evidence for a distinction between two schizophrenic patient subgroups on the basis of recognition of and neural response to different negative facial expressions is provided.
Abstract: Several studies have demonstrated impaired facial expression recognition in schizophrenia. Few have examined the neural basis for this; none have compared the neural correlates of facial expression perception in different schizophrenic patient subgroups. We compared neural responses to facial expressions in 10 right-handed schizophrenic patients (five paranoid and five non-paranoid) and five normal volunteers using functional Magnetic Resonance Imaging (fMRI). In three 5-min experiments, subjects viewed alternating 30-s blocks of black-and-white facial expressions of either fear, anger or disgust contrasted with expressions of mild happiness. After scanning, subjects categorised each expression. All patients were less accurate in identifying expressions, and showed less activation to these stimuli than normals. Non-paranoids performed poorly in the identification task and failed to activate neural regions that are normally linked with perception of these stimuli. They categorised disgust as either anger or fear more frequently than paranoids, and demonstrated in response to disgust expressions activation in the amygdala, a region associated with perception of fearful faces. Paranoids were more accurate in recognising expressions, and demonstrated greater activation than non-paranoids to most stimuli. We provide the first evidence for a distinction between two schizophrenic patient subgroups on the basis of recognition of and neural response to different negative facial expressions.

315 citations


Journal ArticleDOI
TL;DR: This paper used multiple methods to examine two questions about emotion and culture: (1) which facial expressions are recognised cross-culturally; and (2) does the forced-choice method lead to spurious findings of universality?
Abstract: We used multiple methods to examine two questions about emotion and culture: (1) Which facial expressions are recognised cross-culturally; and (2) does the “forced-choice” method lead to spurious findings of universality? Forty participants in the US and 40 in India were shown 14 facial expressions and asked to say what had happened to cause the person to make the face. Analyses of the social situations given and of the affect words spontaneously used showed high levels of recognition for most of the expressions. A subsequent forced-choice task using the same faces confirmed these findings. Analysis of the pattern of magnitude, discreteness, and similarity of responses across cultures and expressions led to the conclusion that there is no neat distinction between cross-culturally recognisable and nonrecognisable expressions. Results are better described as a gradient of recognition.

309 citations


Journal ArticleDOI
TL;DR: Analysis of recognition of facial expressions of emotion among women diagnosed with borderline personality disorder indicated that borderline individuals were primarily accurate perceivers of others' emotions and showed a tendency toward heightened sensitivity on recognition of fear, specifically.
Abstract: This study examined recognition of facial expressions of emotion among women diagnosed with borderline personality disorder (BPD; n = 21), compared to a group of women with histories of childhood sexual abuse with no current or prior diagnosis of BPD (n = 21) and a group of women with no history of sexual abuse or BPD (n = 20). Facial recognition was assessed by a slide set developed by Ekman and Matsumoto (Japanese and Caucasian Facial Expressions of Emotion and Neutral Faces, 1992), expanded and improved from previous slide sets, and utilized a coding system that allowed for free responses rather than the more typical fixed-response format. Results indicated that borderline individuals were primarily accurate perceivers of others' emotions and showed a tendency toward heightened sensitivity on recognition of fear, specifically. Results are discussed in terms of emotional appraisal ability and emotion dysregulation among individuals with BPD.

Journal ArticleDOI
TL;DR: An automated method of facial display analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.
Abstract: The face is a rich source of information about human behavior. Available methods for coding facial displays, however, are human-observer dependent, labor intensive, and difficult to standardize. To enable rigorous and efficient quantitative measurement of facial displays, we have developed an automated method of facial display analysis. In this report, we compare the results of this automated system with those of manual FACS (Facial Action Coding System, Ekman & Friesen, 1978a) coding. One hundred university students were videotaped while performing a series of facial displays. The image sequences were coded from videotape by certified FACS coders. Fifteen action units and action unit combinations that occurred a minimum of 25 times were selected for automated analysis. Facial features were automatically tracked in digitized image sequences using a hierarchical algorithm for estimating optical flow. The measurements were normalized for variation in position, orientation, and scale. The image sequences were randomly divided into a training set and a cross-validation set, and discriminant function analyses were conducted on the feature point measurements. In the training set, average agreement with manual FACS coding was 92% or higher for action units in the brow, eye, and mouth regions. In the cross-validation set, average agreement was 91%, 88%, and 81% for action units in the brow, eye, and mouth regions, respectively. Automated face analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.
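A rough sketch of the pipeline this abstract describes, substituting OpenCV's pyramidal Lucas-Kanade tracker for the paper's hierarchical optical-flow algorithm and scikit-learn's LDA for the discriminant function analysis; landmark initialization and the normalization step are simplified assumptions:

```python
# Track facial feature points across frames, derive displacement features,
# and classify action units with LDA. An illustrative stand-in pipeline.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def track_feature_points(frames, initial_points):
    """Track landmarks across a grayscale image sequence with pyramidal LK."""
    pts = initial_points.astype(np.float32).reshape(-1, 1, 2)
    trajectory = [pts.reshape(-1, 2)]
    for prev, nxt in zip(frames, frames[1:]):
        pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev, nxt, pts, None, winSize=(15, 15), maxLevel=3)
        trajectory.append(pts.reshape(-1, 2))
    return np.stack(trajectory)          # (n_frames, n_points, 2)

def displacement_features(trajectory):
    """Flatten start-to-end landmark displacements into a feature vector."""
    disp = trajectory[-1] - trajectory[0]
    disp -= disp.mean(axis=0)            # crude correction for head translation
    return disp.ravel()

# With per-sequence feature vectors and manual FACS labels in hand:
# clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
# agreement = clf.score(X_validation, y_validation)
```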

Journal ArticleDOI
TL;DR: The results suggest that the right inferior frontal cortex processes emotional communicative signals that could be visual or auditory and that there is a hemispheric asymmetry in the superior frontal cortex in relation to the processing of emotional Communicative signals.
Abstract: We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed nor...


Journal ArticleDOI
TL;DR: The findings suggest that the amygdala plays a critical role in knowledge concerning the arousal of negative emotions, a function that may explain the impaired recognition of fear and anger in patients with bilateral amygdala damage, and one that is consistent with the amygdala's role in processing stimuli related to threat and danger.
Abstract: Functional neuroimaging and lesion-based neuropsychological experiments have demonstrated the human amygdala's role in recognition of certain emotions signaled by sensory stimuli, notably, fear and anger in facial expressions. We examined recognition of two emotional dimensions, arousal and valence, in a rare subject with complete, bilateral damage restricted to the amygdala. Recognition of emotional arousal was impaired for facial expressions, words, and sentences that depicted unpleasant emotions, especially in regard to fear and anger. However, recognition of emotional valence was normal. The findings suggest that the amygdala plays a critical role in knowledge concerning the arousal of negative emotions, a function that may explain the impaired recognition of fear and anger in patients with bilateral amygdala damage, and one that is consistent with the amygdala's role in processing stimuli related to threat and danger. Studies in humans provide strong evidence for neural systems that are specialized for the recognition of certain emotions. Some of the clearest evidence comes from studies of patients with damage to the amygdala, a brain structure long thought to play an important role in emotion. Bilateral amygdala damage disproportionately impairs the recognition of unpleasant emotions, especially fear, in facial expressions (Adolphs, Tranel, Damasio, & Damasio, 1994, 1995; Broks et al.), suggesting that the amygdala may be critical to process a class of emotions that are highly arousing and related to threat and danger. We hypothesized that the human brain contains neural systems specialized to recognize emotional arousal in negatively valenced stimuli, and that the amygdala is one key component of such systems. We tested this hypothesis by asking a rare subject with complete, selective bilateral amygdala damage to rate emotional stimuli explicitly with respect to the two attributes of arousal and valence.

Journal ArticleDOI
TL;DR: The data are consistent with previous work in healthy adult subjects implicating the amygdala as essential for the recognition of fearful facial expression.
Abstract: Objective: To examine further the role of the amygdala in the recognition of facial expression in adolescents. Method: Twelve healthy adolescents were studied using functional magnetic resonance imaging technology during a task of facial affect recognition and a visual control task. Results: All subjects demonstrated a significant increase in signal intensity in the amygdala for the facial expression recognition task. Conclusions: The data are consistent with previous work in healthy adult subjects implicating the amygdala as essential for the recognition of fearful facial expression.

Journal ArticleDOI
TL;DR: It is found that faces evoked different MEG responses as a function of task demands, i.e., the activations recorded during facial emotion recognition were different from those recorded during simple face recognition in the control task.

Journal ArticleDOI
TL;DR: Findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship of this type of impairment with alterations of emotional experience.
Abstract: People with brain injuries involving the amygdala are often poor at recognizing facial expressions of fear, but the extent to which this impairment compromises other signals of the emotion of fear has not been clearly established. We investigated N.M., a person with bilateral amygdala damage and a left thalamic lesion, who was impaired at recognizing fear from facial expressions. N.M. showed an equivalent deficit affecting fear recognition from body postures and emotional sounds. His deficit of fear recognition was not linked to evidence of any problem in recognizing anger (a common feature in other reports), but for his everyday experience of emotion N.M. reported reduced anger and fear compared with neurologically normal controls. These findings show a specific deficit compromising the recognition of the emotion of fear from a wide range of social signals, and suggest a possible relationship of this type of impairment with alterations of emotional experience.

Journal ArticleDOI
TL;DR: Testing Lipps's (1907) model, the authors found that individuals mimic emotional facial expressions and that the decoding of facial expressions is accompanied by shared affect; however, no evidence was found that emotion recognition accuracy or shared affect is mediated by mimicry.
Abstract: Lipps (1907) presented a model of empathy which had an important influence on later formulations. According to Lipps, individuals tend to mimic an interaction partner's behavior, and this nonverbal mimicry induces—via a feedback process—the corresponding affective state in the observer. The resulting shared affect is believed to foster the understanding of the observed person's self. The present study tested this model in the context of judgments of emotional facial expressions. The results confirm that individuals mimic emotional facial expressions, and that the decoding of facial expressions is accompanied by shared affect. However, no evidence that emotion recognition accuracy or shared affect are mediated by mimicry was found. Yet, voluntary mimicry was found to have some limited influence on observers' assessment of the observed person's personality. The implications of these results with regard to Lipps' original hypothesis are discussed. The communication of emotions and thoughts is an important aspect of everyday social interactions. Specifically, our ability to understand the emotional states as well as the interpersonal intent of our interaction partners influences the quality of our social interactions. The process underlying the understanding of another's emotional and cognitive point of view is called "empathy." In its original usage empathy referred to the tendency of observers to project themselves "into" another person in order to know the other person. This notion was first expressed by Lipps (1907) who believed that empathy is mediated by the imitation (mimicry) of others' behavior.

Journal ArticleDOI
TL;DR: Analysis of emotional facial expression decoding in alcoholics indicates that alcoholics overestimate the intensity of emotional expressions and make more errors in their decoding with a special bias for anger and contempt.
Abstract: The present study investigated emotional facial expression decoding in alcoholics. Twenty-five alcoholic patients at the end of the detoxification process were compared with 25 volunteers matched for age, sex, and education. They were presented with facial expressions of neutral, mild, moderate, or strong emotional intensity. Results indicate that alcoholics overestimate the intensity of emotional expressions and make more errors in their decoding with a special bias for anger and contempt. Moreover, this decoding deficit is not perceived by the alcoholic patients. A general model is proposed that links visuospatial deficits, abnormal processing of social information, interpersonal stress, and alcohol abuse.

Journal ArticleDOI
TL;DR: The human amygdala's role in recognizing emotion in prosody may not be as critical as it is for facial expressions; extra-amygdalar structures in the right hemisphere may be more important for recognizing emotional prosody.

Journal ArticleDOI
Zhengyou Zhang
TL;DR: Experiments show that facial expression recognition is mainly a low-frequency process, and a spatial resolution of 64 × 64 pixels is probably enough for recognition.
Abstract: In this paper, we report our experiments on feature-based facial expression recognition within an architecture based on a two-layer perceptron. We investigate the use of two types of features extracted from face images: the geometric positions of a set of fiducial points on a face, and a set of multiscale and multiorientation Gabor wavelet coefficients at these points. They can be used either independently or jointly. The recognition performance with different types of features has been compared, which shows that Gabor wavelet coefficients are much more powerful than geometric positions. Furthermore, since the first layer of the perceptron actually performs a nonlinear reduction of the dimensionality of the feature space, we have also studied the desired number of hidden units, i.e. the appropriate dimension to represent a facial expression in order to achieve a good recognition rate. It turns out that five to seven hidden units are probably enough to represent the space of facial expressions. Then, we have investigated the importance of each individual fiducial point to facial expression recognition. Sensitivity analysis reveals that points on cheeks and on forehead carry little useful information. After discarding them, not only the computational efficiency increases, but also the generalization performance slightly improves. Finally, we have studied the significance of image scales. Experiments show that facial expression recognition is mainly a low frequency process, and a spatial resolution of 64 pixels × 64 pixels is probably enough.
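Zhang's feature pipeline can be sketched as follows: sample multiscale, multiorientation Gabor magnitudes at the fiducial points and feed them to a perceptron with a small hidden layer. The kernel size, scales, and filter parameters below are illustrative guesses, not the paper's exact settings:

```python
# Gabor-jet features at fiducial points + small two-layer perceptron.
# An illustrative sketch, not the paper's implementation.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def gabor_features(gray_64x64, fiducial_points,
                   wavelengths=(4, 8, 16), orientations=8):
    """Gabor magnitude at each fiducial point, over scales x orientations."""
    img = gray_64x64.astype(np.float32)
    feats = []
    for lam in wavelengths:
        for k in range(orientations):
            theta = np.pi * k / orientations
            kernel = cv2.getGaborKernel((31, 31), sigma=lam / 2.0,
                                        theta=theta, lambd=lam,
                                        gamma=1.0, psi=0)
            response = cv2.filter2D(img, -1, kernel)
            feats.extend(abs(response[y, x]) for (x, y) in fiducial_points)
    return np.array(feats)

# One hidden layer of ~5-7 units, echoing the paper's dimensionality finding.
clf = MLPClassifier(hidden_layer_sizes=(6,), max_iter=2000)
# clf.fit(feature_matrix, expression_labels)
```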

Journal ArticleDOI
TL;DR: Investigation of the role of movement in the recognition of facial expressions of emotion indicated that individuals with mental retardation were significantly poorer at identifying anger, fear, disgust, and surprise.
Abstract: Moving and static videotaped and photographic displays of posed emotional expressions were presented to 12 adults with mental retardation and 12 without mental retardation to investigate the role of movement in the recognition of facial expressions of emotion. Participants chose the corresponding emotion portrayed by the displays from among six written and pictorial labels of the emotions. Results indicated that individuals with mental retardation were significantly poorer at identifying anger, fear, disgust, and surprise. Both groups performed significantly better on the moving as opposed to the static videotaped displays of the emotions sad and angry. Visual-perceptual limitations are likely contributors to the poorer performance of the group with mental retardation in recognizing moving and static facial expressions of emotion.

Proceedings ArticleDOI
26 May 1999
TL;DR: The approach to combining a model of emotions with a facial model represents a first step towards developing the technology of a truly believable interactive agent which has a wide range of applications from designing intelligent training systems to video games and animation tools.
Abstract: The ability to express emotions is important for creating believable interactive characters. To simulate emotional expressions in an interactive environment, an intelligent agent needs both an adaptive model for generating believable responses, and a visualization model for mapping emotions into facial expressions. Recent advances in intelligent agents and in facial modeling have produced effective algorithms for these tasks independently. We describe a method for integrating these algorithms to create an interactive simulation of an agent that produces appropriate facial expressions in a dynamic environment. Our approach to combining a model of emotions with a facial model represents a first step towards developing the technology of a truly believable interactive agent which has a wide range of applications from designing intelligent training systems to video games and animation tools.
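The two-part architecture described here, an adaptive emotion model plus a visualization mapping onto facial expressions, can be caricatured in a few lines. Everything below (the emotion categories, decay constant, and normalization) is an assumption for illustration, not the paper's algorithm:

```python
# Toy agent: events update a decaying emotion state, which is mapped to
# blend weights for facial morph targets. All parameters are hypothetical.
from dataclasses import dataclass, field

EMOTIONS = ("joy", "anger", "fear", "sadness")

@dataclass
class EmotionModel:
    state: dict = field(default_factory=lambda: {e: 0.0 for e in EMOTIONS})
    decay: float = 0.9

    def appraise(self, event_impacts):
        """Blend event-driven impulses into a slowly decaying mood state."""
        for emotion in self.state:
            self.state[emotion] = min(1.0, self.decay * self.state[emotion]
                                      + event_impacts.get(emotion, 0.0))

    def expression_weights(self):
        """Map the internal state to facial-expression blend weights."""
        total = sum(self.state.values()) or 1.0
        return {e: v / total for e, v in self.state.items()}

agent = EmotionModel()
agent.appraise({"joy": 0.8})
print(agent.expression_weights())   # a mostly-happy face
```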

Journal ArticleDOI
TL;DR: The results indicate that information in the upper part of the talker's face is more critical for intonation pattern decisions than for decisions about word segments or primary sentence stress, thus supporting the Gaze Direction Assumption.
Abstract: Two experiments were conducted to test the hypothesis that visual information related to segmental versus prosodic aspects of speech is distributed differently on the face of the talker. In the fir...

Journal ArticleDOI
TL;DR: Diazepam selectively impaired subjects’ ability to recognise angry expressions but did not affect recognition of any other emotional expression, providing further support for the suggestion that there are dissociable systems responsible for processing emotional expressions.
Abstract: Rationale: Facial expressions appear to be processed by at least partially separable neuro-cognitive systems. Given this functional specialisation of expression processing, it is plausible that these neurocognitive systems may also be dissociable pharmacologically. Objective: The present study therefore compared the effects of diazepam (15 mg) with placebo upon the ability to recognise emotional expressions. Methods: A double blind, independent group design was used to compare the effects of diazepam and matched placebo in 32 healthy volunteers. Participants were presented morphed facial expression stimuli following a paradigm developed for use with patients with brain damage and asked to name one of the six basic emotions (sadness, happiness, anger, disgust, fear and surprise). Results: Diazepam selectively impaired subjects' ability to recognise angry expressions but did not affect recognition of any other emotional expression. Conclusions: The findings are interpreted as providing further support for the suggestion that there are dissociable systems responsible for processing emotional expressions. It is suggested that these findings may have implications for understanding paradoxical aggression sometimes elicited by benzodiazepines.

Journal ArticleDOI
TL;DR: Evidence of the achievable performance is reported at the end of this paper by means of figures showing the capability of the system to reshape its geometry according to the decoded MPEG-4 facial calibration parameters and its effectiveness in performing facial expressions.
Abstract: We propose a method for implementing a high-level interface for the synthesis and animation of animated virtual faces that is in full compliance with MPEG-4 specifications. This method allows us to implement the simple facial object profile and part of the calibration facial object profile. In fact, starting from a facial wireframe and from a set of configuration files, the developed system is capable of automatically generating the animation rules suited for model animation driven by a stream of facial animation parameters. If the calibration parameters (feature points and texture) are available, the system is able to exploit this information for suitably modifying the geometry of the wireframe and for performing its animation by means of calibrated rules computed ex novo on the adapted somatics of the model. Evidence of the achievable performance is reported at the end of this paper by means of figures showing the capability of the system to reshape its geometry according to the decoded MPEG-4 facial calibration parameters and its effectiveness in performing facial expressions.
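FAP-driven animation of the kind described here boils down to applying each decoded facial animation parameter to its feature point and spreading the displacement across nearby wireframe vertices via precomputed animation rules. A schematic sketch; the rule structure and units are illustrative, not the MPEG-4 normative tables:

```python
# Apply decoded FAP amplitudes to a face wireframe via animation rules.
# A minimal sketch of the deformation step, not a compliant decoder.
import numpy as np

def animate(vertices, fap_values, animation_rules):
    """
    vertices:        (n, 3) rest-pose wireframe coordinates
    fap_values:      {fap_id: amplitude} decoded from the FAP stream
    animation_rules: {fap_id: (vertex_indices, (k, 3) weighted directions)}
    """
    deformed = vertices.copy()
    for fap_id, amplitude in fap_values.items():
        idx, directions = animation_rules[fap_id]
        deformed[idx] += amplitude * directions   # weighted local displacement
    return deformed
```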


Journal ArticleDOI
TL;DR: The results of the present study replicate those of the only other study (Duclos et al., 1989) that has demonstrated specific effects of expressive behaviors on corresponding emotional feelings.
Abstract: The results of numerous experimental studies have provided ample evidence for William James' theory that emotional conduct is a sufficient condition for the occurrence of emotional feelings. Two further questions are addressed in the study reported in this paper. First, critics have speculated that the effects of peripheral feedback from expressive bodily movement may lead to generalized, diffuse pleasant or unpleasant experiences, rather than the specific emotional feelings consistent with James' position. Second, if the Jamesian account is correct, then the simultaneous combination of multiple, consistent sources of expressive bodily feedback should result in greater magnitudes of emotional response than those caused by separate, individual sources. The results of the present study replicate those of the only other study (Duclos et al., 1989) which has demonstrated specific effects of expressive behaviors on corresponding emotional feelings. It was also possible to demonstrate, via correlational analyses, that those people who are responsive to their expressions tend to be responsive to their postures as well, since subjects in this study received manipulations of their facial expressions and their bodily postures. The results of this study also indicate that matching combinations of facial expressions and bodily postures result in more powerful feelings of the corresponding emotional feelings than do either expressions or postures alone. Copyright © 1999 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The present study demonstrated that the CFCS serves as a valid measurement tool for persistent pain in children.
Abstract: :Objective:The purposes of the study were threefold: (a) to determine whether a measurement system based on facial expression would be useful in the assessment of post-operative pain in young children; (b) to examine construct validity in terms of structure, consistency, and dynamics of the