
Showing papers on "Facial expression published in 2002"


Journal ArticleDOI
Ralph Adolphs1
TL;DR: Investigations are being extended to nonhuman primates, to infants, and to patients with psychiatric disorders, to elucidate some of the mechanisms behind recognition of emotion from facial expressions.
Abstract: Recognizing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures. Studies using evoked potentials, lesions, and functional imaging have begun to elucidate some of the mechanisms. Early perceptual processing of faces draws on cortices in occipital and temporal lobes that construct detailed representations from the configuration of facial features. Subsequent recognition requires a set of structures, including amygdala and orbitofrontal cortex, that links perceptual representations of the face to the generation of knowledge about the emotion signaled, a complex set of mechanisms using multiple strategies. Although recent studies have provided a wealth of detail regarding these mechanisms in the adult human brain, investigations are also being extended to nonhuman primates, to infants, and to patients with psychiatric disorders.

1,288 citations


Journal ArticleDOI
TL;DR: Of regions in the extended system for face perception, the amygdala plays a central role in processing the social relevance of information gleaned from faces, particularly when that information may signal a potential threat.

1,224 citations


Journal ArticleDOI
TL;DR: Functional magnetic resonance imaging was used to measure activation in regions that responded differentially to faces with emotional expressions compared with neutral faces, and the modulation of these responses by attention was measured, using a competing task with a high attentional load.
Abstract: Attention gates the processing of stimuli relatively early in visual cortex. Yet, existing data suggest that emotional stimuli activate brain regions automatically, largely immune from attentional control. To resolve this puzzle, we used functional magnetic resonance imaging to first measure activation in regions that responded differentially to faces with emotional expressions (fearful and happy) compared with neutral faces. We then measured the modulation of these responses by attention, using a competing task with a high attentional load. Contrary to the prevailing view, all brain regions responding differentially to emotional faces, including the amygdala, did so only when sufficient attentional resources were available to process the faces. Thus, the processing of facial expression appears to be under top-down control.

1,154 citations


Journal ArticleDOI
TL;DR: Results suggest that the human amygdala shows a stronger response to affective facial expressions than to scenes, a bias that should be considered in the design of experimental paradigms interested in probing amygdala function.

883 citations


Journal ArticleDOI
TL;DR: This paper provides a comprehensive analysis of the schizophrenia facial affect recognition research of the past decade and of the schizophrenia literature on affective prosody, along with the first review of multichannel emotion recognition research.

735 citations


Journal ArticleDOI
TL;DR: It is concluded that emotional expression analysis and the structural encoding of faces are parallel processes and early emotional ERP modulations may reflect the rapid activation of prefrontal areas involved in the analysis of facial expression.
Abstract: Using event-related brain potentials (ERPs), we investigated the time course of facial expression processing in human subjects watching photographs of fearful and neutral faces. Upright fearful faces elicited a frontocentral positivity within 120 ms after stimulus presentation, which was followed by a broadly distributed sustained positivity beyond 250 ms post-stimulus. Emotional expression effects were delayed and attenuated when faces were inverted. In contrast, the face-specific N170 component was completely unaffected by facial expression. We conclude that emotional expression analysis and the structural encoding of faces are parallel processes. Early emotional ERP modulations may reflect the rapid activation of prefrontal areas involved in the analysis of facial expression. NeuroReport 13:1-5, 2002, Lippincott Williams & Wilkins.

658 citations


Journal ArticleDOI
TL;DR: A method for accurately acquiring and reconstructing the geometry of the human face and for display of this reconstruction in a 3-dimensional format is described and applied in a sample of 70 actors and 69 actresses expressing happiness, sadness, anger, fear and disgust.

607 citations


Journal ArticleDOI
TL;DR: It is proposed that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arises from evolved propensities.
Abstract: This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arises from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery, and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over amplitude is incomplete, and observers can better detect pain that the individual attempts to suppress rather than amplify or simulate. In many clinical and experimental settings, the facial expression of pain is incorporated with verbal and nonverbal vocal activity, posture, and movement in an overall category of pain behaviour. This is assumed by clinicians to be under operant control of social contingencies such as sympathy, caregiving, and practical help; thus, strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgments of malingering, and sometimes the withholding of caregiving and help. To the extent that pain expression is influenced by environmental contingencies, however, "amplification" could equally plausibly constitute the release of suppression according to evolved contingent propensities that guide behaviour. Pain has been largely neglected in the evolutionary literature and the literature on expression of emotion, but an evolutionary account can generate improved assessment of pain and reactions to it.

539 citations


Journal ArticleDOI
TL;DR: Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia and may impact performance of more demanding tasks.
Abstract: OBJECTIVE: Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. METHOD: Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. RESULTS: The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greate...

530 citations


Journal ArticleDOI
TL;DR: The findings suggest that the human amygdala is relatively specialized to process stimuli with complex social significance, and provides further support for the idea that some of the impairments in social cognition seen in patients with autism may result from dysfunction of the amygdala.
Abstract: Lesion, functional imaging, and single-unit studies in human and nonhuman animals have demonstrated a role for the amygdala in processing stimuli with emotional and social significance. We investigated the recognition of a wide variety of facial expressions, including basic emotions (e.g., happiness, anger) and social emotions (e.g., guilt, admiration, flirtatiousness). Prior findings with a standardized set of stimuli indicated that recognition of social emotions can be signaled by the eye region of the face and is disproportionately impaired in autism (Baron-Cohen, Wheelwright, & Jolliffe, 1997). To test the hypothesis that the recognition of social emotions depends on the amygdala, we administered the same stimuli to 30 subjects with unilateral amygdala damage (16 left, 14 right), 2 with bilateral amygdala damage, 47 brain-damaged controls, and 19 normal controls. Compared with controls, subjects with unilateral or bilateral amygdala damage were impaired when recognizing social emotions; moreover, they were more impaired in recognition of social emotions than in recognition of basic emotions, and, like previously described patients with autism, they were impaired also when asked to recognize social emotions from the eye region of the face alone. The findings suggest that the human amygdala is relatively specialized to process stimuli with complex social significance. The results also provide further support for the idea that some of the impairments in social cognition seen in patients with autism may result from dysfunction of the amygdala.

528 citations


Journal ArticleDOI
01 Aug 2002-Brain
TL;DR: The data suggest that social behavioural problems in patients with frontal lobe lesions or fronto-temporal dementia may be a consequence of dysfunction within the systems identified in light of their possible role in processing whether particular social behaviours are, or are not, appropriate.
Abstract: The aim of this investigation was to identify neural systems supporting the processing of intentional and unintentional transgressions of social norms. Using event-related fMRI, we addressed this question by comparing neural responses to stories describing normal behaviour, embarrassing situations or violations of social norms. Processing transgressions of social norms involved systems previously reported to play a role in representing the mental states of others, namely medial prefrontal and temporal regions. In addition, the processing of transgressions of social norms involved systems previously found to respond to aversive emotional expressions (in particular angry expressions); namely lateral orbitofrontal cortex (Brodmann area 47) and medial prefrontal cortex. The observed responses were similar for both intentional and unintentional social norm violations, albeit more pronounced for the intentional norm violations. These data suggest that social behavioural problems in patients with frontal lobe lesions or fronto-temporal dementia may be a consequence of dysfunction within the systems identified in light of their possible role in processing whether particular social behaviours are, or are not, appropriate.


Journal ArticleDOI
TL;DR: The authors examined the processes by which perceptual mechanisms become attuned to the contingencies of affective signals in the environment, and measured the sequential, content-based properties of feature detection in emotion recognition processes.
Abstract: The present research examines visual perception of emotion in both typical and atypical development. To examine the processes by which perceptual mechanisms become attuned to the contingencies of affective signals in the environment, the authors measured the sequential, content-based properties of feature detection in emotion recognition processes. To evaluate the role of experience, they compared typically developing children with physically abused children, who were presumed to have experienced high levels of threat and hostility. As predicted, physically abused children accurately identified facial displays of anger on the basis of less sensory input than did controls, which suggests that physically abused children have facilitated access to representations of anger. The findings are discussed in terms of experiential processes in perceptual learning.

Journal ArticleDOI
TL;DR: It is suggested that affective experiences can influence perceptual representations of basic emotions and alter the discriminative abilities that influence how children categorize angry facial expressions.
Abstract: A fundamental issue in human development concerns how the young infant's ability to recognize emotional signals is acquired through both biological programming and learning factors. This issue is extremely difficult to investigate because of the variety of sensory experiences to which humans are exposed immediately after birth. We examined the effects of emotional experience on emotion recognition by studying abused children, whose experiences violated cultural standards of care. We found that the aberrant social experience of abuse was associated with a change in children's perceptual preferences and also altered the discriminative abilities that influence how children categorize angry facial expressions. This study suggests that affective experiences can influence perceptual representations of basic emotions.

Journal ArticleDOI
TL;DR: Differences between subjects high and low in emotional empathy appeared to be related to differences in automatic somatic reactions to facial stimuli rather than to differences in their conscious interpretation of the emotional situation.
Abstract: The hypotheses of this investigation were derived by conceiving of automatic mimicking as a component of emotional empathy. Differences between subjects high and low in emotional empathy were investigated. The parameters compared were facial mimicry reactions, as represented by electromyographic (EMG) activity when subjects were exposed to pictures of angry or happy faces, and the degree of correspondence between subjects' facial EMG reactions and their self-reported feelings. The comparisons were made at different stimulus exposure times in order to elicit reactions at different levels of information processing. The high-empathy subjects were found to have a higher degree of mimicking behavior than the low-empathy subjects, a difference that emerged at short exposure times (17-40 ms) that represented automatic reactions. The low-empathy subjects tended already at short exposure times (17-40 ms) to show inverse zygomaticus muscle reactions, namely "smiling" when exposed to an angry face. The high-empathy group was characterized by a significantly higher correspondence between facial expressions and self-reported feelings. No differences were found between the high- and low-empathy subjects in their verbally reported feelings when presented a happy or an angry face. Thus, the differences between the groups in emotional empathy appeared to be related to differences in automatic somatic reactions to facial stimuli rather than to differences in their conscious interpretation of the emotional situation.

Journal ArticleDOI
TL;DR: In this article, the authors explore whether spontaneously evoked facial reactions can be evaluated in terms of criteria for what characterises an automatic process, based on a model in which the facial muscles can be both automatically/ involuntarily controlled and voluntarily controlled by conscious processes.
Abstract: Based on a model in which the facial muscles can be both automatically/ involuntarily controlled and voluntarily controlled by conscious processes, we explore whether spontaneously evoked facial reactions can be evaluated in terms of criteria for what characterises an automatic process. In three experiments subjects were instructed to not react with their facial muscles, or to react as quickly as possible by wrinkling the eyebrows (frowning) or elevating the cheeks (smiling) when exposed to pictures of negative or positive emotional stimuli, while EMG activity was measured from the corrugator supercilii and zygomatic major muscle regions. Consistent with the proposition that facial reactions are automatically controlled, the results showed that the corrugator muscle reaction was facilitated to negative stimuli and the zygomatic muscle reaction was facilitated to positive stimuli. The results further showed that, despite the fact that subjects were required to not react with their facial muscles at all, they could not avoid producing a facial reaction that corresponded to the negative and positive stimuli.

Journal ArticleDOI
TL;DR: The human face is an evolved adaptation for social communication, implying that humans are genetically prepared to produce facial gestures that observers decode automatically; however, the amygdala may not be specialized for processing emotional faces, but may instead respond to faces because they provide important information for the defense appraisal that is its primary responsibility.
Abstract: The human face is an evolved adaptation for social communication. This implies that humans are genetically prepared to produce facial gestures that are automatically decoded by observers. Psychophysiological data demonstrate that humans respond automatically with their facial muscles, with autonomic responses, and with specific regional brain activation of the amygdala when exposed to emotionally expressive faces. Attention is preferentially and automatically oriented toward facial threat. Neuropsychological data, as well as a rapidly expanding brain-imaging literature, implicate the amygdala as a central structure for responding to negative emotional faces, and particularly to fearful ones. However, the amygdala may not be specialized for processing emotional faces, but may instead respond to faces because they provide important information for the defense appraisal that is its primary responsibility.

Journal ArticleDOI
TL;DR: The results support the central role of the amygdala in emotion processing, and indicate its sensitivity to the task relevance of the emotional display.

Journal ArticleDOI
Elaine Fox1
TL;DR: In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field.
Abstract: In this paper, the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions is investigated. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, it was found that faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed.

Journal ArticleDOI
TL;DR: Results from 5 experiments provide converging evidence that automatic evaluation of faces in sequential priming paradigms reflects affective responses to phenotypic features per se rather than evaluation of the racial categories to which the faces belong, and suggest the existence of 2 distinct types of prejudice.
Abstract: Results from 5 experiments provide converging evidence that automatic evaluation of faces in sequential priming paradigms reflects affective responses to phenotypic features per se rather than evaluation of the racial categories to which the faces belong. Experiment 1 demonstrates that African American facial primes with racially prototypic physical features facilitate more automatic negative evaluations than do other Black faces that are unambiguously categorizable as African American but have less prototypic features. Experiments 2, 3, and 4 further support the hypothesis that these differences reflect direct affective responses to physical features rather than differential categorization. Experiment 5 shows that automatic responses to facial primes correlate with cue-based but not category-based explicit measures of prejudice. Overall, these results suggest the existence of 2 distinct types of prejudice.

Journal ArticleDOI
TL;DR: A broader role for the amygdala in modulating the vigilance level during the perception of several negative and positive facial emotions is suggested.
Abstract: Most theories of amygdalar function have underscored its role in fear. One broader theory suggests that neuronal activation of the amygdala in response to fear-related stimuli represents only a portion of its more widespread role in modulating an organism's vigilance level. To further explore this theory, the amygdalar response to happy, sad, angry, fearful, and neutral faces in 17 subjects was characterized using 3 T fMRI. Utilizing a random effects model and hypothesis-driven analytic strategy, it was observed that each of the four emotional faces was associated with reliable bilateral activation of the amygdala compared with neutral. These findings suggest a broader role for the amygdala in modulating the vigilance level during the perception of several negative and positive facial emotions. NeuroReport 13:1737-1741, 2002, Lippincott Williams & Wilkins.

Journal ArticleDOI
TL;DR: A feature-based face recognition system that uses both 3D range data and 2D gray-level facial images is described; the best match in the model library is identified according to a similarity function or a support vector machine.
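The paper's own code and feature-extraction pipeline are not reproduced here; as a hedged sketch, the nearest-match step it describes can be illustrated with made-up feature vectors (standing in for features from 3D range data concatenated with 2D gray-level features) and cosine similarity substituted for the paper's similarity function:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(probe, library):
    """Return the library identity whose feature vector is most similar to the probe."""
    return max(library, key=lambda name: cosine_similarity(probe, library[name]))

# Toy model library: identities mapped to hypothetical feature vectors.
library = {
    "subject_a": [1.0, 0.1, 0.0, 0.9],
    "subject_b": [0.0, 1.0, 0.8, 0.1],
}
probe = [0.9, 0.2, 0.1, 0.8]
print(best_match(probe, library))  # prints "subject_a"
```

In the paper, an SVM can replace this similarity function as the matching criterion; the overall structure (probe features scored against each library entry, highest score wins) is the same.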

Journal ArticleDOI
01 Mar 2002-Emotion
TL;DR: The results support the facial feedback hypothesis and suggest that facial feedback has more powerful effects when facial configurations represent valid analogs of basic emotional expressions.
Abstract: This study examined the modulatory function of Duchenne and non-Duchenne smiles on subjective and autonomic components of emotion. Participants were asked to hold a pencil in their mouth to either facilitate or inhibit smiles and were not instructed to contract specific muscles. Five conditions--namely lips pressing, low-level non-Duchenne smiling, high-level non-Duchenne smiling, Duchenne smiling, and control--were produced while participants watched videoclips that were evocative of positive or negative affect. Participants who displayed Duchenne smiles reported more positive experience when pleasant scenes and humorous cartoons were presented. Furthermore, they tended to exhibit different patterns of autonomic arousal when viewing positive scenes. These results support the facial feedback hypothesis and suggest that facial feedback has more powerful effects when facial configurations represent valid analogs of basic emotional expressions.

Journal ArticleDOI
TL;DR: The differential developmental course of speed and accuracy indicates that speed is the more sensitive measure as children get older, suggesting that speed of performance, in addition to accuracy, might be successfully used in the assessment of clinical deficits, as has recently been demonstrated in children with autistic disorders of social contact.
Abstract: As yet, nearly all studies in face and facial affect recognition typically provide only data on the accuracy of processing, invariably also in the absence of reference data on information processing. In this study, accuracy and speed of abstract visuo-spatial processing, face recognition, and facial emotion recognition were investigated in normal school children (7–10 years) and adults (25±4 years). In the age range of 7–10 years, accuracy of facial processing hardly increased, while speed did substantially increase with age. Adults, however, were substantially more accurate and faster than children. Differences between facial and abstract information processing were related to type of processing strategy, that is, configural or holistic processing versus featural or piecemeal processing. Improvement in task performance with age is discussed in terms of an enhanced efficiency of the configural organization of facial knowledge (facial information processing tasks), together with a further increase in proce...

Journal ArticleDOI
TL;DR: This article shows that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories of facial expression recognition.
Abstract: There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
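The abstract does not specify the model's architecture beyond "a simple yet biologically plausible neural network"; the toy sketch below is not the authors' network, and its read-out weights and gain are made up. It only illustrates the reconciling idea: a graded, continuous internal representation fed through a softmax read-out still yields sharp category labels along a happiness-to-fear morph continuum.

```python
import math

EMOTIONS = ["happiness", "surprise", "fear", "sadness", "disgust", "anger"]

def softmax(logits):
    """Convert raw scores into a graded probability distribution."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def morph_logits(t, gain=8.0, baseline=-3.0):
    """Hypothetical linear read-out for a face morphed from happiness (t=0)
    to fear (t=1); gain and baseline are illustrative, not model parameters."""
    logits = [baseline] * len(EMOTIONS)
    logits[EMOTIONS.index("happiness")] = gain * (0.5 - t)
    logits[EMOTIONS.index("fear")] = gain * (t - 0.5)
    return logits

def classify(t):
    """Winner-take-all label for a morph position t."""
    probs = softmax(morph_logits(t))
    return EMOTIONS[probs.index(max(probs))]

# Labels flip sharply at the category boundary (t = 0.5) even though the
# underlying probabilities vary smoothly along the morph continuum.
for t in (0.0, 0.25, 0.45, 0.55, 0.75, 1.0):
    print(t, classify(t))
```

Near t = 0.5 the probability mass is split almost evenly between happiness and fear (a graded, continuous-space signature), while the discrete label changes abruptly (a categorical-perception signature), which is the sense in which one mechanism can predict data cited for both theories.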

Journal ArticleDOI
TL;DR: Testing six patients with frontal variant frontotemporal dementia on a series of face perception tasks (including facial identity and facial expression recognition) and on a test of vocal emotion recognition yields results consistent with the idea that fvFTD affects the recognition of emotional signals from multiple modalities rather than facial expression processing alone.

Journal ArticleDOI
TL;DR: This paper investigated the accuracy of judging intelligence from facial photos of strangers across the lifespan, the facial qualities contributing to accuracy, and the developmental paths producing correlations between facial qualities and IQ scores, finding that judgments were more accurate than chance in childhood and puberty, marginally more accurate in middle adulthood, but not more accurate than chance in adolescence or late adulthood.
Abstract: The authors investigated accuracy of judging intelligence from facial photos of strangers across the lifespan, facial qualities contributing to accuracy, and developmental paths producing correlations between facial qualities and IQ scores. Judgments were more accurate than chance in childhood and puberty, marginally more accurate in middle adulthood, but not more accurate than chance in adolescence or late adulthood. Reliance on the valid cue of facial attractiveness could explain judges’ accuracy. Multiple developmental paths contributed to relationships between facial attractiveness and IQ: biological, environmental, influences of intelligence on attractiveness, influences of attractiveness on intelligence. The findings provide a caveat to evolutionary psychologists’ assumption that relationships between attractiveness and intelligence or other traits reflect an influence of “good genes” on both, as well as to social and developmental psychologists’ assumption that such relationships reflect self-fulfi...

Journal ArticleDOI
01 Oct 2002-Brain
TL;DR: The data indicate that tvFTD is associated with impairments in emotional processing that may underlie some behavioural problems in this disorder, and that the emergence of such deficits depends on the specific pattern of anatomical injury.
Abstract: Frontotemporal dementia (FTD) is a neurodegenerative disease characterized by behavioural disorders that suggest abnormalities of emotional processing. Patients with the temporal variant of FTD (tvFTD) are particularly at risk for developing deficits in emotional processing secondary to atrophy in the amygdala, anterior temporal cortex (ATC) and orbital frontal cortex (OFC), structures that are components of the brain's emotional processing systems. In addition, previous studies have suggested that predominantly right, as opposed to left temporal atrophy is more likely to be associated with behavioural and emotional impairments in tvFTD. However, emotional processing has never been assessed directly in this group. We examined one aspect of emotional processing, namely the comprehension of facial expressions of emotion (emotional comprehension) in nine individuals with tvFTD, and correlated performance on this measure with atrophy (as measured from T(1)-weighted MRI scans by region of interest analysis) in the amygdala, ATC and OFC. Compared with age-matched controls, the tvFTD group was impaired in emotional comprehension, with more severe impairment for emotions with negative valence, including sadness, anger and fear, than for happiness. Emotional comprehension was correlated with atrophy in the right amygdala and the right OFC, and not with atrophy in other structures. When individual profiles of amygdala atrophy were examined across patients and compared with control values, right amygdala atrophy was always accompanied by left amygdala atrophy, whereas patients with volume loss in the left amygdala could have normal or decreased right amygdala volumes. Thus, emotional comprehension appeared to be most impaired when bilateral amygdala atrophy was present, and was not associated with the degree of left amygdala atrophy. 
Our data indicate that tvFTD is associated with impairments in emotional processing that may underlie some behavioural problems in this disorder, and that the emergence of such deficits depends on the specific pattern of anatomical injury. These results have implications both for the clinical presentation in tvFTD patients and for the study of the neuroanatomical basis of emotion.

Journal ArticleDOI
01 Dec 2002-Emotion
TL;DR: Psychopaths were less accurate than nonpsychopaths at classifying facial affect under conditions promoting reliance on right-hemisphere resources and displayed a specific deficit in classifying disgust.
Abstract: Prior studies provide consistent evidence of deficits for psychopaths in processing verbal emotional material but are inconsistent regarding nonverbal emotional material. To examine whether psychopaths exhibit general versus specific deficits in nonverbal emotional processing, 34 psychopaths and 33 nonpsychopaths identified with Hare's (R. D. Hare, 1991) Psychopathy Checklist--Revised were asked to complete a facial affect recognition test. Slides of prototypic facial expressions were presented. Three hypotheses regarding hemispheric lateralization anomalies in psychopaths were also tested (right-hemisphere dysfunction, reduced lateralization, and reversed lateralization). Psychopaths were less accurate than nonpsychopaths at classifying facial affect under conditions promoting reliance on right-hemisphere resources and displayed a specific deficit in classifying disgust. These findings demonstrate that psychopaths exhibit specific deficits in nonverbal emotional processing.

Journal ArticleDOI
01 Mar 2002-Emotion
TL;DR: Recognizing emotions from prosody draws on the right frontoparietal operculum, the bilateral frontal pole, and the left frontal operculum, while the right frontoparietal cortex supports recognition of emotion from both prosody and facial expressions.
Abstract: Which brain regions are associated with recognition of emotional prosody? Are these distinct from those for recognition of facial expression? These issues were investigated by mapping the overlaps of co-registered lesions from 66 brain-damaged participants as a function of their performance in rating basic emotions. It was found that recognizing emotions from prosody draws on the right frontoparietal operculum, the bilateral frontal pole, and the left frontal operculum. Recognizing emotions from prosody and facial expressions draws on the right frontoparietal cortex, which may be important in reconstructing aspects of the emotion signaled by the stimulus. Furthermore, there were regions in the left and right temporal lobes that contributed disproportionately to recognition of emotion from faces or prosody, respectively.