Showing papers on "Facial expression published in 1993"


Journal ArticleDOI
TL;DR: Response specificity, particularly facial expressiveness, supported the view that specific affects have unique patterns of reactivity, and consistency of the dimensional relationships between evaluative judgments and physiological response emphasizes that emotion is fundamentally organized by these motivational parameters.
Abstract: Colored photographic pictures that varied widely across the affective dimensions of valence (pleasant-unpleasant) and arousal (excited-calm) were each viewed for a 6-s period while facial electromyographic (zygomatic and corrugator muscle activity) and visceral (heart rate and skin conductance) reactions were measured. Judgments relating to pleasure, arousal, interest, and emotional state were measured, as was choice viewing time. Significant covariation was obtained between (a) facial expression and affective valence judgments and (b) skin conductance magnitude and arousal ratings. Interest ratings and viewing time were also associated with arousal. Although differences due to the subject's gender and cognitive style were obtained, affective responses were largely independent of the personality factors investigated. Response specificity, particularly facial expressiveness, supported the view that specific affects have unique patterns of reactivity. The consistency of the dimensional relationships between evaluative judgments (i.e., pleasure and arousal) and physiological response, however, emphasizes that emotion is fundamentally organized by these motivational parameters.
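
The dimensional covariations reported here (facial EMG with rated valence, skin conductance magnitude with rated arousal) amount to simple correlations over stimuli. A minimal sketch of that check, with hypothetical per-picture values standing in for the study's measurements:

```python
import numpy as np

# Hypothetical per-picture means: EMG activity (microvolts), skin conductance
# (microsiemens), and valence/arousal ratings. Illustrative values only.
zygomatic = np.array([4.2, 3.1, 5.0, 2.2, 4.8, 1.9])
corrugator = np.array([1.0, 2.8, 0.7, 3.5, 0.9, 3.9])
skin_cond = np.array([0.42, 0.18, 0.55, 0.21, 0.60, 0.15])
valence = np.array([7.5, 3.2, 8.1, 2.4, 7.9, 2.0])
arousal = np.array([6.8, 3.0, 7.4, 3.9, 7.7, 2.5])

# Facial-valence covariation: zygomatic activity should rise, and corrugator
# activity fall, with rated pleasantness.
print("zygomatic vs valence  r =", np.corrcoef(zygomatic, valence)[0, 1])
print("corrugator vs valence r =", np.corrcoef(corrugator, valence)[0, 1])

# Autonomic-arousal covariation: conductance magnitude vs arousal ratings.
print("conductance vs arousal r =", np.corrcoef(skin_cond, arousal)[0, 1])
```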

3,089 citations


Journal ArticleDOI
TL;DR: In this paper, cross-cultural research on facial expression and the development of methods to measure facial expression are summarized, and what has been learned about emotion from this work on the face is elucidated.
Abstract: Cross-cultural research on facial expression and the development of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed: What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion?

2,155 citations


Journal ArticleDOI
TL;DR: An estimation technique that uses deformable contour models (snakes) to track the nonrigid motions of facial features in video images is developed and estimates muscle actuator controls with sufficient accuracy to permit the face model to resynthesize transient expressions.
Abstract: An approach to the analysis of dynamic facial images for the purposes of estimating and resynthesizing dynamic facial expressions is presented. The approach exploits a sophisticated generative model of the human face originally developed for realistic facial animation. The face model, which may be simulated and rendered at interactive rates on a graphics workstation, incorporates a physics-based synthetic facial tissue and a set of anatomically motivated facial muscle actuators. The estimation of dynamical facial muscle contractions from video sequences of expressive human faces is considered. An estimation technique that uses deformable contour models (snakes) to track the nonrigid motions of facial features in video images is developed. The technique estimates muscle actuator controls with sufficient accuracy to permit the face model to resynthesize transient expressions.
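
The deformable-contour tracking described here is, at its core, an energy minimization that pulls a contour toward image edges while keeping it evenly spaced. The sketch below implements the greedy variant of snake fitting (one point moves at a time); it illustrates the general technique rather than the paper's implementation, and the energy weights are arbitrary:

```python
import numpy as np

def greedy_snake_step(contour, edge_map, alpha=0.5, beta=1.0):
    """One greedy update of a closed snake: each point moves to the 3x3
    neighbour that minimises continuity + edge-attraction energy.
    contour: (N, 2) float array of (row, col) points.
    edge_map: 2-D array of gradient magnitudes (higher = stronger edge)."""
    spacing = np.linalg.norm(
        np.diff(contour, axis=0, append=contour[:1]), axis=1).mean()
    new_contour = contour.copy()
    for i in range(len(contour)):
        best_pt, best_e = contour[i], np.inf
        prev_pt = new_contour[i - 1]  # wraps to the last point when i == 0
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                cand = contour[i] + (dr, dc)
                r, c = int(cand[0]), int(cand[1])
                if not (0 <= r < edge_map.shape[0] and 0 <= c < edge_map.shape[1]):
                    continue
                # Continuity keeps points evenly spaced; the edge term pulls
                # the contour toward strong gradients (e.g. a lip boundary).
                e = alpha * (np.linalg.norm(cand - prev_pt) - spacing) ** 2 \
                    - beta * edge_map[r, c]
                if e < best_e:
                    best_pt, best_e = cand, e
        new_contour[i] = best_pt
    return new_contour  # iterate until the contour stops moving
```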

602 citations


Journal ArticleDOI
TL;DR: The authors found that both emotional and nonemotional feelings can be influenced by uninterpreted proprioceptive input: success led to greater pride when the outcome was received in an upright rather than a slumped posture, and nonemotional feelings of effort were influenced by contraction of the forehead muscle (corrugator), with Ss' self-ratings on a trait dimension reflecting this experience when the facial contraction was maintained during the recall of behavioral episodes exemplifying this trait.
Abstract: This article reports 2 experiments that test whether both emotional and nonemotional feelings may be influenced by uninterpreted proprioceptive input. The logic of the procedure was adopted from studies by F. Strack, L. Martin, and S. Stepper (1988), who unobtrusively manipulated people's facial expressions. In the 1st experiment, a functionally equivalent technique was used to vary the posture of the body. Study 1 results revealed that success at an achievement task led to greater feelings of pride if the outcome was received in an upright position rather than in a slumped posture. Study 2 results revealed that nonemotional feelings of effort were influenced by contraction of the forehead muscle (corrugator), and Ss' self-ratings on a trait dimension reflected this experience when the facial contraction was maintained during the recall of behavioral episodes exemplifying this trait. To account for these results, a framework is proposed that draws on a distinction between noetic and experiential representations.

585 citations


Journal ArticleDOI
TL;DR: Simulations with long image sequences of real-world scenes indicate that the approach to estimating the motion of the head and facial expressions in model-based facial image coding not only greatly reduces computational complexity but also substantially improves estimation accuracy.
Abstract: An approach to estimating the motion of the head and facial expressions in model-based facial image coding is presented. An affine nonrigid motion model is set up. The specific knowledge about facial shape and facial expression is formulated in this model in the form of parameters. A direct method of estimating the two-view motion parameters that is based on the affine method is discussed. Based on the reasonable assumption that the 3-D motion of the face is almost smooth in the time domain, several approaches to predicting the motion of the next frame are proposed. Using a 3-D model, the approach is characterized by a feedback loop connecting computer vision and computer graphics. Embedding the synthesis techniques into the analysis phase greatly improves the performance of motion estimation. Simulations with long image sequences of real-world scenes indicate that the method not only greatly reduces computational complexity but also substantially improves estimation accuracy.
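
The two-view affine motion estimate at the heart of this coder can be illustrated with ordinary least squares. The paper's direct method works on image intensities rather than tracked points, so the point-correspondence version below is a simplification for illustration; the constant-velocity predictor reflects its temporal-smoothness assumption:

```python
import numpy as np

def estimate_affine(pts_prev, pts_next):
    """Least-squares two-view affine motion from point correspondences:
    solves [x', y']^T = A [x, y]^T + t for the six parameters."""
    n = len(pts_prev)
    M = np.zeros((2 * n, 6))
    b = np.asarray(pts_next, dtype=float).reshape(-1)
    for i, (x, y) in enumerate(pts_prev):
        M[2 * i] = [x, y, 1, 0, 0, 0]      # equation for x'
        M[2 * i + 1] = [0, 0, 0, x, y, 1]  # equation for y'
    p, *_ = np.linalg.lstsq(M, b, rcond=None)
    A = np.array([[p[0], p[1]], [p[3], p[4]]])
    t = np.array([p[2], p[5]])
    return A, t

def predict_next(p_prev, p_curr):
    """Constant-velocity extrapolation of a parameter vector, in the
    spirit of the paper's smooth-motion assumption."""
    return 2 * np.asarray(p_curr) - np.asarray(p_prev)

# Tiny synthetic check: recover a known affine warp from three points.
prev = np.array([[10., 10.], [40., 12.], [25., 30.]])
nxt = prev @ np.array([[1.02, 0.01], [-0.01, 1.02]]).T + [0.5, -0.3]
A, t = estimate_affine(prev, nxt)
print(A, t)
```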

405 citations


Journal ArticleDOI
01 Aug 1993-Brain
TL;DR: Response latency data confirmed the finding of a selective deficit in the processing of facial expressions, but produced evidence suggesting that impairments affecting familiar face recognition and unfamiliar face matching were not completely independent from each other in this group of ex-servicemen.
Abstract: Current theoretical models of face perception postulate separate routes for processing information needed in the recognition of a familiar face, for matching photographs of unfamiliar faces and for the analysis of facial expressions. The present study investigated this claim in a group of ex-servicemen who had sustained unilateral brain injuries affecting posterior areas of the left or right cerebral hemisphere. Care was taken to confirm the nature of impairment by using two different tasks to assess each of the three theoretically defined abilities (leading to a total of six tasks). We adopted a stringent application of the double dissociation methodology to investigate the pattern of performance across tasks of individual ex-servicemen. A selective impairment was defined as a significantly impoverished performance on both tests of a specific ability, while all other tasks were performed within normal limits. In addition, we used both accuracy and response latency measures to substantiate evidence for spared or defective abilities. The results showed selective impairments of all three abilities on accuracy scores. Response latency data confirmed the finding of a selective deficit in the processing of facial expressions, but produced evidence suggesting that impairments affecting familiar face recognition and unfamiliar face matching were not completely independent from each other in this group of ex-servicemen.
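
The stringent selective-impairment criterion described above is mechanical enough to state as code. A sketch, with hypothetical task names and z-score cutoffs standing in for the study's actual tests and norms:

```python
def selective_impairments(z_scores, cutoff, ability_tests):
    """An ability counts as selectively impaired only if BOTH of its tests
    fall below the cutoff while every other task is within normal limits."""
    impaired = {task for task, z in z_scores.items() if z < cutoff}
    hits = []
    for ability, tests in ability_tests.items():
        others = set(z_scores) - set(tests)
        if all(t in impaired for t in tests) and not (impaired & others):
            hits.append(ability)
    return hits

# Hypothetical task names: two tasks per theoretically defined ability.
abilities = {
    "familiar face recognition": ["famous_naming", "familiarity_decision"],
    "unfamiliar face matching": ["front_profile_match", "array_match"],
    "expression analysis": ["expression_naming", "expression_match"],
}
z = {"famous_naming": -2.4, "familiarity_decision": -2.1,
     "front_profile_match": 0.3, "array_match": -0.5,
     "expression_naming": 0.1, "expression_match": 0.4}
print(selective_impairments(z, cutoff=-1.96, ability_tests=abilities))
# -> ['familiar face recognition']
```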

298 citations


Journal ArticleDOI
TL;DR: It is concluded that inappropriate reactions to others' emotions may maintain or increase depression.
Abstract: The present study investigated the recognition of, and responses to, facial expressions of emotion. Participants were all women and consisted of the following groups: (a) Sixteen depressed college students; (b) 16 nondepressed college students; (c) 16 depressed psychiatric patients; and (d) 11 nondepressed psychiatric patients. Results suggest that both depressed groups, relative to the nondepressed college group, made more errors in recognizing the facial expressions and reported more freezing or tensing; higher fear and depression reactions; and less comfort with their own emotional reactions to these expressions and a stronger desire to change these reactions. Few differences were found between the depressed psychiatric patients and the psychiatric control subjects. It is concluded that inappropriate reactions to others' emotions may maintain or increase depression.

282 citations


Journal ArticleDOI
TL;DR: Four experiments investigating recognition of emotional expressions in very briefly presented facial stimuli found that stimulus onset asynchrony between target and mask was the principal factor influencing recognition of the masked expressions.
Abstract: Four experiments are reported investigating recognition of emotional expressions in very briefly presented facial stimuli. The faces were backwardly masked by neutral facial displays, and recognition of facial expressions was analyzed as a function of the manipulation of different parameters in the masking procedure. The main conclusion was that stimulus onset asynchrony between target and mask proved to be the principal factor influencing recognition of the masked expressions. In general, confident recognition of facial expressions required about 100-150 msec, with shorter times for happy than for angry expressions. Manipulating the duration of either the target or the mask, by itself, had only minimal effects.
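
Because the decisive variable is stimulus onset asynchrony rather than stimulus duration, the masking manipulation is easy to schematize. A sketch with hypothetical trial timings; the ~100-150 ms figure is the recognition threshold the experiments report:

```python
def soa_ms(target_onset_ms, mask_onset_ms):
    """Stimulus onset asynchrony: interval between target onset and mask
    onset, independent of how long either stimulus stays on screen."""
    return mask_onset_ms - target_onset_ms

# Schematic trials (hypothetical timings): same SOA with different target
# durations should yield similar recognition, while lengthening SOA past
# ~100-150 ms permits confident recognition of the expression.
trials = [
    {"target_on": 0, "target_dur": 17, "mask_on": 50},   # brief target, 50 ms SOA
    {"target_on": 0, "target_dur": 50, "mask_on": 50},   # long target, same SOA
    {"target_on": 0, "target_dur": 17, "mask_on": 150},  # brief target, long SOA
]
for t in trials:
    print(t, "-> SOA =", soa_ms(t["target_on"], t["mask_on"]), "ms")
```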

268 citations


Journal ArticleDOI
TL;DR: Data are reported from two patients with face processing impairments, contrasting the processing of facial identity from static photographs with the processing of facial expression from static and moving images, indicating that expression is encoded separately from moving and from static images.

265 citations


Journal ArticleDOI
TL;DR: The results suggest the presence of deficits in the perception of nonverbal emotion in alexithymia: the high alexithymia group was significantly less able to recognize facial expressions of emotions than the low alexithymia group.
Abstract: Slides of photographs depicting posed facial expressions of nine different emotions were presented to 131 females and 85 males who were asked to identify the emotion(s) being experienced by the person in each photograph. Subjects were then administered the 20-item version of the Toronto Alexithymia Scale; the 33rd and 66th percentiles were used to categorize subjects into high, moderate, and low alexithymia groups. Results showed that the high alexithymia group was significantly less able to recognize facial expressions of emotions than the low alexithymia group. There was no significant effect for gender on the ability to recognize facial emotions. The results suggest the presence of deficits in the perception of nonverbal emotion in alexithymia.
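
The grouping step, splitting Toronto Alexithymia Scale totals at the sample's 33rd and 66th percentiles, can be sketched directly (scores below are hypothetical):

```python
import numpy as np

def tas_groups(tas_scores):
    """Split TAS totals into low / moderate / high alexithymia groups at
    the sample's 33rd and 66th percentiles, as in the study."""
    lo, hi = np.percentile(tas_scores, [33, 66])
    labels = np.where(tas_scores <= lo, "low",
                      np.where(tas_scores >= hi, "high", "moderate"))
    return labels, (lo, hi)

scores = np.array([38, 44, 51, 55, 58, 61, 63, 67, 72])  # hypothetical totals
labels, cuts = tas_groups(scores)
print("cutpoints:", cuts)
print(list(zip(scores.tolist(), labels.tolist())))
```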

262 citations


Journal ArticleDOI
TL;DR: Results suggest that the higher order functional neural network for recognizing emotion in visual input likely involves the right anterior cingulate and the bilateral inferior frontal gyri.
Abstract: The functional neuroanatomy of emotion recognition is inadequately understood despite well-documented clinical situations where emotion recognition is impaired (aprosodia). Oxygen-15 water positron-emission tomography (PET) was used to study 9 healthy women volunteers during three match-to-sample conditions, each repeated twice: a study task matching facial emotions and control tasks matching spatial positions or facial identity. Results suggest that the higher order functional neural network for recognizing emotion in visual input likely involves the right anterior cingulate and the bilateral inferior frontal gyri.

Journal ArticleDOI
TL;DR: In this paper, the authors measured emotional intensity from facial expressions and found that emotional intensity was positively correlated with the number of facial expressions.
Abstract: (1993). The Measurement of Emotional Intensity from Facial Expressions. The Journal of Social Psychology: Vol. 133, No. 5, pp. 749-750.

Journal ArticleDOI
TL;DR: In this paper, a review of selective behavioral, psychophysiological, and neuropsychological research bearing on how affective space should be parsed is presented, and the conceptual and methodological implications of this perspective are considered.
Abstract: This article reviews selective behavioral, psychophysiological, and neuropsychological research bearing on how affective space should be parsed. Neither facial expression nor autonomic nervous system activity is found to provide unique markers for particular discrete emotions. The dimensions of approach and withdrawal are introduced as fundamental systems relevant to differentiating affective space. The role of frontal and anterior temporal asymmetries in mediating approach- and withdrawal-related emotion is considered. Individual differences in tonic anterior activation asymmetry are present and are relatively stable over time. Such differences are associated with an individual's propensity to display different types of emotion, mood, and psychopathology. The conceptual and methodological implications of this perspective are considered. The purpose of this brief article is to review theory and data derived mostly from neuropsychological and psychophysiological analyses that bear on the issue of how affective space should be parsed. Included within this inquiry is how emotions are organized and differentiated from one another and the nature of the relations among the various subcomponents that compose emotions. Within this context I consider such issues as whether emotion is best understood as categorical or dimensional, the relation between facial expressions of emotion and emotional states, and psychophysiological differentiation among emotions. Space constraints preclude an exhaustive treatment of these topics. Studies are cited for illustrative purposes with an emphasis on recent research from my laboratory. My principal goal is to build a case for the importance of the approach versus withdrawal dimension.


Journal ArticleDOI
TL;DR: Model-based encoding of human facial features for narrowband visual communication is described: using an already prepared 3D human model, the method detects a person's body motion and facial expressions, transmits them as compact codes, and uses them at the receiving end to modify the 3D model and thereby generate lifelike human images.
Abstract: Model-based encoding of human facial features for narrowband visual communication is described. Based on an already prepared 3D human model, this coding method detects and understands a person's body motion and facial expressions. It expresses the essential information as compact codes and transmits it. At the receiving end, this code becomes the basis for modifying the 3D model of the person and thereby generating lifelike human images. The feature extraction used by the system to acquire data for regions or edges that express the eyes, nose, mouth, and outlines of the face and hair is discussed. The way in which the system creates a 3D model of the person by using the features extracted in the first part to modify a generic head model is also discussed.
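
The bandwidth argument behind model-based coding is that each frame reduces to a short parameter vector. A toy sketch: the parameter set and encoding below are hypothetical, chosen only to show the scale of the savings over raw video:

```python
import struct
from dataclasses import dataclass

@dataclass
class FrameParams:
    """Hypothetical per-frame code: global head pose plus a few facial-action
    intensities, transmitted instead of pixel data."""
    yaw: float
    pitch: float
    roll: float
    tx: float
    ty: float
    tz: float
    actions: tuple  # e.g. (jaw_drop, lip_stretch, brow_raise), each in [0, 1]

def encode(p: FrameParams) -> bytes:
    """Pack the parameters as 32-bit floats for transmission."""
    vals = (p.yaw, p.pitch, p.roll, p.tx, p.ty, p.tz, *p.actions)
    return struct.pack(f"{len(vals)}f", *vals)

code = encode(FrameParams(0.05, -0.02, 0.0, 0.1, 0.0, 0.0, (0.3, 0.1, 0.7)))
# 9 floats = 36 bytes/frame, vs 352*288*1.5 = 152,064 bytes for a raw
# 4:2:0 CIF frame: the receiver re-renders the face from its 3D model.
print(len(code), "bytes per frame")
```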

Journal ArticleDOI
23 Jul 1993-Science
TL;DR: It is found that the right hemisphere determines facial expression while the left hemisphere processes species-typical vocal signals, which suggests that human and nonhuman primates exhibit the same pattern of brain asymmetry for communication.
Abstract: In humans, the left side of the face (right hemisphere of the brain) is dominant in emotional expression. In rhesus monkeys, the left side of the face begins to display facial expression earlier than the right side and is more expressive. Humans perceive rhesus chimeras created by pairing the left half of the face with its mirror-reversed duplicate as more expressive than chimeras created by right-right pairings. That the right hemisphere determines facial expression, and the left hemisphere processes species-typical vocal signals, suggests that human and nonhuman primates exhibit the same pattern of brain asymmetry for communication.

Journal ArticleDOI
TL;DR: In most tests, reaction time was found to increase steeply with sample size, thus indicating serial-search characteristics for the patterns tested, and there were considerable differences in the slopes of the graphs (search time versus sample size), which could be attributed to visual cues that are discriminated at similar speeds.
Abstract: Subjects were asked to detect faces or facial expressions from patterns with a variable number of nonfaces or faces expressing different emotions. In most tests, reaction time was found to increase steeply with sample size, thus indicating serial-search characteristics for the patterns tested. There were, however, considerable differences in the slopes of the graphs (search time versus sample size), which could be attributed to visual (but not face) cues that are discriminated at similar speeds. Slopes did not change when patterns were presented upside down, although such a modification strongly affects the perception of faces and facial expressions.
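
The serial-search signature the authors describe is a steep linear increase of reaction time with display size; the slope is just a least-squares fit. A sketch with hypothetical mean reaction times:

```python
import numpy as np

# Hypothetical mean reaction times (ms) at each display size. A steep
# positive slope is the serial-search signature reported in the paper.
set_size = np.array([1, 2, 4, 8])
rt_ms = np.array([620, 710, 905, 1290])

slope, intercept = np.polyfit(set_size, rt_ms, 1)
print(f"search slope ~ {slope:.0f} ms/item, intercept ~ {intercept:.0f} ms")
# Near-flat slopes would instead indicate parallel "pop-out" search;
# the paper finds the steep slopes survive inversion of the patterns.
```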

Journal ArticleDOI
TL;DR: To examine the effects of visuospatial and facial processing on facial emotion tasks, the Visual Matrices Test and the Benton Facial Recognition Test were administered; some support was provided for the notion that negative-symptom schizophrenia is associated with right hemisphere dysfunction.
Abstract: Deficits in the perception of facial emotion have been demonstrated in patients with right-sided brain damage (RBD) and schizophrenia (SZ). Furthermore, recent speculations have implicated right-hemisphere dysfunction in Type II schizophrenics, especially those with a preponderance of "negative symptoms" and flat affect. The performance of SZ, RBD, and normal control subjects was compared on measures assessing facial emotional perception. Both identification and discrimination paradigms were used, with positive/pleasant and negative/unpleasant emotions. To examine the effects of visuospatial and facial processing on facial emotion tasks, the Visual Matrices Test and the Benton Facial Recognition Test were administered. On both facial emotion tests, SZ and RBD patients were significantly impaired relative to normal subjects, but not different from each other. The SZ and RBD patients were also impaired on the matrices and facial recognition tests. When the effects of the matrices and neutral face recognition tests were statistically controlled, significant group differences remained for the identification task but not for the discrimination task. Thus, methodologies are presented for the neuropsychological study of facial emotional perception, and some support is provided for the notion that negative-symptom schizophrenia is associated with right hemisphere dysfunction.

Proceedings ArticleDOI
01 May 1993
TL;DR: This work is attempting to introduce facial displays into computer-human interaction as a new modality to make the interaction tighter and more efficient while lessening the cognitive load.
Abstract: The human face is an independent communication channel that conveys emotional and conversational signals encoded as facial displays. Facial displays can be viewed as communicative signals that help coordinate conversation. We are attempting to introduce facial displays into computer-human interaction as a new modality. This will make the interaction tighter and more efficient while lessening the cognitive load. As the first step, a speech dialogue system was selected to investigate the power of communicative facial displays. We analyzed the conversations between users and the speech dialogue system, to which facial displays had been added. We found that conversation with the system featuring facial displays was more successful than that with a system without facial displays.

Journal ArticleDOI
TL;DR: It is proposed that the effectiveness of many common expressive therapies would be enhanced if clients are encouraged to both express their feelings nonverbally and to put their experiences into words.
Abstract: The spontaneous nonverbal expression of emotion is related to immediate reductions in autonomic nervous system activity. Similar changes in specific autonomic channels occur when individuals are encouraged to verbally express their emotions. Indeed, these physiological changes are most likely to occur among individuals who are either verbally or nonverbally highly expressive. These data suggest that when individuals must actively inhibit emotional expression, they are at increased risk for a variety of health problems. Several experiments are summarized which indicate that verbally expressing traumatic experiences by writing or talking improves physical health, enhances immune function, and is associated with fewer medical visits. Although less research is available regarding nonverbal expression, it is also likely that the nonverbal expression of emotion bears some relation to health status. We propose that the effectiveness of many common expressive therapies (e.g., art, music, cathartic) would be enhanced if clients are encouraged to both express their feelings nonverbally and to put their experiences into words.

Journal ArticleDOI
TL;DR: In this paper, a methodological note on a potential problem with a forced-choice response scale in the study of facial expressions of emotion is presented. But the authors do not consider the effect of forced choice on the performance of emotion classification.
Abstract: This article is a methodological note on a potential problem with a forced-choice response scale in the study of facial expressions of emotion. For example, a majority of subjects categorized Matsumoto and Ekman's (1988) reported facial expression of “anger” as contempt when using one forced-choice format, as disgust with another format, and as frustration with a third. When shown the anger expression and given a choice among anger, frustration, and other labels, few subjects (12.5% on average) selected anger. If contempt, disgust, and frustration are considered wrong answers, then forced choice can yield consensus on the wrong answer; if anger is the right answer, then forced choice can fail to yield consensus on the right answer.
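
The paper's point, that forced choice can manufacture consensus on a wrong label, is visible in the modal-response computation itself. A sketch with invented response counts (only the 12.5% anger figure above is the paper's):

```python
from collections import Counter

def modal_choice(responses):
    """Modal label and its share of responses for one response format."""
    label, count = Counter(responses).most_common(1)[0]
    return label, count / len(responses)

# Hypothetical response sets for the same "anger" photograph under three
# forced-choice formats; the options offered to subjects differ per format.
formats = {
    ("anger", "contempt", "disgust"):  ["contempt"] * 14 + ["disgust"] * 4 + ["anger"] * 2,
    ("anger", "disgust", "fear"):      ["disgust"] * 13 + ["anger"] * 5 + ["fear"] * 2,
    ("anger", "frustration", "other"): ["frustration"] * 15 + ["anger"] * 3 + ["other"] * 2,
}
for options, resp in formats.items():
    print(options, "->", modal_choice(resp))
# Each format yields a confident modal answer, and none of them is "anger".
```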

Journal ArticleDOI
Rita J. Casey1
TL;DR: This study assessed the influence of social evaluation on children's emotional experience and understanding, and found that girls displayed more positive and negative emotion than boys in response to social feedback and were also more accurate in reporting their initial facial expression.
Abstract: This study assessed the influence of social evaluation on children's emotional experience and understanding. Sixty-six younger and older children (M ages = 7.12 and 12.06 years) were videotaped as they played a game, during which they received mild positive or negative feedback from another child of the same age and gender. Children's emotion report and understanding of their emotional responses were obtained in a postgame interview. Feedback valence influenced children's emotion expression, self-report, and their understanding of emotion. Girls displayed more positive and negative emotion than boys in response to social feedback and were also more accurate in reporting their initial facial expression.

Journal ArticleDOI
01 Jun 1993-Cortex
TL;DR: The results showed that the right hemisphere was more accurate and faster than the left in recognizing the stimulus faces, and that positive emotions were overall more easily recognized.

Journal ArticleDOI
TL;DR: The authors examined changes in the perception of facial emotion across the adult life span and found no overall changes in accuracy of perception as a function of age, and the ages of posers in the photographs did not influence the perceptual accuracy scores.
Abstract: This study was designed to examine changes in the perception of facial emotion across the adult life span. Subjects were 30 young (ages 21 to 39 years), 30 middle‐aged (ages 40 to 59 years), and 30 older (ages 60 to 81 years) normal adult right‐handed females. The three groups of subjects were carefully screened for neurological and psychiatric disorders and for cognitive and visuoperceptual deficits; the groups were closely matched on demographic variables. Subjects were required to identify photographs of facial emotional expressions (Ekman & Friesen, 1976). There were no overall changes in accuracy of perception as a function of age. The ages of posers in the photographs did not influence the perceptual accuracy scores. This study contributes to the literature about the characteristics of normal emotional processing across the adult life span.

Journal ArticleDOI
TL;DR: Overall, children with ADHD were found to be no different from normal children in their ability to process emotional cues, and significant differences were found between ADHD and normal control groups on those tasks requiring complex auditory processing and extensive use of working memory.
Abstract: A possible etiological factor for the social disability described in children with attention‐deficit hyperactivity disorder (ADHD) is a deficit in the ability to accurately evaluate emotional stimuli. Children with ADHD were compared to normal controls on a battery of emotional processing. This battery, the Minnesota Tests of Affective Processing (MNTAP), measures face perception and recognition of affective stimuli as conveyed via facial expression, language, and speech prosody. Overall, children with ADHD were found to be no different from normal children in their ability to process emotional cues. A subgroup of younger children with ADHD were found to have modest difficulties on a test of decoding facial affective stimuli. Significant differences were found between ADHD and normal control groups on those tasks requiring complex auditory processing and extensive use of working memory. Additional analyses found significant effects on affective processing for children with nonverbal impairment. We conclude...

Journal ArticleDOI
TL;DR: It is found that RHD patients showed reduced facial expressivity in comparison to both LHD and NHD subjects during spontaneous conversation, which supports the hypothesis that the right hemisphere mediates facial expressivity during spontaneous social interaction.

Journal ArticleDOI
TL;DR: In this article, the authors explored the validity of 16 facial movements (e.g., eyelid widening, lips part) and two psychophysiological responses as interest-associated behaviors.
Abstract: The present paper explores the validity of 16 facial movements (e.g., eyelid widening, lips part) and two psychophysiological responses (e.g., heart rate) as interest-associated behaviors. In a pilot study we selected interesting and uninteresting stimuli, and in two experiments we asked undergraduate volunteers to watch and listen to a series of 4-min film clips and self-report their level of interest. As each participant viewed the films, we videotaped, coded, and scored his or her facial movements and recorded the autonomic responses. Using repeated-measures ANOVAs and correlational analyses, we found support for five upper facial behaviors (eyes closed, number of eye glances, duration of eye glances, eyelid widening, exposed eyeball surface), one lower facial behavior (lips part), and two general head movements (head turns, head stillness) as interest-associated facial movements. The discussion focuses on how these findings confirm, contradict, and clarify the observations of others (e.g., Darwin, Tomkins, Izard).

Journal ArticleDOI
TL;DR: In this paper, a functionalist perspective on the development of nonverbal communication of emotion is presented, where emphasis is placed on the functional implications of emotion-relevant movements for social regulation (communication), intrapersonal (internal) regulation, and behavior regulation.
Abstract: A functionalist perspective on the development of nonverbal communication of emotion is presented. This perspective is distinguished from other current conceptualizations by the following features: (a) Emphasis is placed on the functional implications of emotion-relevant movements for social regulation (communication), intrapersonal (internal) regulation, and behavior regulation. (b) Emotions are viewed as “members of families of emotions.” Emotion families are composed of emotion processes with similar functional relationships to the environment, which also differ in particular communicative features as a function of contextual demands, socialization history, and developmental abilities of the organism. (c) Facial movements are treated as only one of many forms of communication of emotion, rather than as having special status as “the” clearcut indicators of emotion. (d) Communication of emotion always is embedded in a context: There are no movements that can be considered clearcut, context-free expressions of emotion, at any period of development. (e) The role of socialization in the development of emotion and emotion communication is emphasized. (f) The multiple influences on communicative behavior, and the implications of such multicausality for clearcut communication, are acknowledged.

Journal ArticleDOI
TL;DR: In this article, the authors examined infant facial expressions longitudinally at 2, 4, and 6 months of age during face-to-face play and a still-face interaction with their mothers.
Abstract: Differential emotions theory (DET) proposes that infant facial expressions of emotions are differentiated. To test this hypothesis, we examined infant facial expressions longitudinally at 2, 4, and 6 months of age during face-to-face play and a “still-face” interaction with their mothers. Infant expressions were coded using the Maximally Discriminative Facial Movement Coding System (Max). Consistent with DET, discrete positive expressions occurred more of the time and were of longer duration than blended expressions of positive affect. Contrary to DET, at no age did the proportions or durations of discrete and blended negative expressions differ, and they showed different patterns of developmental change. One is led to either reject or revise DET or else question the adequacy of the Max system.

Book ChapterDOI
01 Jan 1993
TL;DR: This paper found that 3-6-month-olds respond to dynamic faces in face-to-face interactions, but not to changes in adult voice, touch or contingency in both live and televised interactions.
Abstract: Results from studies using the Still-Face procedure showed that 3–6-month-olds respond to dynamic faces in face-to-face interactions, but not to changes in adult voice, touch or contingency in both live and televised interactions. Infant visual attention distinguished between normal and still-face periods, while smiling distinguished people from objects, and upright from inverted faces. Results from other paradigms showed that the adult voice and touch can affect infant responding and infants are sensitive to contingency. A complete description of infant’s perceptual capacities requires the use of multiple response measures and consideration of the experimental demands.