
Showing papers on "Facial expression published in 1986"


Journal ArticleDOI
TL;DR: In this article, facial electromyographic (EMG) activity was used to distinguish both the valence and the intensity of affective reactions to visual stimuli, even though independent judges viewing videotapes of the subjects' facial displays could not tell whether a positive or negative stimulus had been presented, or whether it had been mildly or moderately intense.
Abstract: Physiological measures have traditionally been viewed in social psychology as useful only in assessing general arousal and therefore as incapable of distinguishing between positive and negative affective states. This view is challenged in the present report. Sixteen subjects in a pilot study were exposed briefly to slides and tones that were mildly to moderately evocative of positive and negative affect. Facial electromyographic (EMG) activity differentiated both the valence and intensity of the affective reaction. Moreover, independent judges were unable to determine from viewing videotapes of the subjects' facial displays whether a positive or negative stimulus had been presented or whether a mildly or moderately intense stimulus had been presented. In the full experiment, 28 subjects briefly viewed slides of scenes that were mildly to moderately evocative of positive and negative affect. Again, EMG activity over the brow (corrugator supercilii), eye (orbicularis oculi), and cheek (zygomaticus major) muscle regions differentiated the pleasantness and intensity of individuals' affective reactions to the visual stimuli even though visual inspection of the videotapes again indicated that expressions of emotion were not apparent. These results suggest that gradients of EMG activity over the muscles of facial expression can provide objective and continuous probes of affective processes that are too subtle or fleeting to evoke expressions observable under normal conditions of social interaction.
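To make the kind of analysis described above concrete, here is a minimal sketch of how rectified EMG amplitude over the two muscle regions might be aggregated and compared across pleasant and unpleasant slides. The data, effect sizes, and variable names are hypothetical; this is not the authors' analysis code.

```python
# Illustrative sketch with simulated EMG data; not the authors' analysis.
import numpy as np

rng = np.random.default_rng(0)

# Simulated rectified EMG (arbitrary units): 20 trials x 500 samples per site
# and slide valence. The gamma parameters are invented for illustration only.
corrugator = {"pleasant": rng.gamma(2.0, 1.0, (20, 500)),
              "unpleasant": rng.gamma(3.0, 1.0, (20, 500))}
zygomaticus = {"pleasant": rng.gamma(3.0, 1.0, (20, 500)),
               "unpleasant": rng.gamma(2.0, 1.0, (20, 500))}

def mean_amplitude(trials):
    """Mean rectified amplitude per trial, averaged across trials."""
    return trials.mean(axis=1).mean()

for site, data in (("corrugator", corrugator), ("zygomaticus", zygomaticus)):
    diff = mean_amplitude(data["unpleasant"]) - mean_amplitude(data["pleasant"])
    print(f"{site}: unpleasant minus pleasant = {diff:+.2f}")
# The valence pattern described in the abstract corresponds to higher corrugator
# activity for unpleasant stimuli and higher zygomaticus activity for pleasant ones.
```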

628 citations


Journal ArticleDOI
TL;DR: It is suggested that autistic children have difficulty in recognizing how different expressions of particular emotions are associated with each other, and that this might contribute to their failure to understand the emotional states of other people.
Abstract: Groups of MA-matched autistic, normal and non-autistic retarded children were tested for their ability to choose drawn and photographed facial expressions of emotion to "go with" a person videotaped in gestures, vocalizations and contexts indicative of four emotional states. Although both autistic and control subjects were adept in choosing drawings of non-personal objects to correspond with videotaped cues, the autistic children were markedly impaired in selecting the appropriate faces for the videotaped expressions and contexts. Within the autistic group, the children's performance in this task of emotion recognition was related to MA. It is suggested that autistic children have difficulty in recognizing how different expressions of particular emotions are associated with each other, and that this might contribute to their failure to understand the emotional states of other people.

626 citations


BookDOI
01 Jan 1986
TL;DR: In this volume, the authors present a comprehensive overview of face processing, spanning perceptual, memory, and cognitive processes, socio-cognitive factors, cortical specialisation, prosopagnosia and other brain pathology, facial expressions, and applied and computational approaches to face recognition.
Abstract: Contents:
1. Introduction: Introduction to aspects of face processing: Ten questions in need of answers.
2. Perceptual Processes: Microgenesis of face perception; Recognition memory transfer between spatial-frequency analyzed faces; Reaction time measures of feature saliency in a perceptual integration task; Perception of upside-down faces: An analysis from the viewpoint of cue-saliency.
3. Memory Processes: On the memorability of the human face; Face recognition is not unique: Evidence from individual differences; Lateral reversal and facial recognition memory: Are right-lookers special?; Context effects in recognition memory of faces: Some theoretical problems.
4. Cognitive Processes: Recognising familiar faces; Face recognition: More than a feeling of familiarity?; Getting semantic information from familiar faces; What happens when a face rings a bell?: The automatic processing of famous faces.
5. Socio-Cognitive Factors: Levels of representation and memory for faces; Formation of facial prototypes; Stereotyping and face memory; The influence of race on face recognition; Faces, prototypes, and additive tree representations.
6. Cortical Specialisation: Functional organization of visual neurones processing face identity; Hemispheric asymmetry in face processing in infancy; Models of laterality effects in face perception; Hemispheric asymmetries in face recognition and naming: Effects of prior stimulus exposure; Patterns of cerebral dominance in wholistic and featural stages of facial processing; Hemispheric differences in the evoked potential to face stimuli; Cerebral and behavioural asymmetries in the processing of "unusual" faces: A review.
7. Prosopagnosias: Current issues on prosopagnosia; The cognitive psychophysiology of prosopagnosia; Prosopagnosia: Anatomic and physiologic aspects; Faces and non-faces in prosopagnosic patients; Observations on a case of prosopagnosia.
8. Brain Pathology: Facial processing in the dementias; The matching of famous and unknown faces, given either the internal or the external features: A study on patients with unilateral brain lesions; Face recognition dysfunction and delusional misidentification syndromes (D.M.S.).
9. Facial Expressions: Facial expression processing; The perception of action versus feeling in facial expression; Towards the quantification of facial expressions with the use of a mathematical model of the face; Is the faster processing of expressions of happiness modality-specific?; Primary stages in single-glance face recognition: Expression and identity; Affective and cognitive decisions on faces in normals.
10. Applications and Computer Technology: Dynamics of facial recall; The recall and reconstruction of faces: Implications for theory and practice; An interactive computer system for retrieving faces; Investigating face recognition with an image processing computer; Practical face recognition and verification with WISARD.
11. An Overview: Plenary session: Complementary approaches to common problems in face recognition.
12. References. Addresses of Principal Authors.

473 citations


Journal ArticleDOI
TL;DR: The results were found to be consistent with results from a previous, related study in which the same subjects had chosen drawn or photographed faces to indicate their judgements of the same videotapes of emotional expression.
Abstract: Autistic and matched non-autistic retarded children were selected for their ability to recognize the correspondence between schematic drawings and videotaped scenes involving people. The subjects of both groups were able to choose schematic drawings of gestures for a person's gestures of emotion enacted on videotape. However, the autistic children were significantly impaired in choosing which of the drawings of gestures should ‘go with’ videotaped vocalizations and facial expressions characteristic of four emotional states. The results were found to be consistent with results from a previous, related study in which the same subjects had chosen drawn or photographed faces to indicate their judgements of the same videotapes of emotional expression. It is suggested that these findings reflect an important aspect of autistic children's social disability.

346 citations



Journal ArticleDOI
TL;DR: Findings support the notion that the right cerebral hemisphere is dominant for expressing and perceiving facial emotion in patients with unilateral cerebrovascular pathology.

241 citations


Journal ArticleDOI
TL;DR: This paper showed that 1-year-old infants will reference a familiar adult other than their mother and make use of the affective information obtained, indicating that infants are influenced by the emotional expressions of a much broader group of adults than has previously been recognized.
Abstract: Previous studies have demonstrated that 1-year-old infants look toward their mothers' facial expressions and use the emotional information conveyed. In this study, 46 1-year-olds were confronted with an unusual toy in a context where an experimenter familiar to the infants posed either happy or fearful expressions and where their mothers were present but did not provide facial signals. Most of the infants (83%) referenced the familiarized stranger. Once the adult's facial signals were noted, the infant's instrumental behaviors and expressive responses to the toy were influenced in the direction of the affective valence of the adult's expression. The results indicate that infants may be influenced by the emotional expressions of a much broader group of adults than has previously been recognized. In this study we investigated whether infants would reference an adult other than their mothers and would make use of the affective information obtained. Emotional expressions of others provide important information about environmental events. Recent studies have shown that preverbal infants engage in "social referencing"—they look to others when confronted with a variety of events and use the emotional reactions of others to regulate their own behavior. The infants' mothers provided the emotional signals in these studies, by giving either prototypical facial expressions (Klinnert, 1984; Sorce, Emde, Campos, & Klinnert, 1985) or vocal signals (Svejda & Campos, in press) or by controlling the positive affect in multichanneled responses (Feinman & Lewis, 1983). Although the purpose of these initial studies was to demonstrate that infant behavior is responsive to a social signaling process, the studies also reflect a basic assumption about infants' social referencing: If an infant is responsive to anyone's affective signals, the person most likely to influence the infant is his or her mother. The central role of mothers in infants' social referencing was initially suggested by Campos and Stenberg (1981), who postulated that "mother becomes the target of social referencing" (p. 295). In fact, the term maternal referencing was sometimes used interchangeably with the term social referencing.

201 citations


Journal ArticleDOI
TL;DR: It appears that facial expression processing is not sufficient to account for the disproportionate effect of inversion upon face recognition, and this effect cannot be explained in terms of the extra familiarity of the task or the use of identical photographs at test.

188 citations


Book
01 Jan 1986
TL;DR: In this book, the authors survey the nature and functions of nonverbal communication, including facial expressions, eye behavior, bodily cues, proxemics, touch, personal appearance, and vocal cues, and its role in impression formation and management, deception detection, and applied settings such as interviews, intercultural interaction, medicine, and the courtroom.
Abstract: "Summary" ends each chapter. I.NONVERBAL COMMUNICATION. 1. The Nature of Nonverbal Communication. The Functional Importance of Nonverbal Communication. Definitional Perspective: Nonverbal and Verbal Communication. The Functions of Nonverbal Cues. Communicating Nonverbally in Specific Contexts. 2. Facial Expressions. The Face as the Most Important Source of Emotional Information. The Face as a Means of Identifying Individuals. The Deceptive Face: How to Recognize It and Guard Against It. Measuring Sensitivity to Facial Expressions. The Judgmental Process. Developing Sensitivity to Facial Expressions: Training Program. 3. Eye Behaviors. The Language of the Eyes. The Functions of Eye Behaviors. Using the Communicative Potential of Eye Behaviors. 4. Bodily Communication. The Nature of Bodily Cues. Gestures versus Postures. Major Communicative Functions of Bodily Cues. Functional and Dysfunctional Uses of Bodily Cues. 5. Proxemic Communication. The Proximate Environment. The Communicative Functions of Proxemics. The Effects of Violating Proxemic Norms and Expectations. 6. Tactile Communication. The Nature of Touch. Touching Norms. The Semantics of Touch. The Communicative Functions of Touch. 7. Personal Appearance. Features of Physical Attractiveness. Body Image. Effects of Personal Appearance. The Nature of Artifactual Communication. The Communicative Functions of Personal Appearance. 8. Vocalic Communication. The Semantics of Sound. The Communicative Functions of Vocal Cues. Developing the Ability to Encode and Decode Vocalic Messages. II.DEVELOPING THE SUCCESSFUL COMMUNICATOR. 9. Impression Formation. The Importance of Nonverbal Cues in Interpersonal Perception. Defining Components of Impression Formation. Principles of Impression Formation. 10. Impression Management. The Nature of Impression Management. The Impression Management Process. The Impression Manager in Action. 11. Selling Yourself Nonverbally. Dimensions of Credibility. Illustrating the Impact of Nonverbal Cues on Credibility. Developing Personal Credibility. Monitoring the Communicator's Nonverbal Cues. 12. Detecting Deception. Nonverbal Indicators of Deception. Nonverbal Profile of the Deceptive Communicator-Type I. Nonverbal Profile of the Deceptive Communicator-Type II. The Deception Process. 13. Communicating Consistently. The Nature of Inconsistent Messages. Decoding Inconsistent Messages. Reasons for Inconsistent Messages. Guidelines for Communicating Consistently. III.SUCCESSFUL COMMUNICATION IN APPLIED SETTINGS. 14. Nonverbal Determinants of Successful Interviews. The Job Interview. The Counseling Interview. The Interviewer's Perspective. 15. Female-Male Interaction. Sex-Role Stereotyping. Differences in Nonverbal Communication of Women and Men. 16. Successful Intercultural Communication. Cross-Cultural Similarities in Nonverbal Communication. Cross-Cultural Differences in Nonverbal Communication. Communicating Nonverbally with the Japanese and the Arabs. Guidelines for More Successful Intercultural Communication. 17. Physican/Patient Interaction. Potential Problems in Physician-Patient Communication Interaction. The Functional Importance of Nonverbal Communication in the Medical Setting. Actual Features of Physician-Patient Nonverbal Communication. Desired Features of Physician-Patient Nonverbal Communication. 18. Courtroom Interaction. The Functional Importance of Nonverbal Communication in the Courtroom. The Functions of Nonverbal Communication in the Courtroom. The Logistics of Measuring Impressions. 
Nonverbal Impression Formation and Management in the Courtroom. 19. The Communicative Impact of Micro-environmental Variables. Context as Communication. The Classroom Environment. The Conference Room Environment. The Office Environment. The Fast-Food Restaurant Environment. Appendix. Index.

177 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person.
Abstract: Examined whether spontaneous facial expressions provide observers with sufficient information to distinguish accurately which of 7 affective states (6 emotional and 1 neutral) is being experienced by another person. Six undergraduate senders' facial expressions were covertly videotaped as they watched emotionally loaded slides. After each slide, senders nominated the emotion term that best described their affective reaction and also rated the pleasantness and strength of that reaction. Similar nominations of emotion terms and ratings were later made by 53 undergraduate receivers who viewed the senders' videotaped facial expressions. The central measure of communication accuracy was the match between senders' and receivers' emotion nominations. Overall accuracy was significantly greater than chance, although it was not impressive in absolute terms. Only happy, angry, and disgusted expressions were recognized at above-chance rates, whereas surprised expressions were recognized at rates that were significantly worse than chance. Female Ss were significantly better senders than were male Ss. Although neither sex was found to be better at receiving facial expressions, female Ss were better receivers of female senders' expressions than of male senders' expressions. Female senders' neutral and surprised expressions were more accurately recognized than were those of male senders. The only sex difference found for decoding emotions was a tendency for male Ss to be more accurate at recognizing anger. (25 ref)
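The central accuracy measure described above, the match between sender and receiver emotion nominations compared against chance, can be sketched as follows. All data are simulated, the 30% agreement rate is invented, and the seven category labels follow the 6-plus-neutral design but are otherwise assumptions.

```python
# Illustrative sketch with made-up nominations; not the study's actual data or scoring.
import numpy as np

categories = ["happy", "sad", "angry", "afraid", "surprised", "disgusted", "neutral"]
rng = np.random.default_rng(1)

n_trials = 300  # hypothetical sender-slide presentations judged by receivers
sender = rng.choice(categories, size=n_trials)
# Receivers agree with the sender on roughly 30% of trials in this simulation.
agree = rng.random(n_trials) < 0.30
receiver = np.where(agree, sender, rng.choice(categories, size=n_trials))

accuracy = np.mean(sender == receiver)
chance = 1.0 / len(categories)
# Normal-approximation z test of overall accuracy against the chance rate of 1/7.
se = np.sqrt(chance * (1 - chance) / n_trials)
z = (accuracy - chance) / se
print(f"accuracy = {accuracy:.3f}, chance = {chance:.3f}, z = {z:.2f}")
```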

167 citations



Journal ArticleDOI
TL;DR: The authors found that 2-, 3-, and 4-year-olds, like adults, interpret facial expressions in terms of dimensions of pleasure and arousal, with 4-year-olds and adults also producing a third dimension.
Abstract: Dimensions of the meaning attributed to facial expressions of emotion were studied in preschoolers (nineteen 4-year-olds, twenty-one 3-year-olds, and thirty-eight 2-year-olds) plus thirty adults. Subjects indicated the similarity or dissimilarity between different emotions by placing photographs of emotional facial expressions into preordained numbers of groups. For each age group, multidimensional scaling of the pairwise similarities yielded a two-dimensional structure in which the expressions fell in a roughly similar circular order. Its dimensions could be interpreted as degree of pleasure and degree of arousal. Four-year-olds and adults also produced a third dimension, tentatively interpreted as assertiveness versus taken aback. As adults, we are highly skilled at reading facial expressions of emotion. We can interpret expressions in terms of such basic categories as anger, fear, happiness, surprise, and the like. We can also interpret expressions in terms of such basic bipolar dimensions as pleasure-displeasure and arousal-sleepiness. But in what terms do children at various ages interpret facial expressions? Does the message derived from a particular expression vary with age? Knowing how children of different ages interpret facial expressions should give us an important clue about the development of the skills involved in the interpretation of emotion in general. In this article, we examine one aspect of this issue: whether 2-, 3-, and 4-year-olds interpret facial expressions in terms of dimensions of pleasure and arousal. Degree of pleasure and degree of arousal are clearly the major dimensions (although not the only dimensions) underlying the way in which adults interpret emotions—their own and those of others. Research from several domains supports this conclusion (see Dittman, 1972). One source of evidence is multidimensional scaling studies of the similarity perceived between emotions expressed in the face (Abelson & Sermat, 1962; Royal & Hays,
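A rough sketch of the multidimensional-scaling step is shown below. The similarity matrix (proportion of subjects grouping each pair of expressions together) and the emotion labels are invented placeholders; with real sorting data, the two recovered axes are the ones interpreted as pleasure and arousal.

```python
# Sketch of the MDS step on a made-up co-grouping matrix, not the children's actual data.
import numpy as np
from sklearn.manifold import MDS

emotions = ["happy", "surprised", "afraid", "angry", "sad", "calm"]

# Hypothetical proportion of subjects who sorted each pair into the same group.
similarity = np.array([
    [1.00, 0.70, 0.30, 0.10, 0.15, 0.40],
    [0.70, 1.00, 0.45, 0.20, 0.15, 0.25],
    [0.30, 0.45, 1.00, 0.50, 0.40, 0.10],
    [0.10, 0.20, 0.50, 1.00, 0.55, 0.15],
    [0.15, 0.15, 0.40, 0.55, 1.00, 0.35],
    [0.40, 0.25, 0.10, 0.15, 0.35, 1.00],
])
dissimilarity = 1.0 - similarity

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
for label, (x, y) in zip(emotions, coords):
    print(f"{label:>10}: ({x:+.2f}, {y:+.2f})")
# With real data, the two axes are interpreted post hoc as pleasure-displeasure
# and arousal-sleepiness; the circular ordering appears in the 2-D plot.
```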

Journal ArticleDOI
TL;DR: This article reviews the research literature on the perception of emotion in facial expressions, arguing that seemingly disparate conclusions can be reconciled when natural language concepts of emotion are thought of as overlapping and fuzzy rather than as mutually exclusive and precisely defined, and reports four studies testing predictions from this thesis.
Abstract: Research on how well one person can recognize the emotion expressed in another person's face has resulted in controversy: accuracy versus inaccuracy; discrete categories versus dimensions versus structural models of emotion. These seemingly disparate conclusions can be reconciled when natural language concepts of emotion (“happiness,” “anger,” “fear,” “sadness,” etc.) are thought of as overlapping and fuzzy, rather than as mutually exclusive and properly defined. The research literature on the perception of emotion in facial expressions is reviewed from this vantage point, and four studies testing predictions from this thesis are reported. When rating the degree to which either posed or spontaneous facial expressions exemplify emotion categories, subjects produced reliably graded responses and indicated that individual (even prototypical) expressions belong to more than one category. The graded “prototypicality” ratings (1) predicted the probability with which the expression was said to be a member of the...

Journal ArticleDOI
TL;DR: The authors found that viewers described the leader's expressive behavior in ways consistent with the objective criteria used to select the stimuli, and that viewers' descriptions of these displays were not influenced by their prior attitudes toward the leader, even though he is well known and controversial.

Journal ArticleDOI
TL;DR: In this article, respondents to a telephone survey reported the most recent situation that evoked strong emotional feelings in them and described the pattern of their reactions; the majority of the situations reported had evoked negative emotions.
Abstract: As part of a telephone survey, respondents were asked to report the most recent situation that evoked strong emotional feelings in them and to describe the pattern of their reactions. The majority of the situations reported had evoked negative emotions. Most of the emotion-antecedent events are connected to relationships with family and friends or to work-related situations. Only happiness and anger are reported as relatively pure feeling states; most others are emotion blends, with anger/sadness and sadness/fear occurring most frequently. Facial expression changes as well as heart and muscle symptoms are reported as the most frequent reactions across all emotions, whereas other nonverbal and physiological reactions are more specific for particular emotions. By the use of factor analysis, response patterns across various components of emotional state, including affect control, are explored.
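As a sketch of the factor-analytic step mentioned at the end of the abstract, the following generates hypothetical component scores (facial change, vocal change, heart symptoms, muscle symptoms, affect control) and fits a two-factor model. The variables, loadings, and factor count are assumptions for illustration, not the survey's actual measures.

```python
# Sketch of a factor analysis over emotion-report components; all data are simulated.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n_respondents = 200

# Two underlying factors generate five observed component scores per respondent.
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([[0.8, 0.1],    # facial expression change
                     [0.7, 0.2],    # vocal change
                     [0.1, 0.9],    # heart symptoms
                     [0.2, 0.8],    # muscle symptoms
                     [0.5, 0.4]])   # attempted affect control
X = latent @ loadings.T + rng.normal(scale=0.3, size=(n_respondents, 5))

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
# Estimated loadings of each factor on the five observed components.
print(np.round(fa.components_, 2))
```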


Journal ArticleDOI
TL;DR: Data indicate that nonverbal expression yields information about the response to noxious stimulation that is non-redundant with self-report, and there was a substantial direct relation between observer judgments of distress and discrete, pain-related facial actions.
Abstract: We provided a microanalytic description of facial reactions to a series of painful and nonpainful electric shocks and examined the impact of these as discrete facial cues for observer judgments of acute pain. Thirty female volunteers were videotaped and reported their discomfort in response to electric shocks after earlier exposure to one of three social influence conditions: a tolerant model, an intolerant model, or neutral peer presence. We coded the videotapes for facial activity using the Facial Action Coding System (Ekman & Friesen, 1978b), and peer judges rated them for painful discomfort. Subjects exposed to a tolerant model reported no more discomfort than did subjects exposed to an intolerant model, despite receiving more intense levels of shock, but were judged by observers to be in more pain. Analyses of facial activity yielded consistent findings: Tolerant-model subjects, though reporting discomfort equivalent to that reported in other groups, displayed more pain-related facial activity (brow lowering, narrowing of the eye aperture from below, raising the upper lip, and blinking). There was a substantial direct relation between observer judgments of distress and discrete, pain-related facial actions (mean multiple R = .74 for the various shock levels rated). These data indicate that nonverbal expression yields information about the response to noxious stimulation that is non-redundant with self-report.
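The reported multiple correlation (mean multiple R = .74) relates observer distress judgments to coded facial actions; a minimal version of that computation, on simulated data, might look like the sketch below. The four action labels follow the abstract, but the counts, weights, and sample values are invented.

```python
# Sketch of a multiple-R computation on simulated facial-action data; not the authors' code.
import numpy as np

rng = np.random.default_rng(3)
n_subjects = 30

# Hypothetical per-subject frequencies of four pain-related facial actions:
# brow lowering, orbit tightening, upper-lip raising, blinking.
actions = rng.poisson(lam=[3.0, 2.0, 1.5, 4.0], size=(n_subjects, 4)).astype(float)
# Hypothetical mean observer rating of painful discomfort per subject.
judgments = actions @ np.array([0.6, 0.5, 0.4, 0.2]) + rng.normal(scale=1.0, size=n_subjects)

# Ordinary least squares with an intercept; multiple R is the square root of R^2.
X = np.column_stack([np.ones(n_subjects), actions])
beta, *_ = np.linalg.lstsq(X, judgments, rcond=None)
predicted = X @ beta
ss_res = np.sum((judgments - predicted) ** 2)
ss_tot = np.sum((judgments - judgments.mean()) ** 2)
multiple_r = np.sqrt(1.0 - ss_res / ss_tot)
print(f"multiple R = {multiple_r:.2f}")
```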

Journal ArticleDOI
Ulf Dimberg
TL;DR: The present data are consistent with the theory that the face constitutes an emotional 'readout/output-system' and that subjects exposed to different facial expressions react spontaneously with different facial EMG response patterns.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the association between newscasters' facial expressions and the voting behavior of viewers and found that voters who regularly watched the newscaster who exhibited the biased facial expressions were significantly more likely to vote for the candidate that newscast had smiled upon.
Abstract: Two studies were conducted to examine the association between newscasters' facial expressions and the voting behavior of viewers. Study 1 examined the facial expressions exhibited by network newscasters while referring to the 1984 presidential candidates prior to the election. Results indicated that one of the three newscasters exhibited significantly more positive facial expressions when referring to Reagan than when referring to Mondale. Study 2 consisted of a telephone survey conducted to determine whether voting behavior was associated with the nightly news program watched. It was found that voters who regularly watched the newscaster who exhibited the biased facial expressions were significantly more likely to vote for the candidate that newscaster had smiled upon. Discussion considered possible explanations for, and implications of, this association between biases in newscasters' facial expressions and viewers' voting behavior.
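Study 2's association between the news program watched and reported vote is the kind of relationship that can be examined with a contingency-table analysis; the sketch below runs a chi-square test on invented counts. Both the test choice and the numbers are assumptions, not the authors' reported analysis.

```python
# Sketch of a chi-square test of independence on a hypothetical viewer/vote table.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: which network newscaster the respondent regularly watched.
# Columns: reported vote (Reagan, Mondale). Counts are hypothetical.
table = np.array([
    [60, 40],   # newscaster with the more positive expressions toward Reagan
    [45, 55],
    [47, 53],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```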

Journal ArticleDOI
TL;DR: In this study, children between 4 and 7 years of age heard stories describing social dominance interactions and chose photographs of adults who looked like the dominant characters described in the stories; the results confirmed predictions and indicated that human nonverbal dominance signaling may be patterned after that of other species.
Abstract: KEATING, CAROLINE F., and BAI, DINA L. Children's Attributions of Social Dominance from Facial Cues. CHILD DEVELOPMENT, 1986, 57, 1269-1276. Ethological reports of animal dominance signals suggested that certain human brow and mouth gestures would influence the attributions of social dominance made by children. Stimulus photographs depicting adults with lowered brow expressions or without smiles were hypothesized to appear dominant relative to photographs showing adults with raised-brow expressions or with smiles, respectively. In addition, the cross-species record suggested that faces with physiognomic characteristics indicative of physical maturity would also look dominant. In tests of these hypotheses, children between 4 and 7 years of age heard stories describing social dominance interactions and chose photographs of adults who looked like the dominant characters described in the stories. The results confirmed predictions and indicated that human nonverbal dominance signaling may be patterned after that of other species.


Journal ArticleDOI
Francis T. McAndrew
TL;DR: The authors found that females were better than males at identifying surprise and fear, especially at the longer exposure times, and that the ability to identify anger was strongly affected by both the sex and cultural background of the subject.
Abstract: Forty American (20 males, 20 females) and 31 Malaysian (20 males, 11 females) college students responded to 60 tachistoscopic presentations of photographs of facial expressions by judging the gender and the emotional expression of each face. The duration of exposure times ranged from 3 msec. to 800 msec. Stable recognition thresholds for most emotional expressions were established by 12 or 25 msec., with fear requiring 300 msec. to be recognized by each group of subjects. Happiness and sadness were the most accurately identified emotions, and anger and fear were the most difficult for subjects to recognize. Females were better than males at identifying surprise and fear, especially at the longer exposure times, and the ability to identify anger was strongly affected by both the sex and cultural background of the subject. Although there were several instances in which Malaysian and American subjects differed, overall accuracy of recognition and perceptual thresholds were not strongly related to differences...
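One way to operationalize the recognition thresholds described above is to find the shortest exposure at which accuracy reaches a criterion and stays above it for all longer exposures; the sketch below does this for one hypothetical emotion. The accuracy values and the 60% criterion are invented, and the rule is an assumption rather than the authors' stated procedure.

```python
# Illustrative threshold estimation; accuracies and criterion are hypothetical.
accuracy_by_duration_ms = {3: 0.20, 6: 0.35, 12: 0.62, 25: 0.78,
                           50: 0.81, 100: 0.83, 300: 0.84, 800: 0.85}

def recognition_threshold(acc_by_duration, criterion=0.60):
    """Shortest exposure (ms) at which accuracy first reaches the criterion
    and remains at or above it for all longer exposures."""
    durations = sorted(acc_by_duration)
    for i, d in enumerate(durations):
        if all(acc_by_duration[later] >= criterion for later in durations[i:]):
            return d
    return None

print(recognition_threshold(accuracy_by_duration_ms))  # -> 12 for this hypothetical emotion
```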

Journal ArticleDOI
01 Jan 1986
TL;DR: From longitudinal observations on depressed patients it emerged that individual-specific associations have to be taken into account for the relationship between expressive behavior and mood changes, and recovered schizophrenic patients exhibited a dissociation of these elements.
Abstract: Nonverbal behavior, especially facial expression, appears as one of the most important means for communicating affective states. Studies on groups of psychiatric patients and control subjects are reported in which nonverbal behavior is analyzed from videotaped dialogues. Using a quantitative approach, results on facial behavior, speech, and gaze are described, which shed light on the expressive and communicative functions of nonverbal behavior. From longitudinal observations on depressed patients it emerged that individual-specific associations have to be taken into account for the relationship between expressive behavior and mood changes. The predominance of facial behavior in the speaker role of an individual found in patients and control groups points to the integrated communicative function of the verbal and nonverbal elements. However, recovered schizophrenic patients exhibited a dissociation of these elements. Implications for our understanding of nonverbal communications are discussed.

Book ChapterDOI
01 Jan 1986
TL;DR: It is suggested that the impairments of facial expression comprehension observed after brain lesions could be related to other perceptual disorders in the visual modality or to amodal disorders of emotional processing (Feyereisen, in press).
Abstract: Like the comprehension of verbal, written, or pictorial material, the processing of facial expressions of emotions involves several cognitive operations (Frijda, 1969). This behavior may be disrupted by brain lesions in association with various disorders and, accordingly, diverse interpretations have been given to the impairments of facial expression comprehension (Feyereisen, in press): for example, the deficit could be related to other perceptual disorders in the visual modality or to amodal disorders of emotional processing.


Journal ArticleDOI
TL;DR: Emotional facial expression was examined in light of its presumed right cerebral hemisphere mediation and evidence for right hemisphere disorganization in major depression.

Journal ArticleDOI
TL;DR: Cunningham and Odom (1978) studied the role of perceptual salience in the development of analysis and synthesis processes; Ekman, Friesen, and Ellsworth (1972) examined emotion in the human face.
Abstract: Abstraction of invariant face expressions in infancy. Child Development, 53, 1008-1015. Cunningham, J. G., & Odom, R. D. (1978). The role of perceptual salience in the development of analysis and synthesis processes. Child Development, 49, 815-823. Ekman, P., Friesen, W. V., & Ellsworth, P. (1972). Emotion in the human face. New York: Pergamon.

Journal ArticleDOI
TL;DR: In this article, subjects engaged in happy, sad, angry, and neutral imagery, and voluntarily posed happy and sad facial expressions while facial muscle activity (brow, cheek, and mouth regions) and autonomic activity (skin resistance and heart period) were recorded.
Abstract: Much research on emotional facial expression employs posed expressions and expressive subjects. To test the generalizability of this research to more spontaneous expressions of both expressive and nonexpressive posers, subjects engaged in happy, sad, angry, and neutral imagery, and voluntarily posed happy, sad, and angry facial expressions while facial muscle activity (brow, cheek, and mouth regions) and autonomic activity (skin resistance and heart period) were recorded. Subjects were classified as expressive or nonexpressive on the basis of the intensity of their posed expressions. The posed and imagery-induced expressions were similar, but not identical. Brow activity present in the imagery-induced sad expressions was weak or absent in the posed ones. Both nonexpressive and expressive subjects demonstrated similar heart rate acceleration during emotional imagery and demonstrated similar posed and imagery-induced happy expressions, but nonexpressive subjects showed little facial activity during both their posed and imagery-induced sad and angry expressions. The implications of these findings are discussed.

DOI
01 Jul 1986
TL;DR: If the subjects were stimulated with an emotogenic stimulus while directly performing the behavioral patterns of another emotion, they reported having the feeling corresponding to the mimicked emotion, and not the emotion belonging to the emotogenic stimulus.
Abstract: This paper is devoted to the study of the relationship between the subjective component (feelings) and the behavioral aspect of emotions. The following emotions were studied: fear-anxiety, anger-aggression, joy-laughter, love-eroticism, love-tenderness, and sadness-tears. The observations were performed with three different groups of people: patients with anxiety neurosis, students under hypnosis, and drama students. Each emotion was characterized by a specific set of reactions in the respiratory pattern, heart activity, muscular activity, and facial expression. The feelings were correlated with the behavioral patterns, and each time the behavioral patterns were interfered with, a concomitant modification of the subjective component was observed. The direct performance of the behavioral emotional patterns in the absence of the emotogenic stimulus produced the feeling corresponding to the mimicked emotion. If the subjects were stimulated with an emotogenic stimulus during the direct performance of the behavioral patterns of another emotion, they reported having the feeling corresponding to the mimicked emotion, and not the emotion belonging to the emotogenic stimulus. The role played by the feedback from the effector organs in the determination of the subjective emotional states is discussed.