
Showing papers on "Facial expression published in 1992"


Journal ArticleDOI
TL;DR: The capability of the human visual system with respect to face identification, analysis of facial expressions, and classification based on physical features of the face are discussed.

1,008 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used electrophysiological procedures to make inferences about patterns of regional cortical activation and found that individual differences in baseline measures of frontal asymmetry are associated with dispositional mood, affective reactivity, temperament, and immune function.
Abstract: Research on cerebral asymmetry and the experience and expression of emotion is reviewed. The studies described use electrophysiological procedures to make inferences about patterns of regional cortical activation. Such procedures have sufficient temporal resolution to be used in the study of brief emotional experiences denoted by spontaneous facial expressions. In adults and infants, the experimental arousal of positive, approach-related emotions is associated with selective activation of the left frontal region, while arousal of negative, withdrawal-related emotions is associated with selective activation of the right frontal region. Individual differences in baseline measures of frontal asymmetry are associated with dispositional mood, affective reactivity, temperament, and immune function. These studies suggest that neural systems mediating approach- and withdrawal-related emotion and action are, in part, represented in the left and right frontal regions, respectively, and that individual differences i...

732 citations


Journal ArticleDOI
TL;DR: The evidence on universality in facial expression of emotion, renewed controversy about that evidence, and new findings on cultural differences are reviewed in this paper, where the capability for voluntarily made facial expressions to generate changes in both autonomic and central nervous system activity are discussed, and possible mechanisms by which this could occur are outlined.
Abstract: The evidence on universals in facial expression of emotion, renewed controversy about that evidence, and new findings on cultural differences are reviewed. New findings on the capability for voluntarily made facial expressions to generate changes in both autonomic and central nervous system activity are discussed, and possible mechanisms by which this could occur are outlined. Finally, new work which has identified how to distinguish the smile of enjoyment from other types of smiling is described.

662 citations


Journal ArticleDOI
TL;DR: People universally recognize facial expressions of happiness, sadness, fear, anger, disgust, and perhaps, surprise, suggesting a perceptual mechanism tuned to the facial configuration displaying each emotion.

538 citations


Journal ArticleDOI
01 Dec 1992-Pain
TL;DR: The findings suggest that the 4 actions identified carry the bulk of facial information about pain and provide evidence for the existence of a universal facial expression of pain.
Abstract: A number of facial actions have been found to be associated with pain. However, the consistency with which these actions occur during pain of different types has not been examined. This paper focuses on the consistency of facial expressions during pain induced by several modalities of nociceptive stimulation. Forty-one subjects were exposed to pain induced by electric shock, cold, pressure and ischemia. Facial actions during painful and pain-free periods were measured with the Facial Action Coding System. Four actions showed evidence of a consistent association with pain, increasing in likelihood, intensity or duration across all modalities: brow lowering, tightening and closing of the eyelids and nose wrinkling/upper lip raising. Factor analyses suggested that the facial actions reflected a general factor with a reasonably consistent pattern across modalities which could be combined into a sensitive single measure of pain expression. The findings suggest that the 4 actions identified carry the bulk of facial information about pain. They also provide evidence for the existence of a universal facial expression of pain. Implications of the findings for the measurement of pain expression are discussed.

532 citations
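The factor-analytic result above — four facial actions combining into a single sensitive measure of pain expression — can be sketched as a simple composite score. This is a hedged illustration only: the equal-weight averaging of z-scored intensities is an assumption, not the paper's factor-analytic solution, and the action names and sample ratings below are invented for the example.

```python
import statistics

# Hedged sketch: combine the four pain-related facial actions identified
# above (brow lowering, lid tightening, lid closing, nose wrinkling /
# upper-lip raising) into one composite pain-expression score.
# Equal weighting of z-scored intensities is an illustrative assumption;
# the intensity ratings below are invented sample data.

ACTIONS = ["brow_lower", "lid_tighten", "lid_close", "nose_wrinkle"]

# Hypothetical intensity ratings (0-5) for each action across four trials.
trials = [
    {"brow_lower": 3, "lid_tighten": 2, "lid_close": 1, "nose_wrinkle": 2},
    {"brow_lower": 0, "lid_tighten": 0, "lid_close": 0, "nose_wrinkle": 1},
    {"brow_lower": 4, "lid_tighten": 3, "lid_close": 2, "nose_wrinkle": 3},
    {"brow_lower": 1, "lid_tighten": 1, "lid_close": 0, "nose_wrinkle": 0},
]

def composite_pain_scores(trials):
    """Z-score each action's intensity across trials, then average
    the four z-scores within each trial into one composite."""
    zcols = {}
    for a in ACTIONS:
        col = [t[a] for t in trials]
        mu, sd = statistics.mean(col), statistics.pstdev(col)
        zcols[a] = [(v - mu) / sd if sd else 0.0 for v in col]
    return [statistics.mean(zcols[a][i] for a in ACTIONS)
            for i in range(len(trials))]

scores = composite_pain_scores(trials)
print(scores)  # the high-intensity trial (index 2) gets the highest score
```

Because each action is standardized before averaging, no single action dominates the composite, which mirrors the idea of a general pain-expression factor with roughly equal loadings.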


Journal ArticleDOI
TL;DR: The results suggest that depression is associated with an impaired ability to recognize facial displays of emotion and that negative affect was correlated with poorer performance for patients.
Abstract: The facial discrimination tasks described in part I (Erwin et al., 1992) were administered to a sample of 14 patients with depression and 14 normal controls matched for sex (12 women, 2 men) and balanced for age and sociodemographic characteristics. Patients performed more poorly on measures of sensitivity for happy discrimination and specificity for sad discrimination, and had a higher negative bias across tasks. Severity of negative affect was correlated with poorer performance for patients. The results suggest that depression is associated with an impaired ability to recognize facial displays of emotion.

515 citations


Journal ArticleDOI
TL;DR: New work on the nature of smiling shows that it is possible to distinguish the smile when enjoyment is occurring from other types of smiling, and implications for the differences between voluntary and involuntary expression are considered.
Abstract: Evidence on universals in facial expression of emotion and renewed controversy about how to interpret that evidence is discussed. New findings on the capability of voluntary facial action to generate changes in both autonomic and central nervous system activity are presented, as well as a discussion of the possible mechanisms relevant to this phenomenon. Finally, new work on the nature of smiling is reviewed which shows that it is possible to distinguish the smile when enjoyment is occurring from other types of smiling. Implications for the differences between voluntary and involuntary expression are considered.

369 citations


Journal ArticleDOI
TL;DR: The autistic children looked at the adults less and were more engaged in toy play than the other children when an adult pretended to be hurt; they were also less attentive to adults showing fear, although their behavior did not otherwise differ from that of the normal children.
Abstract: Attention, facial affect, and behavioral responses to adults showing distress, fear, and discomfort were compared for autistic, mentally retarded, and normal children. The normal and mentally retarded children were very attentive to adults in all 3 situations. In contrast, many of the autistic children appeared to ignore or not notice the adults showing these negative affects. As a group, the autistic children looked at the adults less and were much more engaged in toy play than the other children during periods when an adult pretended to be hurt. The autistic children were also less attentive to adults showing fear, although their behavior was not different from that of the normal children. Few of the children in any group showed much facial affect in response to these situations. The results are discussed in terms of the importance of affect in the social learning experiences of the young child.

363 citations



Journal ArticleDOI
TL;DR: It was found that schizophrenics and depressives are characterized by different quantitative, qualitative, and temporal patterns of affect-related dysfunctions; these differences in patterns of affect-related behavioral deficits may reflect dysfunctions in different underlying psychobiological systems.
Abstract: Twenty-three acute schizophrenics, 21 acute major depressives (Research Diagnostic Criteria), and 15 normal controls participated in a study on facial expression and emotional face recognition. Under clinical conditions, spontaneous facial expression was assessed according to the affective flattening section of the Scale for the Assessment of Negative Symptoms. Under experimental laboratory conditions involuntary (emotion-eliciting interview) and voluntary facial expression (imitation and simulation of six basic emotions) were recorded on videotape, from which a rater-based analysis of intensity or correctness of facial activity was obtained. Emotional face recognition was also assessed under experimental conditions using the same stimulus material. All subjects were assessed twice (within 4 weeks), controlling for change of the psychopathological status in the patient groups. In schizophrenics, neuroleptic drug influence was controlled by random allocation to treatment with either haloperidol or perazine. The main findings were that schizophrenics and depressives are characterized by different quantitative, qualitative, and temporal patterns of affect-related dysfunctions. In particular, schizophrenics demonstrated a trait-like deficit in affect recognition and in their spontaneous and voluntary facial activity, irrespective of medication, drug type and dosage, or extrapyramidal side-effects. In depressives a stable deficit could be demonstrated only in their involuntary expression under emotion-eliciting interview conditions, whereas in the postacute phase a reduction in their voluntary expression became apparent. Differences in patterns of affect-related behavioral deficits may reflect dysfunctions in different underlying psychobiological systems.

287 citations


Journal ArticleDOI
TL;DR: Men and women differed in performance depending on the sex of the facial stimulus; the tasks were developed to address the need for activation tasks that engage emotional processing and can be used during physiologic neuroimaging ("neurobehavioral probes").
Abstract: Facial discrimination tasks (age, happy-neutral, and sad-neutral) were developed to address the need for activation tasks that engage emotional processing and can be used during physiologic neuroimaging ("neurobehavioral probes"). The stimuli pictured professional actors and actresses who had been screened for asymmetric features. In experiment I, same-sex stimuli were used to examine the performance of normal subjects (24 men, 15 women) on the three tasks. Performance was better during the emotion-discrimination tasks than during the age-discrimination task, and males had higher sensitivity scores for the detection of sad emotion. However, experiment II showed that the sex of the stimulus interacts with the sex of the subject. Compared with female subjects, male subjects (n = 10) were selectively less sensitive to sad emotion in female faces. Female subjects (n = 10) were more sensitive overall to emotional expression in male faces than in female faces. Thus, men and women differed in performance depending on the sex of the facial stimulus.

Journal ArticleDOI
TL;DR: Depressed patients were significantly impaired in the recognition of affect in facial, but not verbal, expressions; the relevance of the observed perceptual deficit in depressed patients to the pathophysiology and symptomatology of depression is discussed.

Journal ArticleDOI
TL;DR: This article examined how cultures differ in the degree to which they perceive the universal emotions accurately, and found substantial and consistent differences according to the culture of the judges and the emotion portrayed, in terms of stable cultural dimensions that may influence the perception of emotion.
Abstract: Although the universal recognition of facial expressions of emotion is well documented, few studies have examined how cultures differ in the degree to which they perceive the universal emotions accurately. In this study, American and Japanese judges viewed expressions of six universal emotions posed by both Caucasian and Japanese males and females. In addition, all photos met external criteria for validly and reliably portraying the emotions. Subjects judged which emotions were portrayed and how intensely they were expressed. Results indicated substantial and consistent differences according to the culture of the judges and the emotion portrayed. These findings are discussed in terms of stable cultural dimensions that may influence the perception of emotion.

Journal ArticleDOI
TL;DR: It is argued that infants deploy imitation to enrich their understanding of persons and actions and that early imitation is used for communicative purposes; a theoretical bridge is also formed between early imitation and the "object concept."
Abstract: Facial imitation was investigated in infants 6 weeks and 2 to 3 months of age. Three findings emerged: (a) early imitation did not vary as a function of familiarity with the model—infants imitated a stranger as well as their own mothers; (b) infants imitated both static facial postures and dynamic facial gestures; and (c) there was no disappearance of facial imitation in the 2- to 3-month age range, contrary to previous reports. Two broad theoretical points are developed. First, a proposal is made about the social and psychological functions that early imitation serves in infants' encounters with people. It is argued that infants deploy imitation to enrich their understanding of persons and actions and that early imitation is used for communicative purposes. Second, a theoretical bridge is formed between early imitation and the “object concept.” The bridge is formed by considering the fundamental role that identity plays in infants' understanding of people and things. One of the psychological functions that early imitation subserves is to identify people. Infants use the nonverbal behavior of people as an identifier of who they are and use imitation as a means of verifying this identity. Data and theory are adduced in favor of viewing early imitation as an act of social cognition.

Journal ArticleDOI
TL;DR: In this paper, a set of 3 experiments using models of simple geometric patterns revealed that acute angles with downward pointing vertices conveyed the meaning of threat and that roundedness conveyed the sense of warmth.
Abstract: Two studies examined the hypothesis that geometric patterns in the facial expressions of anger and happiness provide information that permits observers to recognize the meaning of threat and warmth. A 1st study sought to isolate the configural properties by examining whether large-scale body movements encode affect-related meanings in similar ways. Results indicated that diagonal and angular body patterns convey threat, whereas round body patterns convey warmth. In a 2nd study, a set of 3 experiments using models of simple geometric patterns revealed that acute angles with downward pointing vertices conveyed the meaning of threat and that roundedness conveyed the meaning of warmth. Human facial features exhibit these same geometric properties in displays of anger and happiness.

Journal ArticleDOI
TL;DR: Evidence is provided, using a new and unobtrusive manipulation, that facial feedback operates for unpleasant affect to a degree similar to that previously found for pleasant affect.
Abstract: We examined the hypothesis that muscle contractions in the face influence subjective emotional experience. Previously, researchers have been critical of experiments designed to test this facial feedback hypothesis, particularly in terms of methodological problems that may lead to demand characteristics. In an effort to surmount these methodological problems, Strack, Martin, and Stepper (1988) developed an experimental procedure whereby subjects were induced to contract facial muscles involved in the production of an emotional pattern, without being asked to actually simulate an emotion. Specifically, subjects were required to hold a pen in their teeth, which unobtrusively creates a contraction of the zygomaticus major muscles, the muscles involved in the production of a human smile. This manipulation minimises the likelihood that subjects are able to interpret their zygomaticus contractions as representing a particular emotion, thereby preventing subjects from determining the purpose of the experiment ...

Journal ArticleDOI
TL;DR: Surface electromyographic recordings in humans were first made less than 70 years ago, and the electromyographic study of covert facial actions during affect and emotion has less than a 20-year history, as discussed by the authors.
Abstract: Surface electromyographic recordings in humans were first made less than 70 years ago, and the electromyographic study of covert facial actions during affect and emotion has less than a 20-year history. Despite the relative youth of facial electromyography, its use in combination with autonomic measures and comprehensive overt facial action coding systems has provided a sensitive and effective armamentarium for investigating emotion and affect-laden information processing. Research over the past decade has demonstrated that facial electromyographic activity varies as a function of the intensity, valence, and sociality of emotional stimuli and shows that facial electromyographic activity is slightly different in deliberately manipulated and spontaneous expressions of emotion. The multiply determined nature of facial actions and expressions, however, has limited the inferences that can be made about the psychological significance of facial electromyographic responses. These limitations have begun to recede ...

Journal ArticleDOI
TL;DR: Aspects of face processing in psychiatric patients were investigated in relation to Bruce & Young's (1986) model; schizophrenic patients performed at a significantly lower level than non-patient controls on all three tasks, supporting the generalized deficit hypothesis.
Abstract: Functional models of face processing have indicated that dissociations exist between the various processes involved, e.g. between familiar face recognition and matching of unfamiliar faces, and between familiar face recognition and facial expression analysis. These models have been successfully applied to the understanding of the different types of impairment that can exist in neuropsychological patients. In the present study, aspects of face processing in psychiatric patients were investigated in relation to Bruce & Young's (1986) model. Based on this functional model different predictions can be made. We contrast here the impaired expression analysis hypothesis, which is that psychiatric patients would show a deficit in facial expression recognition, but not in facial identity recognition or unfamiliar face matching, with the generalized deficit hypothesis, that patients would be impaired on all tasks. These hypotheses were examined using three forced-choice tasks (facial recognition, facial expression recognition, and unfamiliar face matching) which were presented to schizophrenic and depressed patients, and to non-patient controls. Results showed that schizophrenic patients performed at a significantly lower level than non-patient controls on all three tasks, supporting the generalized deficit hypothesis.

01 Jan 1992
TL;DR: The authors note that more than 80% of males and females name a female target when asked to identify the most emotional person they know, that social pressures tend to facilitate emotionality in mothers as compared with fathers, and that clinical thinking about gender underscores females' apparent greater access to their emotions.
Abstract: Declarations about how women and men differ emotionally abound in the psychological literature. They can be found in the Parsonian normative construction of family roles, with women described as the "expressive" experts and men as the "instrumental" experts (Parsons & Bales, 1955). They are manifest in measures of gender role identification, where emotion items constitute the key components of identification with the feminine and not masculine sex role (Constantinople, 1973). Emotionality also features prominently in the content of gender stereotypes, with at least 75% agreement among subjects (both female and male) that the labels "very emotional" and "very aware of feelings of others" were seen to be more characteristic of females than males (Broverman, Vogel, Broverman, Clarkson, & Rosenkrantz, 1972). A converging result emerges strikingly in Shields's (1987) finding that more than 80% of males and females mention a female target when asked to name the most emotional person they know. Additionally, social pressures tend to facilitate emotionality in mothers as compared with fathers (Shields & Koster, 1989), and clinical thinking about gender underscores females' apparent greater access to their emotions (Chodorow, 1980). Perhaps because the belief in gender differences in emotionality is so pervasive and perennial, it has tended to mask the complexity of defining what it means to be emotional. That is, how should investigators reach the conclusion that women are more emotional than men or ...

Journal ArticleDOI
TL;DR: Both social facilitation and inhibition of expression occurred, depending on the emotional stimulus and the personal relationship involved; strangers had overall inhibitory effects on communication accuracy, whereas friends had facilitative effects on some slides and inhibitory effects on others.
Abstract: Does the presence of others facilitate or inhibit emotional expression? Female "senders" (n = 45) viewed 12 emotionally loaded slides either alone or with another sender while responses were secretly videotaped. In Study 1, 14 "receivers" guessed the type of slide viewed by dyads more accurately (η = .366). In Study 2, 42 receivers viewed 10 senders with friends, 10 with strangers, and 10 alone. One dyad member was covered so that only 1 sender was visible. Analysis revealed significant effects of condition (alone, friend, or stranger; η = .456), slide type (sexual, scenic, unpleasant, or unusual; η = .325), and the Condition X Slide Type interaction (η = .350). Strangers had overall inhibitory effects on communication accuracy, whereas friends had facilitative effects on some slides and inhibitory effects on others. Thus, both social facilitation and inhibition of expression occurred on the basis of the emotional stimulus and personal relationship involved. Ekman and his colleagues have demonstrated that certain primary affects are associated with expressive displays characteristic of the human species (Ekman & Friesen, 1975). It is widely agreed that these displays evolved to serve social functions: to communicate motivational and emotional states relevant to social organization. Also, there is no question that the displays are open to influence from learning and complex cognitive processing. Thus, it is not surprising that the displays are particularly susceptible to social influences. Social functions, therefore, are the sine qua non for the evolution of expressive displays. It is possible, however, to study displays in the relative absence of social influence by the simple expedient of presenting an emotionally eliciting stimulus while the subject is alone and unaware of being observed.
It can be argued that expressive behavior in such minimally social situations is more likely to reflect the particular motivational and emotional state elicited by the emotional stimulus than are expressive behaviors made when others are present (i.e., Buck, 1984, 1988a; Ekman, 1984). At first glance, it seems paradoxical that if displays evolved to serve social functions, they are shown most clearly in solitude. It is perhaps more accurate to say that solitary displays bear a simpler relationship to the emotional elicitor "X" for two reasons. First, when one is alone, there is relatively little need to use display rules to present a proper image to others (using symbolic expressive behaviors) relative to elicitor X. Second, other persons function as eliciting stimuli themselves, which ...

Journal ArticleDOI
TL;DR: The performance of the 8- to 13-year-old children was similar to that of adult patients with frontal lobe injuries, which could be taken as evidence that the regions of the frontal lobe involved in the performance of these tasks may not be mature until about 14 years of age.

Journal ArticleDOI
TL;DR: Results obtained with an infant-controlled habituation-recovery procedure showed that infants both discriminated and recognized these expressions when portrayed by several adult female models, and infants spent more time looking at expressions of anger and surprise than at fear expressions.
Abstract: On the assumption that the ability to discriminate facial expressions has adaptive value to infants during early social exchanges, ethologically based theorists have argued that this ability is innate. Guided by this perspective, we investigated the ability of infants 4 to 6 months old to recognize and discriminate facial expressions of anger, fear, and surprise. Results obtained with an infant-controlled habituation-recovery procedure showed that infants both discriminated and recognized these expressions when portrayed by several adult female models. In addition, infants spent more time looking at expressions of anger and surprise than at fear expressions. These results suggest that infants can abstract configurations of features that give affective meaning to facial expressions. It is suggested that the differences in habituation to each expression might be the result of their distinct functional signification for the infant.

Journal ArticleDOI
TL;DR: The authors tested whether infant facial expressions selected to fit Max formulas (Izard, 1983) for discrete emotions are recognizable signals of those emotions and found that only 3 of the 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes.
Abstract: Three studies tested whether infant facial expressions selected to fit Max formulas (Izard, 1983) for discrete emotions are recognizable signals of those emotions. Forced-choice emotion judgments (Study 1) and emotion ratings (Study 2) by naive Ss fit Max predictions for slides of infant joy, interest, surprise, and distress. But Max fear, anger, sadness, and disgust expressions in infants were judged as distress or as emotion blends in both studies. Ratings of adult facial expressions (Study 2 only) fit a priori classifications. In Study 3, we coded the facial muscle components of faces shown in Studies 1 and 2 with the Facial Action Coding System (FACS; Ekman & Friesen, 1978) and Baby FACS (Oster & Rosenstein, in press). Only 3 of the 19 Max-specified expressions of discrete negative emotions in infants fit adult prototypes.

Journal ArticleDOI
TL;DR: In this paper, the effects of communicative intent and stimulus affectivity on facial electromyogrqphic (EMG) activity were investigated and the results suggest that facial efference can be altered by both affective and communicative processes even when it is too subtle to produce a socially perceptible facial expression.
Abstract: The effects of communicative intent and stimulus affectivity on facial electromyogrqphic (EMG) activity were investigated. Subjects viewed slides of pleasant, neutral, or unpleasant social or nature scenes under no instruction, inhibit-expression instructions, and amplify-expression instructions. Results revealed that facial EMG activity was highest in the amplify and lowest in the inhibit condition; EMG activity over the corrugator supercilii region varied as a function of the affective valence of the stimuli regardless of instructional condition; and facial EMG activity did not differ when subjects were exposed to slides of nature versus social scenes that were matched for rated pleasantness. These results suggest that facial efference can be altered by both affective and communicative processes even when it is too subtle to produce a socially perceptible facial expression.

Journal ArticleDOI
TL;DR: A formulation is proposed that organizes much of the existing data on internalizers and externalizers and yields novel predictions regarding the subpopulation labeled as generalizers.
Abstract: Two important questions bearing on personality processes and individual differences are how do facial expressiveness and sympathetic activation vary as a function of the intensity of an emotional stimulus, and what is the functional mechanism underlying facial expressiveness and sympathetic activation in emotion? A formulation is proposed that is based on 2 propositions: (a) All strong emotions result in some degree of activation of the organism (i.e., principle of stimulus dynamism) and (b) there are individual differences in the gain (amplification) operating on the facial expressive and sympathetic response channels (i.e., principle of individual response uniqueness). This formulation organizes much of the existing data on internalizers and externalizers and yields novel predictions regarding the subpopulation labeled as generalizers.

Journal ArticleDOI
TL;DR: In this article, the performance of children at ages 4, 6, and 8 years was compared on the four types of task most often used in facial expression studies with children, and the relative difficulty of the tasks was also investigated, with the aim of arranging them into a hierarchy of increasing difficulty.
Abstract: The performance of children at ages 4, 6, and 8 years was compared on the four types of task most often used in facial expression studies with children. We examined whether the order of mastery of emotions at different ages was constant across tasks, or alternatively, if it was task-specific. The relative difficulty of the tasks was also investigated, with the aim of arranging them into a hierarchy of increasing difficulty. The four tasks used were situation discrimination, matching discrimination, forced choice labeling, and free labeling. Accuracy was found to increase with age, but the interaction between age, type of task, and emotion was not significant. These results suggest that conclusions about the ordering of specific emotions from least to most difficult at different ages are not task-dependent. Nevertheless, a significant interaction found between task and emotion suggests that such conclusions should specify which type of task generated the pattern. A hierarchy of difficulty for the tasks was only partially supported. Performance on the first three tasks was very similar but performance on free labeling was significantly poorer.

Proceedings ArticleDOI
01 Sep 1992
TL;DR: This paper further investigates a neural network method for recognizing the strength of the six basic facial expressions; for the six basic expressions themselves, the correct recognition ratio was about 90%.
Abstract: Develops an 'active human interface' that realizes interactive communication between machine (computer and/or robot) and human. The authors investigate a method for machine recognition of human facial expressions and their strength, using a neural network. Considering 6 groups of facial expressions, i.e. surprise, fear, disgust, anger, happiness and sadness, they obtain 30 x- and y-coordinates of facial characteristic points representing 3 face components (eyes, eyebrows and mouth). They then generate the facial position information, which is fed to the input units of a neural network; the network is trained with the backpropagation algorithm and a recognition test is carried out. For the six basic facial expressions, the correct recognition ratio was found to be about 90%. The paper further investigates a method of recognizing the strength of the six basic facial expressions with a neural network.
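The pipeline the abstract describes — feature-point coordinates in, six expression classes out, trained by backpropagation — can be sketched as a small one-hidden-layer network. This is a hedged illustration: the paper does not give its topology, data, or hyperparameters, so the layer sizes, synthetic cluster data, and learning rate below are assumptions chosen only to make the sketch runnable.

```python
import numpy as np

# Hedged sketch of the paper's approach: 60 inputs (30 facial points x 2
# coordinates), one hidden layer, 6 outputs (surprise, fear, disgust,
# anger, happiness, sadness), trained by plain backpropagation.
# Hidden size, learning rate, and the synthetic data are assumptions.

rng = np.random.default_rng(0)
N_IN, N_HID, N_OUT = 60, 20, 6

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in for feature-point data: 6 Gaussian clusters,
# one per expression class (the real paper used measured coordinates).
centers = rng.normal(size=(N_OUT, N_IN))
X = np.vstack([c + 0.1 * rng.normal(size=(40, N_IN)) for c in centers])
y = np.repeat(np.arange(N_OUT), 40)
T = np.eye(N_OUT)[y]                      # one-hot targets

W1 = 0.1 * rng.normal(size=(N_IN, N_HID))
W2 = 0.1 * rng.normal(size=(N_HID, N_OUT))
lr = 1.0

for epoch in range(2000):
    H = sigmoid(X @ W1)                   # hidden activations
    O = sigmoid(H @ W2)                   # output activations
    # Backpropagation of the squared-error gradient
    dO = (O - T) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dO / len(X)
    W1 -= lr * X.T @ dH / len(X)

pred = np.argmax(sigmoid(sigmoid(X @ W1) @ W2), axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated synthetic clusters the network classifies the training set essentially perfectly; the paper's ~90% figure comes from its own (unpublished here) facial-point data, not from this sketch.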

Journal ArticleDOI
TL;DR: It is indicated that infants can discriminate happy and angry affective expressions on the basis of motion information, and that the temporal correspondences unifying these affective events may be affect-specific rhythms.
Abstract: 2 studies were conducted to examine the roles of facial motion and temporal correspondences in the intermodal perception of happy and angry expressive events. 7-month-old infants saw 2 video facial expressions and heard a single vocal expression characteristic of one of the facial expressions. Infants saw either a normally lighted face (fully illuminated condition) or a moving dot display of a face (point light condition). In Study 1, one woman expressed the affects vocally, another woman expressed the affects facially, and what they said also differed. Infants in the point light condition showed a reliable preference for the affectively concordant displays, while infants in the fully illuminated condition showed no preference for the affectively concordant display. In a second study, the visual and vocal displays were produced by a single individual on one occasion and were presented to infants 5 sec out of synchrony. Infants in both conditions looked longer at the affectively concordant displays. The results of the 2 studies indicate that infants can discriminate happy and angry affective expressions on the basis of motion information, and that the temporal correspondences unifying these affective events may be affect-specific rhythms.

Journal ArticleDOI
TL;DR: The authors examined the relation between children's abilities to decode the emotional meanings in facial expressions and tones of voice, and their popularity, locus of control or reinforcement orientation, and academic achievement.
Abstract: The present study examined the relation between children's abilities to decode the emotional meanings in facial expressions and tones of voice, and their popularity, locus of control or reinforcement orientation, and academic achievement. Four hundred fifty-six elementary school children were given tests that measured their abilities to decode emotions in facial expressions and tones of voice. Children who were better at decoding nonverbal emotional information in faces and tones of voice were more popular, more likely to be internally controlled, and more likely to have higher academic achievement scores. The results were interpreted as supporting the importance of nonverbal communication in the academic as well as the social realms.

Journal ArticleDOI
TL;DR: The self-generation method employed as an emotion elicitor was shown to reliably induce emotional reactions and is proposed as a useful technique for the elicitation of various emotional states in the laboratory.