Happy Mouth and Sad Eyes: Scanning Emotional Facial Expressions
Hedwig Eisenbarth
University of Regensburg
Georg W. Alpers
University of Mannheim and University of Würzburg
There is evidence that specific regions of the face such as the eyes are particularly relevant for the
decoding of emotional expressions, but it has not been examined whether scan paths of observers vary
for facial expressions with different emotional content. In this study, eye-tracking was used to monitor
scanning behavior of healthy participants while looking at different facial expressions. Locations of
fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the
rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently
directed to either the eyes or the mouth. Especially in sad facial expressions, participants more frequently
issued the initial fixation to the eyes compared with all other expressions. In happy facial expressions,
participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial
expressions, the dominance ratio indicated that both the eyes and mouth are equally important. However,
in sad and angry facial expressions, the eyes received more attention than the mouth. These results
confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that not
all facial expressions with different emotional content are decoded equally. Our data suggest that people
look at regions that are most characteristic for each emotion.
Keywords: emotion, facial expressions, eye-tracking, scan path
Facial expressions contain information relevant for social be-
havior. Thus, the processing of facial information is optimized
(e.g., fast and automatic processing; Dimberg, Thunberg, &
Grunedal, 2002) and is carried out by specialized brain regions
(e.g., fusiform face area; Kanwisher, McDermott, & Chun, 1997).
Although the exact nature of the specificity of facial processing is
still being discussed (Hanson & Halchenko, 2008), preferential
processing of emotional facial expressions has been well docu-
mented (Alpers & Gerdes, 2007; Pessoa, Kastner, & Ungerleider,
2002; Vuilleumier, Armony, Driver, & Dolan, 2001).
Most individuals are very effective in decoding emotional facial
expressions; several studies using categorization tasks have dem-
onstrated that happy faces are categorized nearly perfectly,
whereas fearful facial expressions are classified slightly less accurately (Calvo & Lundqvist, 2008; Calvo & Nummenmaa, 2009).
The error rate increases as the number of emotional expressions
presented in one experiment increases (Calvo & Lundqvist, 2008;
Eisenbarth, Alpers, Segrè, Calogero, & Angrilli, 2008; Kreklewetz
& Roesch, 2005) and is, to some extent, dependent on culture
(Jack, Blais, Scheepers, Schyns, & Caldara, 2009). When misclas-
sifications occur, they are often due to characteristic mix-ups.
Fearful and surprised facial expressions are misclassified in ap-
proximately 10% of the trials and are classified as angry or
disgusted facial expressions, and vice versa. An explanation for these mix-ups could be that some of these emotional expressions carry their primary emotional information in the eye region, whereas others, such as disgusted and fearful expressions, carry the relevant information in regions like the nose or the forehead (see Jack et al., 2009).
The fact that complex information needs to be integrated in order
to correctly classify emotional expressions becomes evident when the
performance of healthy participants is compared with schizophrenic,
autistic, psychopathic, or depressed patients; these patients are signif-
icantly less accurate in decoding. Schizophrenic patients score lower
in decoding sad and angry expressions (Bediou et al., 2005; Streit,
Wolwer, & Gaebel, 1997), autistic patients mainly in sad facial
expressions (Boraston, Blakemore, Chilvers, & Skuse, 2007), psycho-
pathic patients in negative facial expressions (Eisenbarth et al., 2008;
Hastings, Tangney, & Stuewig, 2008; Kosson, Suchy, Mayer, &
Libby, 2002), and depressed patients in happy facial expressions
(Joormann & Gotlib, 2006). This impairment may result from the fact
that these patients process stimuli differently compared to healthy
controls. A crucial part of such a processing difference may begin at
the behavioral level, that is, in different ways of examining faces.
Indeed, whereas healthy individuals mainly examine the eyes and
mouth when looking at facial expressions (Henderson, Williams, &
Falk, 2005; Yarbus, 1967), scan paths reveal that the patient groups
listed above often show a deviant scan path with no preference for a
specific facial area (Dalton et al., 2005; Hernandez et al., 2009;
Loughland, Williams, & Gordon, 2002; Streit et al., 1997).
Similar to these patient groups, certain circumscribed brain
lesions have also been shown to be accompanied by impaired
Hedwig Eisenbarth, Department of Forensic Psychiatry and Psychology,
University of Regensburg, Regensburg, Germany; Georg W. Alpers, Chair
of Clinical and Biological Psychology, School of Social Sciences, Univer-
sity of Mannheim, Mannheim, Germany, and University of Würzburg, Würzburg, Germany.
The first author was funded by a scholarship from the Konrad-
Adenauer-Foundation. We thank Kartin Blumenauer for helpful comments
on the manuscript.
Correspondence concerning this article should be addressed to Dr.
Hedwig Eisenbarth, University of Regensburg, Department of Forensic
Psychiatry and Psychotherapy, Universitätsstrasse 84, D-93053 Regens-
burg, Germany. E-mail: hedwig.eisenbarth@medbo.de
Emotion © 2011 American Psychological Association
2011, Vol. 11, No. 4, 860–865 1528-3542/11/$12.00 DOI: 10.1037/a0022758

decoding accuracy. For example, a patient who suffered circum-
scribed amygdala damage, S.M., evidenced a specific deficit in
decoding fear expressions (Adolphs et al., 2005). This case is
particularly interesting because recent data show that the patient’s
impairment when judging emotions is related to an inability to
make normal use of information obtained through the eye region of
a face. Interestingly, this defect was traced back to a lack of
spontaneous fixation of the eyes during free viewing of faces
(Adolphs et al., 2005).
One method to examine attentional allocation is by monitoring
eye gaze. Eye movements and spatial attention are inextricably
linked (Engbert & Kliegl, 2003; Jonides, 1981; Klein, 2004; Riz-
zolatti, Riggio, Dascola, & Umilta, 1987; Smith, Rorden, & Jack-
son, 2004), and people usually do not engage in an effortful
dissociation when they are free to move their eyes (Findlay &
Gilchrist, 1998; Rayner, 1998). Importantly, even the first saccade
can be guided by global information about a scene background or
setting glimpsed at picture onset (Rayner & Pollatsek, 1992; Rous-
selet, Joubert, & Fabre-Thorpe, 2005). Viewing emotional pictures
clearly results in different scan paths compared with neutral scenes
(Alpers, 2008; Nummenmaa, Hyönä, & Calvo, 2006). A direct link
between scan paths and emotional response has been documented
by the emotionality experienced when individuals view chimeric
facial expressions (Butler et al., 2005) and in patient groups, such
as socially anxious individuals, in which the gaze directed to the
eyes is related to higher physiological responding (Wieser, Pauli,
Alpers, & Mühlberger, 2009). The relevance of facial scan paths to
social interaction has been highlighted by the observation that
stigmatization of people with facial deformities corresponds with
deviant scan paths compared with normal faces (Meyer-Marcotty,
Alpers, Gerdes, & Stellzig-Eisenhauer, 2010; Meyer-Marcotty,
Gerdes, Stellzig-Eisenhauer, & Alpers, in press).
Because most of the studies reporting scan path data refer to
stimuli with neutral facial expressions, the aim of the present study
was to examine scan paths of healthy individuals while they
examined different facial expressions representing basic emotions
and to investigate whether there are emotion-specific scan paths.
Method
Participants
Thirty-six psychology students (20 female, 16 male) were recruited at the University of Würzburg and received course credit for their participation. Their mean age was 22.11 years (SD = 3.81; range: 18 to 33), all were native German speakers, 32 of the participants were right-handed, 3 were left-handed, and 1 was ambidextrous. Fifteen participants had corrected nearsightedness (eight using contact lenses), and 21 had normal vision. The mean score of the Chimeric Faces Test (CFT; Levy, Heller, Banich, & Burton, 1983) was M = −0.21 (SD = 0.40; range: −1.00 to 0.81), which corresponds to the normal left-bias reported in the emotion-detection literature. The mean score of the Social Phobia and Anxiety Inventory (Turner, Beidel, Dancu, & Stanley, 1989) was M = 2.07 (SD = 0.72; range: 0.47 to 3.77), and the mean score of the Trait Anxiety Inventory (Spielberger, Gorsuch, Lushene, Vagg, & Jacobs, 1983) was M = 38.43 (SD = 9.16; range: 22 to 65); both were within the normal range. All participants gave written informed consent.
Material and Apparatus
Stimuli were chosen from the Karolinska Directed Emotional Faces set (Lundqvist, Flykt, & Öhman, 1998), which has been shown to reliably evoke specific emotions (Goeleven, De Raedt, Leyman, & Verschuere, 2008) with relatively naturalistic emotional expressions (Adolph & Alpers, 2010). Eight female and eight male actors, each depicting afraid, angry, happy, neutral, and sad emotional expressions, were included in a rating task (see Figure 1). Trials consisted of a picture presentation for 2,500 ms, followed by a valence rating display (“How positive or negative is this picture for you?”; scale ranging from −4 to +4) and a subsequent arousal rating display (“How emotionally arousing is this picture for you?”; scale ranging from 1 to 9). Ratings were given by key presses on a prepared keyboard.
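The article does not state which stimulus-presentation software was used; the following is a minimal sketch of such a trial written in PsychoPy, purely for illustration. The window settings, stimulus file name, and key-to-rating mapping are assumptions, not details from the study.

```python
# Illustrative sketch of one trial (2,500-ms face, then valence and arousal ratings).
# PsychoPy, the stimulus path, and the key mapping are assumptions for illustration only.
from psychopy import visual, core, event

win = visual.Window(size=(1024, 768), color="grey", units="pix")

def run_trial(image_path):
    # Picture presentation for 2,500 ms
    face = visual.ImageStim(win, image=image_path)
    face.draw()
    win.flip()
    core.wait(2.5)

    # Valence rating: keys 1-9 remapped onto a -4 to +4 scale
    prompt = visual.TextStim(win, text="How positive or negative is this picture for you? (-4 ... +4)")
    prompt.draw()
    win.flip()
    valence = int(event.waitKeys(keyList=[str(k) for k in range(1, 10)])[0]) - 5

    # Arousal rating on a 1-9 scale
    prompt = visual.TextStim(win, text="How emotionally arousing is this picture for you? (1 ... 9)")
    prompt.draw()
    win.flip()
    arousal = int(event.waitKeys(keyList=[str(k) for k in range(1, 10)])[0])
    return valence, arousal

print(run_trial("KDEF/AF01AFS.JPG"))  # hypothetical KDEF file name
win.close()
```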
Eye movements were recorded with a monocular, video-based, high-speed tracking system sampling at 1250 Hz (iView X Hi-Speed, SMI, Berlin, Germany). Participants were seated in front of the computer screen with the chin placed on a chin rest; they viewed the screen through a mirror glass that reflected the eye, and the infrared light point was tracked by a camera placed above (for more details, see Alpers, 2008). Location, time, and duration of all fixations were analyzed. Eight different areas of interest (AOI) were defined: forehead, left eye, right eye, left cheek, nose, right cheek, mouth, chin, and any other parts of the head. Fixations were defined as gaze remaining within a diameter of 25 pixels for at least 100 ms.
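To illustrate this kind of dispersion-based criterion, the sketch below groups raw gaze samples into fixations; the 1250-Hz sampling rate, the 25-pixel diameter, and the 100-ms minimum duration come from the text above, while the function itself is only an illustrative reconstruction, not the manufacturer's or the authors' algorithm.

```python
import numpy as np

def detect_fixations(x, y, sampling_rate=1250, max_dispersion=25, min_duration_ms=100):
    """Dispersion-based fixation detection: gaze must stay within a window of
    `max_dispersion` pixels for at least `min_duration_ms` milliseconds."""
    min_samples = int(min_duration_ms / 1000 * sampling_rate)
    fixations, start, n = [], 0, len(x)
    while start <= n - min_samples:
        end = start + min_samples
        # Dispersion of the initial window (horizontal and vertical extent)
        if (x[start:end].max() - x[start:end].min() <= max_dispersion and
                y[start:end].max() - y[start:end].min() <= max_dispersion):
            # Grow the window as long as the dispersion criterion still holds
            while end < n and \
                    x[start:end + 1].max() - x[start:end + 1].min() <= max_dispersion and \
                    y[start:end + 1].max() - y[start:end + 1].min() <= max_dispersion:
                end += 1
            duration_ms = (end - start) / sampling_rate * 1000
            fixations.append((float(x[start:end].mean()),
                              float(y[start:end].mean()),
                              duration_ms))
            start = end
        else:
            start += 1
    return fixations

# Toy usage with synthetic gaze samples (not real data)
gaze_x = np.concatenate([np.full(300, 400.0), np.full(300, 620.0)])
gaze_y = np.concatenate([np.full(300, 350.0), np.full(300, 500.0)])
print(detect_fixations(gaze_x, gaze_y))
```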
Figure 1. Exemplary stimuli (afraid, angry, happy, neutral, sad) chosen from the KDEF picture set (Lundqvist
et al., 1998).

Data Analysis
Fixation durations for each AOI, over the entire presentation period, were entered into analyses of variance with the factors area of interest (eight levels) and emotion (five levels). Fixation frequencies for each area of interest, for the first and second fixation as well as for the entire duration of a presentation, were entered into Friedman tests for ranks, as these data consist of ordinal variables. Subsequently, in order to test the main hypotheses, AOIs were reduced to the main areas: the eye and mouth regions.
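As an illustration of this rank-based approach, a Friedman test across AOIs could be run as in the sketch below; this is a generic scipy example with invented toy data, not the authors' analysis script.

```python
import numpy as np
from scipy.stats import friedmanchisquare

# counts[i, j] = number of fixations of participant i on AOI j (toy data, 36 x 8)
rng = np.random.default_rng(seed=0)
counts = rng.poisson(lam=[1, 5, 5, 1, 2, 2, 4, 1], size=(36, 8))

# Friedman test for ranks over the eight repeated AOI measurements
chi2, p = friedmanchisquare(*(counts[:, j] for j in range(counts.shape[1])))
print(f"Friedman chi-square = {chi2:.2f}, p = {p:.4f}")
```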
A direct comparison between the frequencies of fixations issued to the AOIs was not possible because they were not statistically independent. Ratios were therefore computed for both the frequencies of second fixations and the fixation durations by dividing the difference between the values for the eyes and the mouth by their sum. Thus, a positive ratio indicates more or longer fixations on the eyes, whereas a negative ratio indicates more or longer fixations on the mouth. Bonferroni-corrected follow-up tests were conducted for significant main effects and interactions.
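Written out for fixation durations (the ratio for the frequencies of second fixations is computed analogously), with d denoting the summed fixation duration on a region, the dominance ratio is:

```latex
\text{ratio}_{\text{duration}}
  = \frac{d_{\text{eyes}} - d_{\text{mouth}}}{d_{\text{eyes}} + d_{\text{mouth}}},
  \qquad -1 \le \text{ratio}_{\text{duration}} \le 1 .
```

A value of +1 means that all fixation time on these two regions fell on the eyes, −1 that it all fell on the mouth, and 0 that both regions were fixated equally long.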
Results
Mean Fixation Duration
Mean fixation duration differed between AOIs, F(1, 35) = 115.58; p < .001; η² = .77. There was no main effect for emotion, but there was an interaction between emotion and AOI, F(4, 140) = 3.26; p < .01; η² = .09. Follow-up analyses for the main effect of AOI revealed significantly longer fixation durations on the left and right eye compared to the forehead [left: T(35) = −5.66; p < .001; right: T(35) = −5.66; p < .001], compared to the left cheek [left: T(32) = 5.19; p < .001; right: T(32) = 5.91; p < .001], and compared to the nose [left: T(35) = 3.74; p < .001; right: T(35) = 4.52; p < .001]. For the mouth region, fixations were significantly longer compared to fixations on the forehead, T(35) = −3.91; p < .001, the left cheek, T(35) = −5.22; p < .001, and the nose, T(35) = −4.09; p < .001. However, there were no significant differences between fixations on the mouth and the right cheek (see Table 1).
Follow-up tests for the interaction between AOI and emotion revealed significant effects for all five emotional expressions in the mean fixation duration across AOIs: fearful, F(7, 77) = 6.68; p < .001; η² = .38; angry, F(7, 77) = 3.80; p < .001; η² = .26; happy, F(7, 84) = 2.41; p = .03; η² = .17; neutral, F(7, 70) = 5.95; p < .001; η² = .37; and sad, F(7, 91) = 3.01; p < .01; η² = .19. Separate tests of the factor emotion for each AOI showed a significant result only for the right eye, F(4, 132) = 3.15; p = .02; η² = .09, but for no other facial region (see Table 1).
Mean Number of Fixations
The mean number of fixations revealed a main effect for AOI, F(1, 35) = 111.58; p < .001; η² = .76 (see Figure 2), and a trend toward an interaction between AOI and emotion, F(4, 140) = 2.09; p = .09; η² = .06. There was no main effect for emotion. Follow-up tests of the main effect for AOI, testing differences in the numbers of fixations, were significant for comparisons of the left or right eye with the forehead [left: T(35) = 5.30; p < .001; right: T(35) = −5.15; p < .001], with the left cheek [left: T(35) = 5.87; p < .001; right: T(35) = 5.82; p < .001], and with the nose [left: T(35) = 4.12; p < .001; right: T(35) = 4.35; p < .001]. There were also more fixations on the mouth compared to the left cheek, T(32) = −4.64; p < .001, or the nose, T(35) = −3.69; p < .001.
Location of First and Second Fixations
When participants first looked at the screen at the beginning of each trial, the very first fixation was most frequently issued to an area outside of the face, including hair, ears, and throat, χ²(7, 36) = 112.76; p < .001. The locations of the first and second fixations did not differ between emotional expressions. However, the second fixation was more often issued to one of the facial AOIs. There was a main effect for AOI, χ²(7, 36) = 48.18; p < .001; the frequency with which the regions were looked at was the following: left eye, right eye, mouth, nose, hair, forehead, left cheek, and right cheek. There was no significant difference in the ranking between the eyes and mouth, but significantly fewer second fixations were issued to all of the other regions compared with the eyes (e.g., left eye–forehead: Z = −3.27; p < .001; right eye–forehead: Z = −2.77; p < .01).
The ranking order of the AOIs for each emotion yields differences between expressions. In sad [χ²(7, 36) = 43.57; p < .001] and angry facial expressions [χ²(7, 36) = 26.15; p < .001], the left and right eyes were significantly more often the targets of the second fixation compared with all other regions. In fearful [χ²(7, 36) = 40.56; p < .001] and neutral [χ²(7, 36) = 41.29; p < .001] facial expressions, the mouth region was the target of the second fixation. In addition to these findings, in happy facial expressions, the mouth and hair regions were fixated upon significantly more often compared with the cheeks, nose, and forehead, χ²(7, 36) = 44.31; p < .001.

Table 1
Mean Fixation Durations in ms (Standard Deviations in Parentheses) for Each Category of Emotional Facial Expressions and Each Area of Interest

AOI           Fearful           Angry             Happy             Neutral           Sad
Forehead      194.75 (79.55)    188.70 (45.96)    239.25 (150.97)   166.83 (16.73)    255.40 (54.59)
Left eye      524.81 (51.78)    578.03 (177.20)   620.56 (341.45)   541.79 (66.41)    635.50 (141.28)
Right eye     787.24 (189.07)   757.50 (120.92)   615.18 (8.36)     722.26 (113.98)   844.44 (137.16)
Left cheek    336.75 (3.18)     272.33 (199.88)   257.25 (90.86)    305.60 (10.75)    477.50 (79.90)
Nose          348.10 (77.64)    335.00 (108.19)   281.83 (27.34)    235.82 (82.98)    320.63 (196.05)
Right cheek   462.56 (33.16)    383.56 (40.70)    431.98 (130.78)   425.21 (151.93)   511.72 (158.78)
Mouth         427.25 (72.48)    327.64 (30.20)    416.32 (93.79)    396.04 (63.98)    359.29 (77.02)
Hair          354.28 (125.90)   434.61 (203.73)   408.51 (60.74)    394.12 (52.66)    395.91 (192.03)
Ratios for Fixation Duration and Frequencies
The ratios of total fixation duration and fixation frequencies were calculated in order to directly compare the emotion-relevant differences in the main facial regions. For the ratio of total fixation duration, there was a significant main effect of emotion, F(4, 140) = 3.87; p < .01; η² = .10. Follow-up tests revealed a higher ratio for fearful facial expressions compared to happy ones, T(35) = 3.72; p < .001, and a higher ratio for sad facial expressions compared to happy ones, T(35) = −2.83; p < .01. Higher ratios represent longer fixation times on the eye regions compared to the mouth region (see Figure 3). There was a significant emotion effect on the frequencies of second fixations, χ²(4, 36) = 10.54; p = .03, which can be explained by a significant difference between sad facial expressions and fearful (Z = −2.08; p = .04), angry (Z = −2.39; p = .02), and happy (Z = −2.11; p = .04) facial expressions.
Discussion
The way we look at faces has spurred scientific interest for
many years (Henderson et al., 2005; Yarbus, 1967). Additionally,
deviations in the scan paths of specific patient groups when look-
ing at emotional facial expressions have been linked to their
specific deficits in reading emotional faces (Dalton et al., 2005;
Hernandez et al., 2009). The aim of this study was to examine
differences in scan paths when healthy individuals look at different
emotional facial expressions. Across all trials, primary target re-
gions were the eye and mouth regions, including the photographed
person’s right cheek. For the duration of all fixations during
picture presentation, participants fixated on the mouth region lon-
ger in happy facial expressions compared with sad and fearful
facial expressions. The number of fixations showed that the eye
region is most frequently fixated upon in all emotional expres-
sions, but in fearful, happy, and neutral facial expressions, the
mouth and right cheek regions are also fixated upon as frequently
as the eye region. In sad and angry facial expressions, the eye
region is most frequently fixated upon. Concerning the first fixation on the face (i.e., the second fixation in general), in sad facial expressions participants more often looked at the eyes in comparison with all other emotional expressions.
The ratios that we computed for fixation durations on the eyes
in relation to the mouth region underline these findings in a more
direct way. The relative duration of fixations in the eye region is
dominant, but less dominant for happy facial expressions com-
pared to fearful, neutral, and sad expressions. In terms of the
numbers of first fixations, the eye region is less dominant com-
pared to the mouth region in angry facial expressions compared to
sad expressions.
These results are in accordance with previous studies comparing
healthy participants and patients. In healthy participants, the gaze
is directed more often to the eye and mouth regions compared with
other regions of a facial expression (see also Spezio, Adolphs,
Hurley, & Piven, 2007). According to these authors, the eye region
is fixated upon more often and for longer durations compared with
the other facial regions, independent of emotional category. This
finding is plausible because important cues for emotional infor-
mation can be found in this region (Ekman & Friesen, 1969).
Striking evidence for the informational content of the eyes comes from studies that ask participants to judge someone’s emotional state just from seeing the eye region. This importance is also reflected in scan paths when different emotional expressions are examined: in sad facial expressions, the first fixation on the face is to the eyes; in happy facial expressions, the mouth region is fixated upon longer compared with the other emotional expressions.
Figure 2. Mean fixation duration (ms) for each area of interest and standard error of the mean for all emotional facial expressions (front = forehead; eye-l = left eye; eye-r = right eye; cheek-l = left cheek; cheek-r = right cheek).
Figure 3. Mean ratio of fixation duration [(duration for eyes − duration for mouth)/(duration for eyes + duration for mouth)] for each emotional category and standard error of the mean; larger ratios indicate longer durations for the eyes compared to the mouth.

Thus, if participants want to judge the valence and arousal of a facial expression, eye gaze is directed to those regions where important emotion-specific information can be found: the smiling mouth or the sad eyes. This
observation supports the idea that it is highly important for social
interaction to know the emotional state of another person, and it
facilitates good emotion discrimination capacities that have been
found in different experimental paradigms (Calvo & Lundqvist,
2008; Calvo & Nummenmaa, 2009). Although these results may seem obvious, to our knowledge there is no previous study investigating scan paths for facial expressions presenting different emotional states.
However, it was not expected that the right cheek would be fixated upon as often and for as long as the mouth region. One explanation could be that, although not statistically significant, the numbers and durations of fixations in our data follow a right bias, which could explain the greater exploration of the right cheek. This finding would
not be in line with previous results, where predominantly left
biases have been found in face perception (Gilbert & Bakan, 1973;
Levy et al., 1983). The dominance of the right cheek could also be
due to the proximity of the mouth region to both cheek regions
and, therefore, might be an artifact of choosing stable areas of
interest for analysis. This would still suggest a right bias for the
lower facial region across all emotional categories, although par-
ticipants showed predominantly the more common left-bias in the
CFT. A hypothesis concerning the right bias could be that the
left-bias is more related to the initial processing of faces and
especially related to reaction times, whereas long periods of ex-
ploration time are less related to this bias. In our data, the left eye was descriptively, but not significantly, more often the target of the first relevant fixation, which would support this hypothesis. A right-bias
in face perception has been found to be present for positive facial
expressions (Workman, Peters, & Taylor, 2000) and for face
recognition (Laeng & Rouw, 2001). We did not find any differ-
ences between positive and negative facial expressions in this
tendency of a right bias, but the task of evaluating the pictures for
valence and arousal could relate to left hemisphere dominance.
A recent study with healthy individuals found that known faces
can be recognized with only one fixation (Hsiao & Cottrell, 2008).
This first fixation is located above chance at the nose region of the
picture. As soon as the authors gave participants time for more
fixations to explore the face, recognition performance was further
enhanced. Second fixations also have been found to be located
above chance at the nose region, whereas the third fixation was
directed toward the eye region. The authors conclude that these
findings point to a holistic perception of facial expressions. Al-
though in this study no emotional facial expressions were used,
again the eye region appears to have a strong impact on face
recognition. However, this study raises the question of whether face perception is still holistic when participants are asked to decide which emotion is being displayed rather than whether they know the person. Thus, future studies should address this question by determining how many fixations are needed to decide the emotional content of a facial expression.
The present study confirms that the eyes are particularly impor-
tant to read emotional expressions. Yet, some limitations must be
considered. First, we did not experimentally control for the starting
point of the scan path in each trial. Often, experiments accomplish
such control with the help of a fixation cross before picture
presentation. Instead, we decided to examine scan paths under
more naturalistic viewing conditions and without such an experi-
mental constraint. Second, subsequent studies should not only include arousal and valence ratings but also classifications of the expressions in order to assess classification performance. In addition, presentation time should be varied, starting with very brief presentations, to determine how many fixations are needed to correctly identify the emotional content of a facial expression. This could provide relevant evidence for understanding the processing of social and emotional content.
In sum, our study supports the importance of the eye and mouth
regions to facial perception. Moreover, it extends our knowledge
in showing that scan paths of healthy observers differ for different
emotional facial expressions. This specificity of gaze pattern may
be due to specific emotional cues provided by specific regions of
the face.
References
Adolph, D., & Alpers, G. W. (2010). Differences in valence and arousal:
A comparison of two sets of emotional facial expressions. American
Journal of Psychology, 123, 209–219.
Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., &
Damasio, A. R. (2005). A mechanism for impaired fear recognition after
amygdala damage. Nature, 433(7021), 68–72. doi: 10.1038/
nature03086
Alpers, G. W. (2008). Eye-catching: Right hemisphere attentional bias for emotional pictures. Laterality: Asymmetries of Body, Brain and Cognition, 13, 158–178. doi: 10.1080/13576500701779247
Alpers, G. W., & Gerdes, A. B. M. (2007). Here’s looking at you:
Emotional faces predominate in binocular rivalry. Emotion, 7(3), 495–506. doi: 10.1037/1528-3542.7.3.495
Bediou, B., Franck, N., Saoud, M., Baudouin, J.-Y., Tiberghien, G., Dalery, J., et al. (2005). Effects of emotion and identity on facial affect processing in schizophrenia. Psychiatry Research, 133(2–3), 149–157. doi: 10.1016/j.psychres.2004.08.008
Boraston, Z., Blakemore, S.-J., Chilvers, R., & Skuse, D. (2007). Impaired
sadness recognition is linked to social interaction deficit in autism.
Neuropsychologia, 45(7), 1501–1510. doi: 10.1016/j.neuropsychologia
.2006.11.010
Butler, S., Gilchrist, I. D., Burt, D. M., Perrett, D. I., Jones, E., & Harvey,
M. (2005). Are the perceptual biases found in chimeric face processing
reflected in eye-movement patterns? Neuropsychologia, 43(1), 52–59.
doi: 10.1016/j.neuropsychologia.2004.06.005
Calvo, M. G., & Lundqvist, D. (2008). Facial expressions of emotion
(KDEF): Identification under different display-duration conditions. Be-
havior Research Methods, 40, 109–115. doi: 10.3758/BRM.40.1.109
Calvo, M. G., & Nummenmaa, L. (2009). Eye-movement assessment of the
time course in facial expression recognition: Neurophysiological impli-
cations. Cognitive, Affective, & Behavioral Neuroscience, 9(4), 398–411. doi: 10.3758/cabn.9.4.398
Dalton, K. M., Nacewicz, B. M., Johnstone, T., Schaefer, H. S., Gern-
sbacher, M. A., Goldsmith, H. H., et al. (2005). Gaze fixation and the
neural circuitry of face processing in autism. Nature Neuroscience, 8(4),
519–526. doi: 10.1038/nn1421
Dimberg, U., Thunberg, M., & Grunedal, S. (2002). Facial reactions to
emotional stimuli: Automatically controlled emotional responses. Cog-
nition & Emotion, 16(4), 449–471. doi: 10.1080/02699930143000356
Eisenbarth, H., Alpers, G. W., Segre`, D., Calogero, A., & Angrilli, A.
(2008). Categorization and evaluation of emotional faces in psycho-
pathic women. Psychiatry Research, 159(1–2), 189–195. doi: 10.1016/
j.psychres.2007.09.001
Ekman, P., & Friesen, W. V. (1969). The repertoire of nonverbal behavior:
Categories, origins, usage, and coding. Semiotica, 1(1), 49–98.
Engbert, R., & Kliegl, R. (2003). Microsaccades uncover the orientation of covert attention. Vision Research, 43, 1035–1045.
