scispace - formally typeset
Topic

Facial expression

About: Facial expression is a research topic. Over its lifetime, 17,085 publications have been published within this topic, receiving 639,905 citations. The topic is also known as: expression.


Papers

Journal ArticleDOI
TL;DR: Evidence is provided that members of a preliterate culture who had minimal exposure to literate cultures associate the same emotion concepts with the same facial behaviors as do members of Western and Eastern literate cultures.


Abstract: This study addresses the question of whether any facial expressions of emotion are universal. Recent studies showing that members of literate cultures associated the same emotion concepts with the same facial behaviors could not demonstrate that at least some facial expressions of emotion are universal; the cultures compared had all been exposed to some of the same mass media presentations of facial expression, and these may have taught the people in each culture to recognize the unique facial expressions of other cultures. To show that members of a preliterate culture who had minimal exposure to literate cultures would associate the same emotion concepts with the same facial behaviors as do members of Western and Eastern literate cultures, data were gathered in New Guinea by telling subjects a story, showing them a set of three faces, and asking them to select the face which showed the emotion appropriate to the story. The results provide evidence in support of the hypothesis that the association between particular facial muscular patterns and discrete emotions is universal.


3,685 citations


Journal ArticleDOI
01 May 1993 - Psychophysiology
TL;DR: Response specificity, particularly facial expressiveness, supported the view that specific affects have unique patterns of reactivity, and the consistency of the dimensional relationships between evaluative judgments and physiological response emphasizes that emotion is fundamentally organized by these motivational parameters.


Abstract: Colored photographic pictures that varied widely across the affective dimensions of valence (pleasant-unpleasant) and arousal (excited-calm) were each viewed for a 6-s period while facial electromyographic (zygomatic and corrugator muscle activity) and visceral (heart rate and skin conductance) reactions were measured. Judgments relating to pleasure, arousal, interest, and emotional state were measured, as was choice viewing time. Significant covariation was obtained between (a) facial expression and affective valence judgments and (b) skin conductance magnitude and arousal ratings. Interest ratings and viewing time were also associated with arousal. Although differences due to the subject's gender and cognitive style were obtained, affective responses were largely independent of the personality factors investigated. Response specificity, particularly facial expressiveness, supported the view that specific affects have unique patterns of reactivity. The consistency of the dimensional relationships between evaluative judgments (i.e., pleasure and arousal) and physiological response, however, emphasizes that emotion is fundamentally organized by these motivational parameters.
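The dimensional description above (valence crossed with arousal) can be made concrete with a small sketch. The quadrant labels below are illustrative, following the common circumplex reading of the valence-arousal plane; they are not taken from the paper:

```python
def quadrant(valence: float, arousal: float) -> str:
    """Classify a point in the valence-arousal plane (both in [-1, 1])
    into one of four illustrative quadrant labels."""
    if valence >= 0:
        return "excited/happy" if arousal >= 0 else "calm/content"
    return "angry/afraid" if arousal >= 0 else "sad/bored"

print(quadrant(0.8, 0.7))    # excited/happy
print(quadrant(-0.6, -0.4))  # sad/bored
```

In the study's terms, facial EMG tracked the valence axis while skin conductance tracked the arousal axis, so each physiological channel indexes one coordinate of this plane.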


2,913 citations


Journal ArticleDOI
Andrew N. Meltzoff, M. Keith Moore
07 Oct 1977 - Science
TL;DR: Infants between 12 and 21 days of age can imitate both facial and manual gestures; this behavior cannot be explained in terms of either conditioning or innate releasing mechanisms.


Abstract: Infants between 12 and 21 days of age can imitate both facial and manual gestures; this behavior cannot be explained in terms of either conditioning or innate releasing mechanisms. Such imitation implies that human neonates can equate their own unseen behaviors with gestures they see others perform.


2,814 citations


Journal ArticleDOI
Nim Tottenham, James W. Tanaka, Andrew C. Leon, Thomas McCarry, et al.
TL;DR: The results lend empirical support for the validity and reliability of this set of facial expressions as determined by accurate identification of expressions and high intra-participant agreement across two testing sessions, respectively.


Abstract: A set of face stimuli called the NimStim Set of Facial Expressions is described. The goal in creating this set was to provide facial expressions that untrained individuals, characteristic of research participants, would recognize. This set is large in number, multiracial, and available to the scientific community online. The results of psychometric evaluations of these stimuli are presented. The results lend empirical support for the validity and reliability of this set of facial expressions as determined by accurate identification of expressions and high intra-participant agreement across two testing sessions, respectively.


2,692 citations


Proceedings ArticleDOI
Takeo Kanade, Jeffrey F. Cohn, Yingli Tian
26 Mar 2000
TL;DR: The problem space for facial expression analysis is described, which includes level of description, transitions among expressions, eliciting conditions, reliability and validity of training and test data, individual differences in subjects, head orientation and scene complexity, image characteristics, and relation to non-verbal behavior.


Abstract: Within the past decade, significant effort has occurred in developing methods of facial expression analysis. Because most investigators have used relatively limited data sets, the generalizability of these various methods remains unknown. We describe the problem space for facial expression analysis, which includes level of description, transitions among expressions, eliciting conditions, reliability and validity of training and test data, individual differences in subjects, head orientation and scene complexity, image characteristics, and relation to non-verbal behavior. We then present the CMU-Pittsburgh AU-Coded Face Expression Image Database, which currently includes 2,105 digitized image sequences from 182 adult subjects of varying ethnicity, performing multiple tokens of most primary FACS action units. This database is the most comprehensive testbed to date for comparative studies of facial expression analysis.
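An AU-coded record of the kind this database contains can be sketched as a simple data structure. The class and field names below are illustrative, not the database's actual schema; the AU semantics (AU6 cheek raiser, AU12 lip corner puller, whose combination is the classic Duchenne smile) follow standard FACS definitions:

```python
from dataclasses import dataclass, field

@dataclass
class AUCodedSequence:
    """Hypothetical sketch of one AU-coded image sequence."""
    subject_id: str
    action_units: set[int] = field(default_factory=set)  # FACS AUs present at the peak frame

def is_duchenne_smile(seq: AUCodedSequence) -> bool:
    """A Duchenne smile combines AU6 (cheek raiser) and AU12 (lip corner puller)."""
    return {6, 12} <= seq.action_units

smile = AUCodedSequence("S052", {6, 12, 25})
print(is_duchenne_smile(smile))  # True
```

Coding expressions as AU sets rather than emotion labels is what makes such a database usable as a testbed: different recognition methods can be compared on the same objective muscular-pattern ground truth.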


2,591 citations


Network Information
Related Topics (5)
Emotion classification

5.7K papers, 287.2K citations

92% related
Eye tracking

17.1K papers, 370.8K citations

91% related
Emotional expression

8.2K papers, 406K citations

90% related
Salience (neuroscience)

3.5K papers, 151.2K citations

88% related
Visual perception

20.8K papers, 997.2K citations

88% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022        29
2021     1,021
2020     1,078
2019     1,108
2018     1,051
2017       979
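The yearly counts above can be summarized programmatically; a minimal sketch, where the dict simply transcribes the table (the 2022 figure presumably reflects a partial year at the time the page was captured):

```python
# Papers per year, transcribed from the metrics table above.
papers_by_year = {2017: 979, 2018: 1051, 2019: 1108, 2020: 1078, 2021: 1021, 2022: 29}

# Total across the listed years, and the peak publication year.
total = sum(papers_by_year.values())
peak_year = max(papers_by_year, key=papers_by_year.get)
print(total, peak_year)  # 5266 2019
```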

Top Attributes


Topic's top 5 most impactful authors

Maja Pantic

72 papers, 11.2K citations

Jeffrey F. Cohn

71 papers, 10.3K citations

Paul Ekman

56 papers, 29.8K citations

Wataru Sato

42 papers, 2.1K citations

Beatrice de Gelder

40 papers, 3.8K citations