
Showing papers on "Facial Action Coding System published in 1996"


01 Jan 1996
TL;DR: Cross-cultural research on facial expression and the development of methods to measure facial expression are briefly summarized, and what has been learned about emotion from this work on the face is elucidated.
Abstract: Cross-cultural research on facial expression and the development of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed. What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion?

2,463 citations


Journal ArticleDOI
TL;DR: In this paper, a rule-governed approach is proposed to generate high-quality 3D animation of facial expressions in conjunction with meaning-based speech synthesis, including spoken intonation.

257 citations


Journal ArticleDOI
TL;DR: Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.
Abstract: Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2–5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group × Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion ratings, but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact. (JINS, 1996, 2, 383–391.)

145 citations


Journal ArticleDOI
TL;DR: This article found that more facial movements involving elements of the fear expression, and more eye blinks, were displayed during high- versus low-anxiety segments, whereas facial actions indicative of other affect states (anger, sadness) did not distinguish anxiety level.

96 citations


Journal ArticleDOI
TL;DR: This study addresses how affective facial expression interacts with the linguistic forms in ASL motherese and sheds new insight on the nature and possible role of input on the language acquisition process.
Abstract: Research on early mother-child interaction has documented the crucial role affect plays in the content and modulation of early interactions. For hearing mothers, voice quality is considered to be the single most informative channel for affective expression. For deaf caregivers who use American Sign Language (ASL), the vocal channel is unavailable, and facial expression is critically important. Not only do facial behaviours signal affective and communicative information, but specific facial behaviours also function as obligatory grammatical markers. This multifunctionality of facial expression presents a dilemma for deaf parents signing to their toddlers, as these two systems potentially compete for expression on the face. This study addresses how affective facial expression interacts with the linguistic forms in ASL motherese. To address this issue, we present data from both cross-sectional and longitudinal videotaped interaction from a total of 15 deaf mothers signing with their deaf toddlers (ages 0;9-2;8). Using Ekman & Friesen's (1978) Facial Action Coding System (FACS), we analysed child-directed maternal wh- questions. Because they are frequent in early discourse, and because they require furrowed brows, which also signal anger and puzzlement, wh- questions represent an ideal context to address the potential conflict of grammatical and affective facial expression in ASL motherese. Our studies indicate a shift from affect to grammar at about the child's second birthday. These findings shed new insight on the nature and possible role of input on the language acquisition process.

78 citations


Proceedings ArticleDOI
14 Oct 1996
TL;DR: This work has implemented an interface that tracks a person's facial features in real time (30 Hz) and can recognise a large set of gestures ranging from "yes", "no" and "may be" to detecting winks, blinks and sleeping.
Abstract: People naturally express themselves through facial gestures and expressions. Our goal is to build a facial gesture human-computer interface for use in robot applications. We have implemented an interface that tracks a person's facial features in real time (30 Hz). Our system requires neither special illumination nor facial makeup. By using multiple Kalman filters we accurately predict and robustly track facial features, despite disturbances and rapid movements of the head (including both translational and rotational motion). Since we reliably track the face in real time, we are also able to recognise motion gestures of the face. Our system can recognise a large set of gestures (13) ranging from "yes", "no" and "may be" to detecting winks, blinks and sleeping.
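The multiple-Kalman-filter tracking summarized above can be sketched for a single feature point as follows. This is an illustrative constant-velocity filter only; the state model, noise covariances, and parameter values are assumptions for the sketch, not the paper's actual design.

```python
import numpy as np

dt = 1.0 / 30.0  # 30 Hz frame rate, as reported in the paper

# State: [x, y, vx, vy]; measurement: [x, y]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-3   # process noise (assumed)
R = np.eye(2) * 1e-1   # measurement noise (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle for a new measurement z = [x, y]."""
    # Predict the next state from the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new measurement
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Usage: track a synthetic feature point drifting to the right
x = np.zeros(4)
P = np.eye(4)
for t in range(30):
    z = np.array([t * 2.0, 5.0])  # noiseless measurements for the sketch
    x, P = kalman_step(x, P, z)
```

The prediction step is what lets the tracker ride out brief disturbances: when a measurement is missing or unreliable, the motion model alone carries the estimate forward.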

48 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined emotionality and affective exchange during psychotherapeutic treatments, investigating how these processes relate to each other and to treatment outcome; high correlations were found between facial expressions of emotion and affective experience.
Abstract: This study examined emotionality and affective exchange during psychotherapeutic treatments by investigating how these processes are related to each other and to treatment outcome. Facial expressions of emotion and its temporal sequence were described using the Emotional Facial Action Coding System (EMFACS) for two videotaped brief psychoanalytic treatments, a successful one and a therapeutic failure. Patient and therapist affective experience were measured with the Differentielle Affekt-Skala (DAS). High correlations were found between facial expressions of emotion and affective experience. Some correlations were the same in both dyads, but most were dyad-specific. Furthermore, aspects of facial behavior and the interaction partner's responses were both good predictors of affective experience. Interpretation of the meaning of facial behavior and the measurement of affective experience provides insights into the nature of the psychotherapeutic process and its relation to success and failure. The two treat...

45 citations


Journal ArticleDOI
TL;DR: A phoneme-based approach to speech animation resembles actual speech and allows arbitrary English text, rather than a restricted set of tokens, to be spoken; the Facial Action Coding System is adopted to control the modification of the face model, as it describes the basis of facial expression.
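In the spirit of the approach summarized above, a phoneme-driven control loop might map each phoneme to FACS action-unit targets that deform the face model. The mappings and intensities below are illustrative assumptions, not values from the paper (the AU numbers themselves are standard FACS codes).

```python
# Hypothetical phoneme-to-action-unit lookup (illustrative values only).
# AU25 = lips part, AU26 = jaw drop, AU24 = lip presser, AU28 = lip suck.
PHONEME_TO_AUS = {
    "AA": {"AU26": 0.9, "AU25": 0.7},  # open vowel: jaw drop, lips part
    "M":  {"AU24": 0.8},               # bilabial closure
    "F":  {"AU28": 0.5},               # labiodental approximation
}

def frame_targets(phoneme_sequence):
    """Yield a per-phoneme dict of AU intensities for the face model."""
    for ph in phoneme_sequence:
        yield PHONEME_TO_AUS.get(ph, {})  # neutral face for unknown phonemes

targets = list(frame_targets(["M", "AA", "F"]))
```

A real system would also interpolate between successive AU targets so the mouth shape blends smoothly across phoneme boundaries.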

29 citations


Journal ArticleDOI
01 Aug 1996-Displays
TL;DR: In this article, a 3D face model, an expression model, and an emotion model are presented to realize a face-to-face communication environment with machines by giving facial expressions to the computer system.

27 citations


Journal ArticleDOI
TL;DR: In this article, the meaning of facial expressions of emotion and the emotion terms frequently used to label them were analyzed with respect to the constituent facial movements using the Facial Action Coding System, and using consensus analysis, multidimensional scaling and inferential statistics.
Abstract: How similar are the meanings of facial expressions of emotion and the emotion terms frequently used to label them? In three studies, subjects made similarity judgments and emotion self-report ratings in response to six emotion categories represented in Ekman and Friesen's Pictures of Facial Affect, and their associated labels. Results were analyzed with respect to the constituent facial movements using the Facial Action Coding System, and using consensus analysis, multidimensional scaling, and inferential statistics. Shared interpretation of meaning was found between individuals and the group, with congruence between the meaning in facial expressions, labeling using basic emotion terms, and subjects' reported emotional responses. The data suggest that (1) the general labels used by Ekman and Friesen are appropriate but may not be optimal, (2) certain facial movements contribute more to the perception of emotion than do others, and (3) perception of emotion may be categorical rather than dimensional.

18 citations


Journal ArticleDOI
TL;DR: The authors examined the nonverbal correlates of repressive coping, extending previous research in two ways: participants' nonverbal behaviors were observed in either of two conditions that differed with respect to the saliency of public identity; and an anatomically-based facial coding system was used to assess participants' emotion expressions and symbolic communication behaviors.
Abstract: The present study examined the nonverbal correlates of repressive coping, extending previous research in two ways: (1) participants' nonverbal behaviors were observed in either of two conditions that differed with respect to the salience of public identity; (2) an anatomically-based facial coding system was used to assess participants' emotion expressions and symbolic communication behaviors. Sixty female undergraduates, classified as repressive, low-anxious, or high-anxious, were videotaped during the preparation and delivery of a self-disclosing speech. During both the preparation and delivery, the salience of participants' public identities was either minimized (low-salience condition) or maximized (high-salience condition). Repressors and nonrepressors exhibited similar frequencies of hostile facial expressions. Repressors differed from nonrepressors by their frequent expressions of social smiles and conversational illustrators when their public selves were most salient. These findings suggest that certain symbolic communication behaviors may be nonverbal analogues of cognitive coping processes, and they support the utility of including expressive behaviors in conceptualizations of emotion-focused coping.

Journal ArticleDOI
TL;DR: Differences in facial expression in DS adults may confuse others' interpretations of their emotional responses and may be important for understanding the development of abnormal emotional processes.
Abstract: The facial expressions of adults with Down's syndrome (DS; n = 15) as they watched happy, sad, and neutral videotapes were compared with those of a healthy age-matched control group (n = 20). Facial movements were analyzed with the Facial Action Coding System (P. E. Ekman & W. V. Friesen, 1978). While watching happy stimuli, the 10 DS adults who were able to appropriately rate their reactions smiled with a cheek raise as frequently as control adults, suggesting that the expression of positive affect in these individuals is normal. Contrary to predictions, however, the DS group exhibited fewer smiles without cheek raises than did control adults and were more likely not to smile. Neither group showed prototypic sad facial expressions in response to sad stimuli. Independent of emotion, DS participants made more facial movements, including more tongue shows, than did control participants. Differences in facial expression in DS adults may confuse others' interpretations of their emotional responses and may be important for understanding the development of abnormal emotional processes.

Journal ArticleDOI
TL;DR: The asymmetries of facial expression were estimated in a sample of 14 experimental subjects with the Facial Action Coding System during voluntary control of facial mimicry while viewing videotapes to show an asymmetric distribution toward the lower left side of the face.
Abstract: The asymmetries of facial expression were estimated in a sample of 14 experimental subjects with the Facial Action Coding System during voluntary control of facial mimicry while viewing videotapes. The subjects were instructed to express facially the emotion experienced or to dissimulate their true emotion with a facial expression opposite (incongruous) to what they actually felt. Only during dissimulation did facial mimicry show an asymmetric distribution toward the lower left side of the face.

Bischof N
01 Jan 1996
TL;DR: A formal theory is proposed that accounts for the motivational architecture underlying the spontaneous smiling response in its most salient varieties, based on a single assumption according to which the smiling response is due to a reduction of "autonomy claim" within the framework of the Zurich Model of Social Motivation.
Abstract: A formal theory is proposed that accounts for the motivational architecture underlying the spontaneous smiling response in its most salient varieties (i.e., smiling due to security, relief, embarrassment, fear, amazement, submission, and triumph). The theory is based on a single assumption according to which the smiling response, in all instances named, is due to a reduction of "autonomy claim" as defined within the framework of the Zurich Model of Social Motivation. The theory's consistency is evinced by way of computer simulation; its phenomenological plausibility can be demonstrated by animation based on Ekman's Facial Action Coding System.


Journal ArticleDOI
TL;DR: This study evaluates the methodological wisdom of using coding strategies developed for older populations of infants with high-risk, premature infants who are tested while still in the hospital, and develops a new catalog of six categories of facial action.
Abstract: The techniques of facial coding employed with full-term infants have been applied to studies of premature infants tested before 40 weeks postconceptional age. The purpose of this research was to evaluate the methodological wisdom of using the coding strategies that were developed for older populations of infants with high-risk, premature infants who are tested while still in the hospital. As a first step, we tested the eight most commonly used categories of facial activity. The results were not encouraging; the mean r(β) was .61 and the mean κ was .52. On the basis of our assessment of the limitations in using these traditional facial categories, we developed six new categories of action. This time the results were more positive; the mean r(β) was .92 and the mean κ was .80. The new catalog appears to be a reliable representation of a range of facial behaviors observed in infants who are tested well before term age.
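The κ values reported above are Cohen's kappa, a chance-corrected measure of agreement between two coders. A minimal sketch of its computation follows; the category labels are hypothetical, not the study's actual codes.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' label sequences."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of items the coders labeled identically
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's marginal label rates
    ca, cb = Counter(coder_a), Counter(coder_b)
    expected = sum((ca[label] / n) * (cb[label] / n)
                   for label in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

# Usage: two coders labeling six facial events (hypothetical categories)
a = ["brow", "brow", "mouth", "eye", "mouth", "brow"]
b = ["brow", "mouth", "mouth", "eye", "mouth", "brow"]
k = cohens_kappa(a, b)
```

Because κ discounts agreement expected by chance, it is a stricter standard than raw percent agreement, which is why the jump from .52 to .80 above marks a substantial improvement in coding reliability.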