
Showing papers on "Facial expression published in 1996"


01 Jan 1996
TL;DR: Cross-cultural research on facial expression and the developments of methods to measure facial expression are briefly summarized and what has been learned about emotion from this work on the face is elucidated.
Abstract: Cross-cultural research on facial expression and the developments of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed. What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion?

2,463 citations


Journal ArticleDOI
31 Oct 1996-Nature
TL;DR: Direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness is reported, indicating that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.
Abstract: The amygdala is thought to play a crucial role in emotional and social behaviour. Animal studies implicate the amygdala in both fear conditioning and face perception. In humans, lesions of the amygdala can lead to selective deficits in the recognition of fearful facial expressions and impaired fear conditioning, and direct electrical stimulation evokes fearful emotional responses. Here we report direct in vivo evidence of a differential neural response in the human amygdala to facial expressions of fear and happiness. Positron-emission tomography (PET) measures of neural activity were acquired while subjects viewed photographs of fearful or happy faces, varying systematically in emotional intensity. The neuronal response in the left amygdala was significantly greater to fearful as opposed to happy expressions. Furthermore, this response showed a significant interaction with the intensity of emotion (increasing with increasing fearfulness, decreasing with increasing happiness). The findings provide direct evidence that the human amygdala is engaged in processing the emotional salience of faces, with a specificity of response to fearful facial expressions.

1,954 citations


Journal ArticleDOI
TL;DR: It is found that all subjects recognized happy expressions normally but that some subjects were impaired in recognizing negative emotions, especially fear and sadness; these data provide evidence for a neural system important to processing facial expressions of some emotions, involving discrete visual and somatosensory cortical sectors in the right hemisphere.
Abstract: This study is part of an effort to map neural systems involved in the processing of emotion, and it focuses on the possible cortical components of the process of recognizing facial expressions. We hypothesized that the cortical systems most responsible for the recognition of emotional facial expressions would draw on discrete regions of right higher-order sensory cortices and that the recognition of specific emotions would depend on partially distinct system subsets of such cortical regions. We tested these hypotheses using lesion analysis in 37 subjects with focal brain damage. Subjects were asked to recognize facial expressions of six basic emotions: happiness, surprise, fear, anger, disgust, and sadness. Data were analyzed with a novel technique, based on three-dimensional reconstruction of brain images, in which anatomical description of surface lesions and task performance scores were jointly mapped onto a standard brain-space. We found that all subjects recognized happy expressions normally but that some subjects were impaired in recognizing negative emotions, especially fear and sadness. The cortical surface regions that best correlated with impaired recognition of emotion were in the right inferior parietal cortex and in the right mesial anterior infracalcarine cortex. We did not find impairments in recognizing any emotion in subjects with lesions restricted to the left hemisphere. These data provide evidence for a neural system important to processing facial expressions of some emotions, involving discrete visual and somatosensory cortical sectors in right hemisphere.

748 citations


Journal ArticleDOI
TL;DR: Impairments in the identification of facial and vocal emotional expression were demonstrated in a group of patients with ventral frontal lobe damage who had socially inappropriate behaviour; these impairments may contribute to the abnormal behaviour seen after frontal lesions and have implications for rehabilitation.

714 citations


Journal ArticleDOI
TL;DR: Two people with impaired recognition of facial expressions in the context of bilateral amygdala damage, DR and SE, were tested with photographs showing facial expressions of emotion from the Ekman and Friesen (1976) series; both showed deficits in the recognition of fear.
Abstract: Although the amygdala is widely believed to have a role in the recognition of emotion, a central issue concerns whether it is involved in the recognition of all emotions or whether it is more important to some emotions than to others. We describe studies of two people, DR and SE, with impaired recognition of facial expressions in the context of bilateral amygdala damage. When tested with photographs showing facial expressions of emotion from the Ekman and Friesen (1976) series, both DR and SE showed deficits in the recognition of fear. Problems in recognising fear were also found using photographic quality images interpolated (“morphed”) between prototypes of the six emotions in the Ekman and Friesen (1976) series to create a hexagonal continuum (running from happiness to surprise to fear to sadness to disgust to anger to happiness). Control subjects identified these morphed images as belonging to distinct regions of the continuum, corresponding to the nearest prototype expression. However, DR and SE were...

651 citations


Journal ArticleDOI
TL;DR: The authors found that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference, and showed that facial expression can serve the interpersonal function of allowing one animal to predict another's behavior.
Abstract: Theorists have argued that facial expressions of emotion serve the interpersonal function of allowing one animal to predict another's behavior. Humans may extend these predictions into the indefinite future, as in the case of trait inference. The hypothesis that facial expressions of emotion (e.g., anger, disgust, fear, happiness, and sadness) affect subjects' interpersonal trait inferences (e.g., dominance and affiliation) was tested in two experiments. Subjects rated the dispositional affiliation and dominance of target faces with either static or apparently moving expressions. They inferred high dominance and affiliation from happy expressions, high dominance and low affiliation from angry and disgusted expressions, and low dominance from fearful and sad expressions. The findings suggest that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference.

606 citations


Journal ArticleDOI
TL;DR: The facial grading system (FGS) is based on the evaluation of resting symmetry, degree of excursion of facial muscles, and degree of synkinesis associated with each voluntary movement, with a cumulative composite score tabulated.
Abstract: Clinicians require a reliable and valid method of evaluating facial function following facial nerve injury. This tool should be clinically relevant, easy to administer, provide a quantitative score for reporting purposes, and be sensitive enough to detect clinically important change over time or with treatment. The objectives of this study were to develop and validate a well-defined grading system that would address the above mentioned points. All essential information, including precise definitions for each item, is presented on one page (Fig. 1, see next page). The facial grading system (FGS) is based on the evaluation of resting symmetry, degree of excursion of facial muscles, and degree of synkinesis associated with each voluntary movement. Different regions of the face are examined separately using five standard expressions. All items are graded on point scales, and a cumulative composite score is tabulated.
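The composite-score arithmetic can be sketched directly. The structure below (five voluntary expressions weighted ×4, resting-asymmetry items weighted ×5, synkinesis scores subtracted unweighted) follows the Sunnybrook-style scheme the abstract describes, but the exact item counts and weights are assumptions for illustration:

```python
def fgs_composite(resting, voluntary, synkinesis):
    """Cumulative composite score for the facial grading system (FGS).

    resting    -- per-region resting-asymmetry scores (weighted x5, assumed)
    voluntary  -- excursion scores for the five standard expressions (x4, assumed)
    synkinesis -- per-expression synkinesis scores (subtracted unweighted)
    """
    return sum(voluntary) * 4 - sum(resting) * 5 - sum(synkinesis)

# A symmetric face with full excursion and no synkinesis scores 100;
# asymmetry and synkinesis pull the composite down.
normal = fgs_composite([0, 0, 0], [5, 5, 5, 5, 5], [0, 0, 0, 0, 0])
```

Under these assumed weights a normal face scores 100, and any resting asymmetry or synkinesis lowers the composite, matching the abstract's goal of a single quantitative score sensitive to change over time.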

509 citations


Journal ArticleDOI
01 Oct 1996-Brain
TL;DR: Face perception and emotion recognition were investigated in a group of people with Huntington's disease and matched controls, showing that the recognition of some emotions is more impaired than others; disgust is a prime candidate for a dedicated neural substrate.
Abstract: Face perception and emotion recognition were investigated in a group of people with Huntington's disease and matched controls. In conventional tasks intended to explore the perception of age, sex, unfamiliar face identity (Benton test) and gaze direction from the face, the Huntington's disease group showed a borderline impairment of gaze direction perception and were significantly impaired on unfamiliar face matching. With a separate set of tasks using computer-interpolated ('morphed') facial images, people with Huntington's disease were markedly impaired at discriminating anger from fear, but experienced less difficulty with continua varying from male to female, between familiar identities, and from happiness to sadness. In a further test of recognition of facial expressions of basic emotions from the Ekman and Friesen (1976) series, interpolated images were created for six continua that lay around the perimeter of an emotion hexagon (happiness-surprise; surprise-fear; fear-sadness; sadness-disgust; disgust-anger; anger-happiness). In deciding which emotion these morphed images were most like, people with Huntington's disease again showed deficits in the recognition of anger and fear, and an especially severe problem with disgust, which was recognized only at chance level. A follow-up study with tests of facially and vocally expressed emotions confirmed that the recognition of disgust was markedly poor for the Huntington's disease group, still being no better than chance level. Questionnaires were also used to examine self-assessed emotion, but did not show such striking problems. Taken together, these data reveal severe impairments of emotion recognition in Huntington's disease, and show that the recognition of some emotions is more impaired than others. The possibility that certain basic emotions may have dedicated neural substrates needs to be seriously considered: among these, disgust is a prime candidate.
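The hexagonal continua used here can be approximated, in their simplest form, as pixelwise cross-fades between adjacent emotion prototypes around the perimeter. Real morphing also warps feature geometry, so this is only a sketch; `hexagon_continuum` and its `steps` parameter are illustrative names:

```python
import numpy as np

def morph(img_a, img_b, t):
    """Pixelwise cross-fade between two prototype images (0 <= t <= 1)."""
    return (1.0 - t) * img_a + t * img_b

def hexagon_continuum(prototypes, steps=5):
    """Interpolated images around the perimeter of the emotion hexagon
    (happiness -> surprise -> fear -> sadness -> disgust -> anger -> happiness).
    Takes a list of six prototype images as float arrays."""
    out = []
    n = len(prototypes)
    for i in range(n):
        a, b = prototypes[i], prototypes[(i + 1) % n]
        # endpoint=False so each prototype appears once, not twice
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            out.append(morph(a, b, t))
    return out
```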

482 citations


Journal ArticleDOI
TL;DR: An approach to the analysis and representation of facial dynamics for recognizing facial expressions from image sequences is presented, using optical flow to identify the direction of rigid and nonrigid motions and a mid-level symbolic representation; recognition of six facial expressions, as well as eye blinking, is demonstrated on a large set of image sequences.
Abstract: An approach to the analysis and representation of facial dynamics for recognition of facial expressions from image sequences is presented. The algorithms utilize optical flow computation to identify the direction of rigid and nonrigid motions that are caused by human facial expressions. A mid-level symbolic representation motivated by psychological considerations is developed. Recognition of six facial expressions, as well as eye blinking, is demonstrated on a large set of image sequences.
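The mid-level symbolic step can be sketched as follows: average the optical-flow vectors inside a facial region, then quantize the mean motion into a direction label that higher-level rules could map to expressions. The region coordinates and `thresh` value are hypothetical, and the flow field is taken as given rather than computed from images:

```python
import numpy as np

def mean_motion(flow, region):
    """Mean (dx, dy) optical-flow vector inside a rectangular region."""
    r0, r1, c0, c1 = region
    return flow[r0:r1, c0:c1].reshape(-1, 2).mean(axis=0)

def symbolic_direction(v, thresh=0.1):
    """Quantize a mean motion vector into a mid-level symbolic label."""
    dx, dy = v
    if max(abs(dx), abs(dy)) < thresh:
        return "still"
    if abs(dy) >= abs(dx):
        return "down" if dy > 0 else "up"   # image rows grow downward
    return "right" if dx > 0 else "left"
```

Upward motion at the mouth corners, for instance, would be one cue that a rule base could combine with brow and eye motions before labelling a sequence as a smile.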

453 citations


Journal ArticleDOI
TL;DR: In this article, the authors report four experiments that replicated and extended Etcoff and Magee's findings with photographic-quality stimuli and demonstrate better discrimination of cross-boundary than within-category pairs; that is, two faces identified as different expressions were easier to discriminate than two equally different faces identified as the same expression.
Abstract: Using computer-generated line-drawings, Etcoff and Magee (1992) found evidence of categorical perception of facial expressions. We report four experiments that replicated and extended Etcoff and Magee's findings with photographic-quality stimuli. Experiments 1 and 2 measured identification of the individual stimuli falling along particular expression continua (e.g. from happiness to sadness) and discrimination of these stimuli with an ABX task in which stimuli A, B, and X were presented sequentially; subjects had to decide whether X was the same as A or B. Our identification data showed that each expression continuum was perceived as two distinct sections separated by a category boundary. From these identification data we were able to predict subjects' performance in the ABX discrimination task and to demonstrate better discrimination of cross-boundary than within-category pairs; that is, two faces identified as different expressions (e.g. happy and sad) were easier to discriminate than two faces of equal...
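The step from identification data to predicted ABX performance can be illustrated with a deliberately simplified formula: accuracy is chance plus half the difference in identification probabilities. This is not the authors' exact model, only a sketch of why cross-boundary pairs (large identification difference) are predicted to be easier than within-category pairs:

```python
def predicted_abx(p_a, p_b):
    """Predicted ABX proportion correct from identification probabilities.

    p_a, p_b -- probability of labelling stimulus A (resp. B) as, say,
    'happy'. Chance is 0.5; accuracy rises with the identification gap.
    (A simplification of categorical-perception models, not the paper's.)
    """
    return 0.5 + 0.5 * abs(p_a - p_b)
```

A cross-boundary pair identified as 90% vs. 10% 'happy' predicts 0.9 correct, while a within-category pair at 90% vs. 80% predicts only 0.55, near chance.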

390 citations


Journal ArticleDOI
TL;DR: In this paper, situational rather than facial information was predicted to determine the judged emotion, and this prediction was supported in each of the 22 cases examined (e.g., a person in a frightening situation but displaying a reported "facial expression of anger" was judged as afraid).
Abstract: Certain facial expressions have been theorized to be easily recognizable signals of specific emotions. If so, these expressions should override situationally based expectations used by a person in attributing an emotion to another. An alternative account is offered in which the face provides information relevant to emotion but does not signal a specific emotion. Therefore, in specified circumstances, situational rather than facial information was predicted to determine the judged emotion. This prediction was supported in 3 studies--indeed, in each of the 22 cases examined (e.g., a person in a frightening situation but displaying a reported "facial expression of anger" was judged as afraid). Situational information was especially influential when it suggested a nonbasic emotion (e.g., a person in a painful situation but displaying a "facial expression of fear" was judged as in pain).

Journal ArticleDOI
TL;DR: A disjunction among emotional response domains for schizophrenic patients is suggested; alternative explanations for the findings are considered as well as suggestions for future research.
Abstract: Recent research has found a discrepancy between schizophrenic patients' outward expression of emotion and their reported emotional experience. In this study, which attempts to replicate and extend the findings of previous studies, participants with and without schizophrenia viewed emotional film clips while their facial expressions were videotaped and skin conductance was recorded. Participants also reported their subjective experience of emotion following each film. Those with schizophrenia were less facially expressive than controls during the emotional films and reported experiencing as much positive and negative emotion, replicating previous findings. Additionally, schizophrenic patients exhibited greater skin conductance reactivity to all films than controls. These findings suggest a disjunction among emotional response domains for schizophrenic patients; alternative explanations for the findings are considered as well as suggestions for future research.

Journal ArticleDOI
TL;DR: For example, this article found that the superior conditioning to angry faces is stronger for male than for female faces, for adult than for child faces, and for faces directed toward the receiver rather than directed away.
Abstract: The complex musculature of the human face has been shaped by natural selection to produce gestures that communicate information about intentions and emotional states between senders and receivers. According to the preparedness hypothesis, different facial gestures are differentially prepared by evolution to become associated with different outcomes. As attested by psychophysiological responses in Pavlovian conditioning experiments, expressions of anger and fear more easily become signals for aversive stimuli than do expressions of happiness. Consistent with the evolutionary perspective, the superior conditioning to angry faces is stronger for male than for female faces, for adult than for child faces, and for faces directed toward the receiver rather than directed away. Furthermore, it appears to be primarily located in the right cerebral hemisphere. The enhanced autonomic activity to angry faces signaling electric shock is not mediated by conscious cognitive activity, but is evident also when recognition of the facial stimulus is blocked by backward masking procedures. Similarly, conditioned responses can be established to masked angry, but not to masked happy faces. Electromyographic measurement of facial muscle activity reveals a tendency for emotional facial expression to rapidly and automatically elicit its mirror image in the face of the receiver, typically accompanied by the appropriate emotional experience. The research reviewed in this paper supports the proposition that humans have been evolutionarily tuned to respond automatically to facial stimuli, and it is suggested that such early automatic reactions shape the subsequent conscious emotional processing of the stimulus.

Journal ArticleDOI
TL;DR: It was found that fearful vocal emotional signals, when presented without facial signals, were sufficient to elicit appropriate behavior regulation and sex was a factor in the few effects that were found for infants' responses to facial emotional signals.
Abstract: The independent effects of facial and vocal emotional signals and of positive and negative signals on infant behavior were investigated in a novel toy social referencing paradigm. 90 12-month-old infants and their mothers were assigned to an expression condition (neutral, happy, or fear) nested within a modality condition (face-only or voice-only). Each infant participated in 3 trials: a baseline trial, an expression trial, and a final positive trial. We found that fearful vocal emotional signals, when presented without facial signals, were sufficient to elicit appropriate behavior regulation. Infants in the fear-voice condition looked at their mothers longer, showed less toy proximity, and tended to show more negative affect than infants in the neutral-voice condition. Happy vocal signals did not elicit differential responding. The infants' sex was a factor in the few effects that were found for infants' responses to facial emotional signals.

Journal ArticleDOI
TL;DR: The results indicate that subjects evaluate and integrate information from both modalities to perceive emotion, and the fuzzy logical model of perception (FLMP) fit the judgments significantly better than an additive model, which weakens theories based on an additive combination of modalities, categorical perception, and influence from only a single modality.
Abstract: This experiment examines how emotion is perceived by using facial and vocal cues of a speaker. Three levels of facial affect were presented using a computer-generated face. Three levels of vocal affect were obtained by recording the voice of a male amateur actor who spoke a semantically neutral word in different simulated emotional states. These two independent variables were presented to subjects in all possible permutations-visual cues alone, vocal cues alone, and visual and vocal cues together-which gave a total set of 15 stimuli. The subjects were asked to judge the emotion of the stimuli in a two-alternative forced choice task (either HAPPY or ANGRY). The results indicate that subjects evaluate and integrate information from both modalities to perceive emotion. The influence of one modality was greater to the extent that the other was ambiguous (neutral). The fuzzy logical model of perception (FLMP) fit the judgments significantly better than an additive model, which weakens theories based on an additive combination of modalities, categorical perception, and influence from only a single modality.
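The FLMP has a simple closed form: each modality contributes an independent degree of support for a response, and the supports are multiplied and renormalized over the alternatives. A two-alternative (HAPPY vs. ANGRY) sketch, with illustrative support values:

```python
def flmp_happy(a_happy, v_happy):
    """Fuzzy logical model of perception, two-alternative case.

    a_happy, v_happy -- auditory and visual support for HAPPY in [0, 1]
    (support for ANGRY is the complement). Returns P(HAPPY).
    """
    num = a_happy * v_happy
    return num / (num + (1.0 - a_happy) * (1.0 - v_happy))
```

With a neutral auditory cue (support 0.5) the judgment follows the visual cue exactly, mirroring the finding that one modality's influence grows as the other becomes ambiguous; congruent cues reinforce each other beyond what an additive (averaging) model predicts.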

Journal ArticleDOI
TL;DR: Only when the infants saw the happy and fearful faces did the components differ for the two expressions, and these results are discussed in the context of the neurobiological processes involved in preceiving facial expressions.
Abstract: An extensive literature documents the infant's ability to recognize and discriminate a variety of facial expressions of emotion. However, little is known about the neural bases of this ability. To examine the neural processes that may underlie infants' responses to facial expressions, we recorded event-related potentials (ERPs) while 7-month-olds watched pictures of a happy face and a fearful face (Experiment 1) or an angry face and a fearful face (Experiment 2). In both experiments an early positive component, a middle-latency negative component and a later positive component were elicited. However, only when the infants saw the happy and fearful faces did the components differ for the two expressions. These results are discussed in the context of the neurobiological processes involved in perceiving facial expressions.

Journal ArticleDOI
TL;DR: In this article, the authors evaluate four facial feedback hypotheses, each of which proposes a certain relation between the face and emotions, and conclude that facial action is not necessary for emotions.
Abstract: This review evaluates four facial feedback hypotheses, each proposing a certain relation between the face and emotions. It addresses criticisms of the data, considers implications for emotional and social processes, and advises directions for future research. The current data support the following: Facial actions are sensitive to social context, yet correspond to the affective dimension of emotions; matches with specific emotions are unlikely. They modulate ongoing emotions, and initiate them. These two claims have received substantially improved support, in part due to studies controlling for effects of experimental demand and task difficulty. Facial action may influence the occurrence of specific emotions, not simply their valence and intensity. Facial action is not necessary for emotions. There are multiple and nonmutually exclusive plausible mechanisms for facial effects on emotions. Future work must focus on determining the relative contributions of these mechanisms, and the parameters of their effects on emotions.

Proceedings ArticleDOI
11 Nov 1996
TL;DR: Experimental results obtained demonstrate that personified interfaces help users engage in a task, and are well suited for an entertainment domain, and that there is a dichotomy between user groups which have opposite opinions about personification.
Abstract: It is still an open question whether software agents should be personified in the interface. In order to study the effects of faces and facial expressions in the interface a series of experiments was conducted to compare subjects' responses to and evaluation of different faces and facial expressions. The experimental results obtained demonstrate that: (1) personified interfaces help users engage in a task, and are well suited for an entertainment domain; (2) people's impressions of a face in a task are different from those of the face in isolation. Perceived intelligence of a face is determined not by the agent's appearance but by its competence; (3) there is a dichotomy between user groups which have opposite opinions about personification. Thus, agent-based interfaces should be flexible to support the diversity of users' preferences and the nature of tasks.

Journal ArticleDOI
TL;DR: Results showed that medicated patients performed more poorly than controls overall; however, they performed no worse on facial emotion perception tasks than on a matched control task, supporting Kerr and Neale's conclusion that schizophrenic patients do not have a differential deficit in facial emotion perception ability.
Abstract: Previous studies showing that schizophrenic patients have a deficit in the ability to perceive facial expressions of emotion in others often have not used a differential deficit design and standardized measures of emotion perception. Using standardized and cross-validated measures in a differential deficit design, S. L. Kerr and J. M. Neale (1993) found no evidence for a deficit specific to emotion perception among unmedicated schizophrenic patients. The present study replicated and extended the findings of Kerr and Neale in a sample of medicated schizophrenic patients. Results showed that medicated patients performed more poorly than controls overall; however, they performed no worse on facial emotion perception tasks than on a matched control task. These findings support Kerr and Neale's conclusion that schizophrenic patients do not have a differential deficit in facial emotion perception ability. Future research should examine the nature of schizophrenic patients' generalized poor performance on tests of facial emotion perception.

Journal ArticleDOI
TL;DR: The representation of emotional faces over a delay period, compared to either the nonemotional or the fixation condition, was associated with significant activation in the left ventral prefrontal cortex, the left anterior cingulate cortex, and the right fusiform gyrus.

Journal ArticleDOI
TL;DR: This minor surgical procedure can temporarily reduce the lines on the upper face and produce a pleasing effect and with proper dosing and dilution this rejuvenation program becomes cost effective.
Abstract: Background: Botulinum toxin has been used for facial hemispasm, strabismus, and blepharospasm. Recently it has been advocated to treat frown lines. We have extended this program to the treatment of other muscles of facial expression. Methods: Botulinum toxin is injected into the muscles of facial expression...

Journal ArticleDOI
TL;DR: It is suggested that D.R. was poor at recognising emotional facial expressions, both in static and moving stimuli, and her problems in processing facial expressions included impaired knowledge of the patterning of facial features in each emotion.

Journal ArticleDOI
TL;DR: In this paper, the authors showed that isolated kinematic properties of visible speech can provide information for lip reading, and that these images can influence auditory speech independently of the participant's knowledge of the stimuli.
Abstract: Isolated kinematic properties of visible speech can provide information for lip reading. Kinematic facial information is isolated by darkening an actor's face and attaching dots to various articulators so that only moving dots can be seen with no facial features present. To test the salience of these images, the authors conducted experiments to determine whether the images could visually influence the perception of discrepant auditory syllables. Results showed that these images can influence auditory speech independently of the participant's knowledge of the stimuli. In other experiments, single frozen frames of visible syllables were presented with discrepant auditory syllables to test the salience of static facial features. Although the influence of the kinematic stimuli was perceptual, any influence of the static featural stimuli was likely based on participant's misunderstanding or postperceptual response bias.

Journal ArticleDOI
TL;DR: It is suggested that the low degree of facial asymmetry found in normal people does not affect attractiveness ratings (except for old age), probably because observers are not tuned to perceive it.
Abstract: This study examined the role of facial symmetry in the judgment of physical attractiveness. Four experiments investigated people's preference for either somewhat asymmetrical portraits or their symmetrical chimeric composites when presented simultaneously. Experiment 1 found a higher selection rate for symmetrical faces with neutral expression for portraits of old people, and Experiment 2 indicated this may be because symmetry serves as cue for youth in old age. In Experiments 3 and 4 participants examined portraits with emotional expressions. Experiment 3 found a higher selection rate for asymmetrical faces, and Experiment 4 indicated this may be because observers perceived them as more genuine and natural. This study suggests that the low degree of facial asymmetry found in normal people does not affect attractiveness ratings (except for old age), probably because observers are not tuned to perceive it.

Proceedings ArticleDOI
03 Oct 1996
TL;DR: It is suggested that eyebrow movements and fundamental frequency changes are not automatically linked (i.e., they are not the result of muscular synergy), but are more a consequence of linguistic and communicational choices.
Abstract: Speech production is always accompanied by facial and gestural activity. The study is part of a broader research project on how head movements and facial expressions are related to voice variations in different speech situations. Ten normal subjects were recorded while reading aloud, answering yes/no questions, and dialoguing with an interviewer. Rapid rising-falling eyebrow movements produced by the subjects as they spoke were associated with F0 rises in only 71% of the cases. This suggests that eyebrow movements and fundamental frequency changes are not automatically linked (i.e., they are not the result of muscular synergy), but are more a consequence of linguistic and communicational choices. Note also that 38% of the eyebrow movements were produced while the subject was not speaking. Thus, eyebrow movements may also serve as back-channel signals or play a role in turn-taking during conversation.


Journal ArticleDOI
TL;DR: Participants' nonverbal expression of facial affect when learning about the target person reflected the overall tone of their significant-other representation under the condition of significant- other resemblance, providing strong support for schema-triggered affect in transference, through the use of this unobtrusive, nonverbal measure.
Abstract: Recent research has demonstrated transference in social perception, defined in terms of memory and schema-triggered evaluation in relation to a new person (S. M. Andersen & A. B. Baum, 1994; S. M. Andersen & S. W. Cole, 1990; S. M. Andersen, N. S. Glassman, S. Chen, & S. W. Cole, 1995). The authors examined schema-triggered facial affect in transference, along with motivations and expectancies. In a nomothetic experimental design, participants encountered stimulus descriptors of a new target person that were derived either from their own idiographic descriptions of a positively toned or a negatively toned significant other or from a yoked control participant's descriptors. Equal numbers of positive and negative target descriptors were presented, regardless of the overall tone of the representation. The results verified the memory effect and schema-triggered evaluation in transference, on the basis of significant-other resemblance in the target person. Of importance, participants' nonverbal expression of facial affect when learning about the target person (i.e., at encoding) reflected the overall tone of their significant-other representation under the condition of significant-other resemblance, providing strong support for schema-triggered affect in transference, through the use of this unobtrusive, nonverbal measure. Parallel effects on interpersonal closeness motivation and expectancies for acceptance/rejection in transference also emerged.

Journal ArticleDOI
TL;DR: Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.
Abstract: Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2–5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group X Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion ratings, but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact. (JINS, 1996, 2, 383–391.)

Proceedings ArticleDOI
25 Aug 1996
TL;DR: This paper presents a robust approach for the extraction of facial regions and features out of color images based on color and shape information; results are shown for two example scenes.
Abstract: There are many applications for systems coping with the problem of face localization and recognition, e.g. model-based video coding, security systems and mug shot matching. Due to variations in illumination, background, visual angle and facial expressions, the problem of machine face recognition is complex. In this paper we present a robust approach for the extraction of facial regions and features out of color images. First, face candidates are located based on the color and shape information. Then the topographic grey-level relief of facial regions is evaluated to determine the position of facial features such as eyes and mouth. Results are shown for two example scenes.
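The two-stage idea (color first, then shape) can be sketched as a skin-chromaticity threshold followed by a bounding-box aspect-ratio check on the resulting mask. All thresholds and aspect limits below are illustrative assumptions, not the paper's values, and the grey-level relief step for locating eyes and mouth is omitted:

```python
import numpy as np

def skin_mask(rgb):
    """Crude skin classifier in normalized r-g chromaticity space.
    Threshold ranges are assumed for illustration."""
    f = rgb.astype(float)
    s = f.sum(axis=-1) + 1e-6            # avoid division by zero
    r, g = f[..., 0] / s, f[..., 1] / s
    return (0.35 < r) & (r < 0.55) & (0.25 < g) & (g < 0.37)

def plausible_face_box(mask, min_aspect=0.6, max_aspect=1.6):
    """Shape test: the mask's bounding box should be roughly face-shaped
    (width/height within assumed limits)."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return False
    ri, ci = rows.nonzero()[0], cols.nonzero()[0]
    h = ri[-1] - ri[0] + 1
    w = ci[-1] - ci[0] + 1
    return min_aspect <= w / h <= max_aspect
```

A region passing both tests would become a face candidate for the subsequent grey-level feature analysis.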