Author

Rainer Banse

Bio: Rainer Banse is an academic researcher from the University of Bonn. The author has contributed to research on topics including the Implicit Association Test and implicit attitudes. The author has an h-index of 39, has co-authored 94 publications, and has received 7,335 citations. Previous affiliations of Rainer Banse include Humboldt University of Berlin and the University of Mainz.


Papers
Journal ArticleDOI
TL;DR: The decoding results replicate earlier findings on judges' ability to infer vocally expressed emotions with much-better-than-chance accuracy, including consistently found differences in the recognizability of different emotions.
Abstract: Professional actors' portrayals of 14 emotions varying in intensity and valence were presented to judges. The results on decoding replicate earlier findings on the ability of judges to infer vocally expressed emotions with much-better-than-chance accuracy, including consistently found differences in the recognizability of different emotions. A total of 224 portrayals were subjected to digital acoustic analysis to obtain profiles of vocal parameters for different emotions. The data suggest that vocal parameters not only index the degree of intensity typical for different emotions but also differentiate valence or quality aspects. The data are also used to test theoretical predictions on vocal patterning based on the component process model of emotion (K.R. Scherer, 1986). Although most hypotheses are supported, some need to be revised on the basis of the empirical evidence. Discriminant analysis and jackknifing show remarkably high hit rates and patterns of confusion that closely mirror those found for listener-judges.

1,862 citations
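The discriminant analysis and jackknifing reported above boil down to fitting a linear discriminant classifier to acoustic profiles and estimating hit rates with leave-one-out cross-validation. A minimal sketch of that kind of analysis, using made-up data and illustrative feature names rather than the study's actual parameter set:

```python
# Sketch: classify emotion portrayals from acoustic parameters with a linear
# discriminant analysis, estimating hit rates by leave-one-out (jackknife-style)
# cross-validation. Feature names and the random data are placeholders only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)
n_portrayals, n_features = 224, 6      # e.g. F0 mean, F0 range, intensity, speech rate, ...
X = rng.normal(size=(n_portrayals, n_features))
emotions = ["anger", "sadness", "fear", "joy", "disgust"]
y = rng.choice(emotions, size=n_portrayals)

lda = LinearDiscriminantAnalysis()
y_hat = cross_val_predict(lda, X, y, cv=LeaveOneOut())

print("jackknifed hit rate:", accuracy_score(y, y_hat))
print(confusion_matrix(y, y_hat, labels=emotions))   # confusion pattern across emotions
```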

Journal ArticleDOI
TL;DR: In this paper, the authors report the results from a study conducted in nine countries in Europe, the United States, and Asia on vocal emotion portrayals of anger, sadness, fear, joy, and neutral voice as produced by professional German actors.
Abstract: Whereas the perception of emotion from facial expression has been extensively studied cross-culturally, little is known about judges’ ability to infer emotion from vocal cues. This article reports the results from a study conducted in nine countries in Europe, the United States, and Asia on vocal emotion portrayals of anger, sadness, fear, joy, and neutral voice as produced by professional German actors. Data show an overall accuracy of 66% across all emotions and countries. Although accuracy was substantially better than chance, there were sizable differences ranging from 74% in Germany to 52% in Indonesia. However, patterns of confusion were very similar across all countries. These data suggest the existence of similar inference rules from vocal expression across cultures. Generally, accuracy decreased with increasing language dissimilarity from German in spite of the use of language-free speech samples. It is concluded that culture- and language-specific paralinguistic patterns may influence the decoding process.

663 citations
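The headline results here are per-country recognition accuracies and confusion patterns; a hedged sketch of that tabulation, assuming a simple long-format layout of judgments (the column names and toy rows are illustrative, not the study's data):

```python
# Sketch: per-country recognition accuracy and a row-normalized confusion matrix
# for vocal emotion judgments. The long-format layout and toy rows are assumptions.
import pandas as pd

judgments = pd.DataFrame({
    "country":  ["Germany", "Germany", "Indonesia", "Indonesia"],
    "intended": ["anger", "fear", "anger", "joy"],
    "judged":   ["anger", "fear", "sadness", "joy"],
})

accuracy_by_country = (
    judgments.assign(correct=judgments["intended"] == judgments["judged"])
             .groupby("country")["correct"]
             .mean()
)
confusion = pd.crosstab(judgments["intended"], judgments["judged"], normalize="index")

print(accuracy_by_country)   # the study reports roughly .74 for Germany, .52 for Indonesia
print(confusion)             # proportion of each intended emotion judged as each category
```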

Journal ArticleDOI
TL;DR: Using the trait of shyness as an example, the authors showed that it is possible to reliably assess individual differences in the implicitly measured self-concept of personality, that these differences are not accessible through traditional explicit self-ratings, and that they significantly increase the prediction of spontaneous behavior in realistic social situations.
Abstract: Using the trait of shyness as an example, the authors showed that (a) it is possible to reliably assess individual differences in the implicitly measured self-concept of personality that (b) are not accessible through traditional explicit self-ratings and (c) increase significantly the prediction of spontaneous behavior in realistic social situations. A total of 139 participants were observed in a shyness-inducing laboratory situation, and they completed an Implicit Association Test (IAT) and explicit self-ratings of shyness. The IAT correlated moderately with the explicit self-ratings and uniquely predicted spontaneous (but not controlled) shy behavior, whereas the explicit ratings uniquely predicted controlled (but not spontaneous) shy behavior (double dissociation). The distinction between spontaneous and controlled behavior was validated in a 2nd study.

645 citations
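The double dissociation amounts to entering both measures as simultaneous predictors and checking that the IAT uniquely predicts spontaneous behavior while explicit ratings uniquely predict controlled behavior. A minimal regression sketch with simulated data (the variable names and effect sizes are assumptions, not the study's):

```python
# Sketch: double dissociation via two regressions with both predictors entered
# simultaneously. Simulated data; effect sizes and variable names are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 139
iat = rng.normal(size=n)                            # implicit shyness score
explicit = 0.4 * iat + rng.normal(size=n)           # moderately correlated explicit rating
spontaneous = 0.5 * iat + rng.normal(size=n)        # behavior driven by the implicit measure
controlled = 0.5 * explicit + rng.normal(size=n)    # behavior driven by the explicit measure

X = sm.add_constant(np.column_stack([iat, explicit]))
for label, y in [("spontaneous", spontaneous), ("controlled", controlled)]:
    fit = sm.OLS(y, X).fit()
    # unique (partial) contributions: the IAT should matter for spontaneous behavior,
    # the explicit rating for controlled behavior
    print(label, "coefficients (const, IAT, explicit):", fit.params.round(2))
```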

Journal ArticleDOI
TL;DR: Two experiments were conducted to investigate the psychometric properties of an Implicit Association Test that was adapted to measure implicit attitudes towards homosexuality, and it was shown that uninformed participants were able to fake positive explicit but not implicit attitudes.
Abstract: Two experiments were conducted to investigate the psychometric properties of an Implicit Association Test (IAT; Greenwald, McGhee, & Schwartz, 1998) that was adapted to measure implicit attitudes towards homosexuality. In a first experiment, the validity of the Homosexuality-IAT was tested using a known-group approach. Implicit and explicit attitudes were assessed in heterosexual and homosexual men and women (N = 101). The results provided compelling evidence for the convergent and discriminant validity of the Homosexuality-IAT as a measure of implicit attitudes. No evidence was found for two alternative explanations of IAT effects (familiarity with stimulus material and stereotype knowledge). The internal consistency of IAT scores was satisfactory (alphas > .80), but retest correlations were lower. In a second experiment (N = 79), it was shown that uninformed participants were able to fake positive explicit but not implicit attitudes. Discrepancies between implicit and explicit attitudes towards homosexuality could be partially accounted for by individual differences in the motivation to control prejudiced behavior, thus providing independent evidence for the validity of the implicit attitude measure. Neither explicit nor implicit attitudes could be changed by persuasive messages. The results of both experiments are interpreted as evidence for a single construct account of implicit and explicit attitudes towards homosexuality.

458 citations
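The reported internal consistency (alphas > .80) is presumably Cronbach's alpha computed over sub-scores of the IAT; a compact sketch of that statistic, assuming a simple split of trials into four scored parcels (the parceling and the simulated data are illustrative only):

```python
# Sketch: Cronbach's alpha for an IAT split into k scored parcels (columns).
# The 4-parcel layout and the simulated data are assumptions for illustration only.
import numpy as np

def cronbach_alpha(parts: np.ndarray) -> float:
    """parts: respondents x parcels matrix of sub-scores."""
    k = parts.shape[1]
    item_variances = parts.var(axis=0, ddof=1).sum()
    total_variance = parts.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(2)
latent = rng.normal(size=(101, 1))             # shared "true" implicit attitude
parcels = latent + rng.normal(size=(101, 4))   # four noisy parcels per respondent
print(round(cronbach_alpha(parcels), 2))       # expected near .80 at this signal-to-noise ratio
```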

Journal ArticleDOI
TL;DR: In this paper, the correspondence between theoretical predictions about vocal expression patterns in naturally occurring emotions and empirical data on the acoustic characteristics of actors' portrayals was examined; the acoustic parameters extracted from the speech signal show a number of significant differences between emotions, generally confirming the theoretical predictions.
Abstract: This research examines the correspondence between theoretical predictions on vocal expression patterns in naturally occurring emotions (as based on the component process theory of emotion; Scherer, 1986) and empirical data on the acoustic characteristics of actors' portrayals. Two male and two female professional radio actors portrayed anger, sadness, joy, fear, and disgust based on realistic scenarios of emotion-eliciting events. A series of judgment studies was conducted to assess the degree to which judges are able to recognize the intended emotion expressions. Disgust was relatively poorly recognized; average recognition accuracy for the other emotions attained 62.8% across studies. A set of portrayals reaching a satisfactory level of recognition accuracy underwent digital acoustic analysis. The results for the acoustic parameters extracted from the speech signal show a number of significant differences between emotions, generally confirming the theoretical predictions.

383 citations


Cited by

Journal ArticleDOI
TL;DR: The best-performing measure incorporates data from the IAT's practice trials, uses a metric that is calibrated by each respondent's latency variability, and includes a latency penalty for errors, and strongly outperforms the earlier (conventional) procedure.
Abstract: In reporting Implicit Association Test (IAT) results, researchers have most often used scoring conventions described in the first publication of the IAT (A.G. Greenwald, D.E. McGhee, & J.L.K. Schwartz, 1998). Demonstration IATs available on the Internet have produced large data sets that were used in the current article to evaluate alternative scoring procedures. Candidate new algorithms were examined in terms of their (a) correlations with parallel self-report measures, (b) resistance to an artifact associated with speed of responding, (c) internal consistency, (d) sensitivity to known influences on IAT measures, and (e) resistance to known procedural influences. The best-performing measure incorporates data from the IAT's practice trials, uses a metric that is calibrated by each respondent's latency variability, and includes a latency penalty for errors. This new algorithm strongly outperforms the earlier (conventional) procedure.

5,049 citations
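The improved scoring procedure described here is commonly known as the D measure: latencies from the two combined tasks are compared, error trials receive a latency penalty, and the difference is scaled by the respondent's own latency variability. A simplified sketch under common conventions; the 600 ms penalty, trial filters, and practice/test averaging of the published algorithm are only approximated:

```python
# Sketch of a D-style IAT score: error trials receive a latency penalty, then the
# mean latency difference between the two combined blocks is scaled by the
# respondent's own pooled latency variability. The 600 ms penalty and the omission
# of trial filters / practice-test averaging are simplifications of the published algorithm.
import numpy as np

def d_score(lat_compatible, lat_incompatible, err_compatible, err_incompatible,
            penalty_ms=600.0):
    """lat_*: trial latencies in ms for the two combined blocks; err_*: error flags."""
    def penalized(lat, err):
        lat, err = np.asarray(lat, dtype=float), np.asarray(err, dtype=bool)
        out = lat.copy()
        out[err] = lat[~err].mean() + penalty_ms        # latency penalty for errors
        return out

    a = penalized(lat_compatible, err_compatible)
    b = penalized(lat_incompatible, err_incompatible)
    pooled_sd = np.concatenate([a, b]).std(ddof=1)      # calibrated to this respondent
    return (b.mean() - a.mean()) / pooled_sd

rng = np.random.default_rng(3)
lat_c = rng.normal(700, 150, 40)                # compatible combined block
lat_i = rng.normal(850, 150, 40)                # incompatible combined block
errors = lambda n: rng.random(n) < 0.05         # ~5% error trials
print(round(d_score(lat_c, lat_i, errors(40), errors(40)), 2))
```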

Journal ArticleDOI
James A. Russell
TL;DR: At the heart of emotion, mood, and any other emotionally charged event are states experienced as simply feeling good or bad, energized or enervated, which influence reflexes, perception, cognition, and behavior.
Abstract: At the heart of emotion, mood, and any other emotionally charged event are states experienced as simply feeling good or bad, energized or enervated. These states--called core affect--influence reflexes, perception, cognition, and behavior and are influenced by many causes internal and external, but people have no direct access to these causal connections. Core affect can therefore be experienced as free-floating (mood) or can be attributed to some cause (and thereby begin an emotional episode). These basic processes spawn a broad framework that includes perception of the core-affect-altering properties of stimuli, motives, empathy, emotional meta-experience, and affect versus emotion regulation; it accounts for prototypical emotional episodes, such as fear and anger, as core affect attributed to something plus various nonemotional processes.

4,585 citations

Journal ArticleDOI
TL;DR: A review of 122 research reports (184 independent samples, 14,900 subjects) found an average r = .274 for the prediction of behavioral, judgment, and physiological measures by Implicit Association Test (IAT) measures.
Abstract: This review of 122 research reports (184 independent samples, 14,900 subjects) found average r = .274 for prediction of behavioral, judgment, and physiological measures by Implicit Association Test (IAT) measures. Parallel explicit (i.e., self-report) measures, available in 156 of these samples (13,068 subjects), also predicted effectively (average r = .361), but with much greater variability of effect size. Predictive validity of self-report was impaired for socially sensitive topics, for which impression management may distort self-report responses. For 32 samples with criterion measures involving Black-White interracial behavior, predictive validity of IAT measures significantly exceeded that of self-report measures. Both IAT and self-report measures displayed incremental validity, with each measure predicting criterion variance beyond that predicted by the other. The more highly IAT and self-report measures were intercorrelated, the greater was the predictive validity of each.

2,690 citations
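Meta-analytic averages such as r = .274 are typically computed by transforming each sample's correlation to Fisher's z, averaging with sample-size weights, and back-transforming. A minimal sketch of that computation (the input correlations and sample sizes are made up):

```python
# Sketch: sample-size-weighted mean correlation via Fisher's z transform, the
# usual way a meta-analytic average r is computed. Inputs are made up.
import numpy as np

def mean_r(rs, ns):
    z = np.arctanh(np.asarray(rs, dtype=float))     # Fisher r-to-z
    w = np.asarray(ns, dtype=float) - 3.0           # inverse-variance weights (n - 3)
    return float(np.tanh((w * z).sum() / w.sum()))  # back-transform the weighted mean

rs = [0.15, 0.30, 0.42, 0.25]    # per-sample IAT-criterion correlations (illustrative)
ns = [80, 120, 60, 200]          # corresponding sample sizes (illustrative)
print(round(mean_r(rs, ns), 3))
```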

Journal ArticleDOI
TL;DR: In this paper, the authors discuss human emotion perception from a psychological perspective, examine available approaches to solving the problem of machine understanding of human affective behavior, and discuss important issues like the collection and availability of training and test data.
Abstract: Automated analysis of human affective behavior has attracted increasing attention from researchers in psychology, computer science, linguistics, neuroscience, and related disciplines. However, the existing methods typically handle only deliberately displayed and exaggerated expressions of prototypical emotions despite the fact that deliberate behaviour differs in visual appearance, audio profile, and timing from spontaneously occurring behaviour. To address this problem, efforts to develop algorithms that can process naturally occurring human affective behaviour have recently emerged. Moreover, an increasing number of efforts are reported toward multimodal fusion for human affect analysis including audiovisual fusion, linguistic and paralinguistic fusion, and multi-cue visual fusion based on facial expressions, head movements, and body gestures. This paper introduces and surveys these recent advances. We first discuss human emotion perception from a psychological perspective. Next we examine available approaches to solving the problem of machine understanding of human affective behavior, and discuss important issues like the collection and availability of training and test data. We finally outline some of the scientific and engineering challenges to advancing human affect sensing technology.

2,503 citations