Proceedings ArticleDOI

A survey: Software API and database for emotion recognition

TL;DR: The various attributes, methods, and emotional labels considered by emotion API systems are discussed, and an overview is given of the databases available for inferring emotion from human facial features.
Abstract: Emotions are fundamental to human lives and decision-making. The exchange of emotional feeling between people forms an intricate web. Systems have been developed that attempt to recognize aspects of emotion-related behavior and to respond to them, for example systems designed to improve the user experience or to change user behavior. Emotion recognition from facial images has recently proved to be an interesting topic for researchers. For two decades, a great deal of research has been devoted to enhancing Human Computer Interaction (HCI). There are many applications and API-accessible software tools online that parallel the human ability to discern emotional behavior. The visual detection market is expanding enormously, as can be seen from the various systems developed and described in this paper. This paper discusses the various attributes, methods, and emotional labels considered by emotion API systems, and also gives an overview of the databases available for inferring emotion from human facial features.
Citations
Journal ArticleDOI
24 Apr 2020-PLOS ONE
TL;DR: Testing eight out-of-the-box automatic classifiers for facial affect recognition revealed a recognition advantage for human observers over automatic classification, and the need for more spontaneous facial databases that can act as a benchmark in the training and testing of automatic emotion recognition systems.
Abstract: In the wake of rapid advances in automatic affect analysis, commercial automatic classifiers for facial affect recognition have attracted considerable attention in recent years. While several options now exist to analyze dynamic video data, less is known about the relative performance of these classifiers, in particular when facial expressions are spontaneous rather than posed. In the present work, we tested eight out-of-the-box automatic classifiers, and compared their emotion recognition performance to that of human observers. A total of 937 videos were sampled from two large databases that conveyed the basic six emotions (happiness, sadness, anger, fear, surprise, and disgust) either in posed (BU-4DFE) or spontaneous (UT-Dallas) form. Results revealed a recognition advantage for human observers over automatic classification. Among the eight classifiers, there was considerable variance in recognition accuracy ranging from 48% to 62%. Subsequent analyses per type of expression revealed that performance by the two best performing classifiers approximated those of human observers, suggesting high agreement for posed expressions. However, classification accuracy was consistently lower (although above chance level) for spontaneous affective behavior. The findings indicate potential shortcomings of existing out-of-the-box classifiers for measuring emotions, and highlight the need for more spontaneous facial databases that can act as a benchmark in the training and testing of automatic emotion recognition systems. We further discuss some limitations of analyzing facial expressions that have been recorded in controlled environments.

73 citations


Cites methods from "A survey: Software API and database..."

  • ...Given that automated methods for measuring facial expression patterns have now matured, 16 providers of commercially available classifiers have recently been identified [35, 36]....


Proceedings ArticleDOI
27 Jan 2019
TL;DR: This work uses several datasets containing facial expressions of children linked to their emotional state to evaluate eight commercial emotion classification systems, identifies limitations of automated recognition of emotions in children, and provides suggestions for enhancing recognition accuracy through data diversification, dataset accountability, and algorithmic regulation.
Abstract: In recent news, organizations have been considering the use of facial and emotion recognition for applications involving youth, such as surveillance and security in schools. However, the majority of research on facial emotion recognition has focused on adults. Children, particularly in their early years, have been shown to express emotions quite differently than adults. Thus, before such algorithms are deployed in environments that impact the well-being and circumstances of youth, their accuracy should be carefully examined with respect to this target demographic. In this work, we utilize several datasets that contain facial expressions of children linked to their emotional state to evaluate eight different commercial emotion classification systems. We compare the ground-truth labels provided by the respective datasets to the labels given with the highest confidence by the classification systems, and assess the results in terms of matching score (TPR), positive predictive value, and failure-to-compute rate. Overall, the emotion recognition systems displayed subpar performance on the datasets of children's expressions compared to prior work with adult datasets and initial human ratings. We then identify limitations of automated recognition of emotions in children and provide suggestions for enhancing recognition accuracy through data diversification, dataset accountability, and algorithmic regulation.
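The metrics named in this evaluation can be sketched in a few lines of Python. This is a minimal illustration, not the authors' actual evaluation code: `ground_truth` and `predictions` are assumed to be parallel lists of emotion labels, with `None` marking images the classifier failed to process.

```python
# Sketch of the three metrics described above: matching score (per-label TPR),
# positive predictive value (PPV), and failure-to-compute rate.

def evaluate(ground_truth, predictions):
    # Keep only the images the classifier managed to process.
    pairs = [(g, p) for g, p in zip(ground_truth, predictions) if p is not None]
    failure_rate = 1.0 - len(pairs) / len(ground_truth)

    tpr, ppv = {}, {}
    for label in sorted(set(ground_truth)):
        tp = sum(1 for g, p in pairs if g == label and p == label)
        fn = sum(1 for g, p in pairs if g == label and p != label)
        fp = sum(1 for g, p in pairs if g != label and p == label)
        tpr[label] = tp / (tp + fn) if tp + fn else 0.0  # matching score
        ppv[label] = tp / (tp + fp) if tp + fp else 0.0  # precision per label
    return tpr, ppv, failure_rate
```

Note that images the classifier fails on are excluded from TPR and PPV and reported separately via the failure rate, which matches how a "failure to compute" category is usually handled.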

21 citations


Cites background from "A survey: Software API and database..."

  • ...A non-exhaustive list of available emotion recognition systems, past and present, can be found in (Deshmukh and Jagtap 2017)....


Journal ArticleDOI
TL;DR: In this article, the authors investigated whether facial-expression recordings can be acquired using computer webcams when testing products at home, and showed that a protocol of facial expression measurements at home was feasible and provided conclusive results.

18 citations

Journal ArticleDOI
TL;DR: In this paper, the authors explore emotions in Instagram images marked with hashtags referring to body image-related components, using an artificial intelligence-based discrete emotional analysis of a total of 50 images.
Abstract: Our aim was to explore emotions in Instagram images marked with hashtags referring to body image–related components, using an artificial intelligence–based discrete emotional analysis. A total of 50 images were analyzed.

18 citations


Cites methods from "A survey: Software API and database..."

  • ...The algorithm uses facial detection and semantic analysis to interpret mood from photos and videos both static and real time (Deshmukh & Jagtap, 2017)....


Proceedings ArticleDOI
09 Mar 2021
TL;DR: In this article, the authors introduce a concept for an IoT semi-electronic display device that can detect the user's emotions to identify initial signs of depression in the elderly; such a screening process is expected to appear in future smart healthcare systems.
Abstract: We introduce a concept for an IoT semi-electronic display device which can detect the user's emotions to identify initial signs of depression in the elderly. Such a screening process is expected to appear in future smart healthcare systems. This device functions as a smart mirror which displays daily information such as the weather, events, headline news, currency rates, stocks, and reminders. Elderly users can view information on their mental health while dressing themselves in front of the mirror. Information on their daily emotions will be collected to monitor their health long-term, as a nurse or caretaker in their house would do. Moreover, to observe the elderly users' emotions, we incorporate additional systems such as a chat bot, facial recognition, voice/speech recognition, and posture recognition. The device also serves as a communication tool that facilitates telemedicine, helping elderly users at home feel more connected with their doctors. This helps doctors diagnose, follow up, and treat their patients long-term.

5 citations

References
Journal ArticleDOI
TL;DR: The present article presents the freely available Radboud Faces Database, a face database in which displayed expressions, gaze direction, and head orientation are parametrically varied in a complete factorial design, containing both Caucasian adult and children images.
Abstract: Many research fields concerned with the processing of information contained in human faces would benefit from face stimulus sets in which specific facial characteristics are systematically varied while other important picture characteristics are kept constant. Specifically, a face database in which displayed expressions, gaze direction, and head orientation are parametrically varied in a complete factorial design would be highly useful in many research domains. Furthermore, these stimuli should be standardised in several important, technical aspects. The present article presents the freely available Radboud Faces Database offering such a stimulus set, containing both Caucasian adult and children images. This face database is described both procedurally and in terms of content, and a validation study concerning its most important characteristics is presented. In the validation study, all frontal images were rated with respect to the shown facial expression, intensity of expression, clarity of expression, genuineness of expression, attractiveness, and valence. The results show very high recognition of the intended facial expressions.

2,041 citations


"A survey: Software API and database..." refers background in this paper

  • ...Radboud Faces Database (RaFD) [20] Neutral, Sadness, Contempt, Surprise, Happiness, Fear, Anger, And Disgust Posed Three different gaze directions and five camera angles (8*67*3*5=8040 images) 681*1024...


Proceedings ArticleDOI
07 Mar 2016
TL;DR: OpenFace is the first open source tool capable of facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation and allows for easy integration with other applications and devices through a lightweight messaging system.
Abstract: Over the past few years, there has been an increased interest in automatic facial behavior analysis and understanding. We present OpenFace — an open source tool intended for computer vision and machine learning researchers, affective computing community and people interested in building interactive applications based on facial behavior analysis. OpenFace is the first open source tool capable of facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation. The computer vision algorithms which represent the core of OpenFace demonstrate state-of-the-art results in all of the above mentioned tasks. Furthermore, our tool is capable of real-time performance and is able to run from a simple webcam without any specialist hardware. Finally, OpenFace allows for easy integration with other applications and devices through a lightweight messaging system.

1,151 citations


"A survey: Software API and database..." refers methods in this paper

  • ...OpenFace [12] facial landmark detection and tracking: head pose detection, eye gaze estimation Conditional Local Neural Fields (CLNF) Face rectangle 95%...


01 Jan 2007
TL;DR: This paper examines the two major perspectives on the origin of emotions, evolutionary psychology and social constructivism, and argues that the dichotomy between them cannot be maintained: every emotion we have a name for is the product of both nature and nurture.
Abstract: There are two major perspectives on the origin of emotions. According to one, emotions are the products of natural selection. They are evolved adaptations, best understood using the explanatory tools of evolutionary psychology. According to the other, emotions are socially constructed, and they vary across cultural boundaries. There is evidence supporting both perspectives. In light of this, some have argued that both approaches are right. The standard strategy for compromise is to say that some emotions are evolved and others are constructed. The evolved emotions are sometimes given the label “basic,” and there is considerable agreement about a handful of emotions in this category. My goal here is to challenge all of these perspectives. I don’t think we should adopt a globally evolutionary approach, nor indulge the radical view that emotions derive entirely from us. I am equally dissatisfied with approaches that attempt to please Darwinians and constructivists by dividing emotions into two separate classes. I will defend another kind of ecumenicalism. Every emotion that we have a name for is the product of both nature and nurture. Emotions are evolved and constructed. The dichotomy between the two approaches cannot be maintained. This thesis will require making some claims that would be regarded as surprising by many emotion researchers. First, while there is a difference between basic emotions and nonbasic emotions, it is not a structural difference. All emotions are fundamentally alike. Second, the standard list of basic emotions, though thought by many to be universal across cultures, are not basic after all. We don’t have names for the basic emotions. All emotions that we talk about are culturally informed. And finally, this concession to constructivism does not imply that emotions are cognitive in any sense. Emotions are perceptual and embodied. They are gut reactions, and they are not unique to our species.
To defend these heresies I will have to present a theory of what the emotions really are.

103 citations


Additional excerpts

  • ...There is evidence supporting both perspectives [1]....
