DEAP: A Database for Emotion Analysis Using Physiological Signals
Summary
- Limitations, future research, and implications are discussed.
- A child who has a difficult temperament may not evidence problem behavior if they have parents who have outstanding parent management skills and are not impacted by family stressors.
- Causal risk factors can be changed and, when changed, they alter the risk of outcome.
- Parent management training has been found to improve the social functioning of children at risk for EBD (Patterson, 1982).
- Previous research has focused on clinically identified populations (e.g., Walrath et al., 2004).
- The authors' approved Institutional Review Board procedures did not require that they obtain child assent.
- The screening process for kindergarten and first-grade participants included the first and second gates of the Early Screening Project (ESP; Walker, Severson, & Feil, 1995) and Systematic Screening for Behavior Disorders (SSBD; Walker & Severson, 1990), respectively.
- The Parental Distress score also reflects the presence of parental depression.
- Staff participated in a total of 50 hr of supervised training and practice to administer the measures, as well as child outcome measures used in their study of a three-tiered behavior prevention model.
Citations
1,162 citations
Cites background from "DEAP: A Database for Emotion Analys..."
...An area of commerce that could obviously benefit from an automatic understanding of human emotional experience is the multimedia sector....
[...]
1,131 citations
Additional excerpts
...The DEAP dataset includes the EEG and peripheral physiological signals of 32 participants when watching 40 one-minute music videos....
[...]
...To the best of our knowledge, the popular publicly available emotional EEG datasets are MAHNOB HCI [3] and DEAP [45]....
[...]
969 citations
937 citations
Cites background or methods from "DEAP: A Database for Emotion Analys..."
...The Database for Emotion Analysis using Physiological Signals (DEAP) [26] consists of spontaneous reactions of 32 participants in response to one-minute-long music video clips....
[...]
...[26] DEAP - Gaussian naive Bayes classifier - EEG, physiological signals, and multimedia features - Binary classification of low/high arousal, valence, and liking - 0....
[...]
...DEAP [26] - 40 one-minute long videos shown to subjects - EEG signals recorded - 32 - Controlled - Spontaneous - Valence and arousal (continuous) - Self assessment...
[...]
...DEAP is a great database to study the relation of biological signals and dimensional affect; however, it has only a few subjects, and the videos are captured in lab-controlled settings....
[...]
777 citations
Cites background or methods from "DEAP: A Database for Emotion Analys..."
...The datasets that are used in more than one study within this review include an emotion recognition dataset ([56], various subsets used in ten studies), a mental workload dataset ([57], used in four studies), two motor imagery datasets from the same BCI competition ([58], one dataset used in six studies and the second dataset used in two studies), a seizure detection dataset ([33], used in three studies), and two sleep stage scoring datasets ([59, 60], which were used in groups of two studies and three studies, respectively)....
[...]
...The common emotion recognition dataset (DEAP) [56], the dataset analyzed by the highest number of studies, is a collection of EEG and peripheral signals from 32 subjects participating in a human affective state task....
[...]
...A comparison of architecture and input choices across studies using the publicly available DEAP [56] dataset....
[...]
References
12,519 citations
7,472 citations
"DEAP: A Database for Emotion Analys..." refers methods in this paper
...For self-assessment along these scales, we use the well-known self-assessment manikins (SAM) [10]....
[...]
...From the top: Valence SAM, arousal SAM, dominance SAM, and liking....
[...]
5,700 citations
Additional excerpts
...are related to valence [58]....
[...]
5,359 citations
Related Papers (5)
Frequently Asked Questions (11)
Q2. What are the contributions mentioned in the paper "Deap: a database for emotion analysis using physiological signals" ?
The authors present a multimodal data set for the analysis of human affective states. An extensive analysis of the participants' ratings during the experiment is presented.
Q3. What test was used to test for significance?
To test for significance, an independent one-sample t-test was performed, comparing the F1-distribution over participants to the 0.5 baseline.
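The test described above can be sketched in plain Python. This is a minimal illustration, not the paper's analysis code: the per-participant F1 scores below are invented for the example, and the t statistic is computed directly from its definition using only the standard library.

```python
import math
from statistics import mean, stdev

# Illustrative per-participant F1 scores (NOT values from the paper).
f1_scores = [0.62, 0.55, 0.58, 0.61, 0.49, 0.66, 0.57, 0.60]
baseline = 0.5  # chance-level F1 baseline

# One-sample t statistic: (mean - baseline) / (s / sqrt(n))
n = len(f1_scores)
t_stat = (mean(f1_scores) - baseline) / (stdev(f1_scores) / math.sqrt(n))
print(f"t = {t_stat:.2f} with {n - 1} degrees of freedom")
```

A large positive t here indicates the F1 distribution lies significantly above the 0.5 baseline; in practice one would look up the p-value for n - 1 degrees of freedom.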
Q4. What other features have been shown to be correlated with valence?
There are other content features such as color variance and key lighting that have been shown to be correlated with valence [30].
Q5. What are the four quadrants of the valence-arousal space?
The valence-arousal space can be subdivided into 4 quadrants, namely low arousal/low valence (LALV), low arousal/high valence (LAHV), high arousal/low valence (HALV) and high arousal/high valence (HAHV).
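The quadrant assignment described above can be expressed as a small helper. This is a sketch only: the `midpoint=5.0` threshold is an assumption for a 1-9 rating scale, and the paper's exact thresholding may differ.

```python
def quadrant(valence, arousal, midpoint=5.0):
    """Map a (valence, arousal) rating pair to LALV/LAHV/HALV/HAHV.

    The midpoint split is an illustrative assumption, not the
    paper's exact procedure.
    """
    v = "HV" if valence >= midpoint else "LV"
    a = "HA" if arousal >= midpoint else "LA"
    return a + v

print(quadrant(7.2, 3.1))  # low arousal, high valence -> "LAHV"
print(quadrant(6.0, 8.5))  # high arousal, high valence -> "HAHV"
```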
Q6. How many videos were selected via Last.fm affective tags?
Of the 40 selected videos, 17 were selected via Last.fm affective tags, indicating that useful stimuli can be selected via this method.
Q7. What are the common types of emotional information used for emotion assessment?
Physiological signals are also known to include emotional information that can be used for emotion assessment but they have received less attention.
Q8. What are the two widely available databases for emotion assessment?
To the best of their knowledge, the only publicly available multimodal emotional databases that include both physiological responses and facial expressions are the eNTERFACE 2005 emotional database and MAHNOB HCI [4], [5].
Q9. What was the emotional highlight score of the i-th segment ei?
The emotional highlight score of the i-th segment, $e_i$, was computed using the following equation: $e_i = \sqrt{a_i^2 + v_i^2}$ (1). The arousal, $a_i$, and valence, $v_i$, were centered.
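Equation (1) above can be illustrated with a short sketch: center each rating sequence by subtracting its mean, then take the Euclidean norm of the centered (arousal, valence) pair. The rating values are invented for the example.

```python
import math

def highlight_scores(arousal, valence):
    """e_i = sqrt(a_i^2 + v_i^2) with a_i, v_i mean-centered (Eq. 1)."""
    a_mean = sum(arousal) / len(arousal)
    v_mean = sum(valence) / len(valence)
    return [math.sqrt((a - a_mean) ** 2 + (v - v_mean) ** 2)
            for a, v in zip(arousal, valence)]

# Illustrative ratings for three segments (not from the paper).
scores = highlight_scores([3.0, 7.0, 5.0], [4.0, 8.0, 6.0])
```

A segment whose ratings sit at the mean gets a score of 0, while segments far from the mean in either dimension score highest, which is what makes $e_i$ usable as a "highlight" measure.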
Q10. How did the participants rate their familiarity with the songs?
After the experiment, participants were asked to rate their familiarity with each of the songs on a scale of 1 ("Never heard it before the experiment") to 5 ("Knew the song very well").
Q11. What was the arousal and valence level of each video?
The participants rated arousal and valence levels and the EEG and physiological signals for each video were classified into low/high arousal/valence classes.