Long Short Term Memory Hyperparameter Optimization for a Neural Network Based Emotion Recognition Framework
TLDR
This paper proposes a new framework that automatically optimizes LSTM hyperparameters using differential evolution (DE), presents the first systematic study of hyperparameter optimization in the context of emotion classification, and evaluates and compares the proposed framework with other state-of-the-art hyperparameter optimization methods.
Abstract:
Recently, emotion recognition using low-cost wearable sensors based on electroencephalogram (EEG) and blood volume pulse (BVP) has received much attention. Long short-term memory (LSTM) networks, a special type of recurrent neural network, have been applied successfully to emotion classification. However, the performance of these sequence classifiers depends heavily on their hyperparameter values, and it is important to adopt an efficient method to ensure the optimal values. To address this problem, we propose a new framework to automatically optimize LSTM hyperparameters using differential evolution (DE). This is the first systematic study of hyperparameter optimization in the context of emotion classification. In this paper, we evaluate and compare the proposed framework with other state-of-the-art hyperparameter optimization methods (particle swarm optimization, simulated annealing, random search, and tree of Parzen estimators) using a new dataset collected from wearable sensors. Experimental results demonstrate that optimizing LSTM hyperparameters significantly improves the recognition rate of four-quadrant dimensional emotions, with a 14% increase in accuracy. The best model, based on the LSTM classifier optimized with the DE algorithm, achieved 77.68% accuracy. The results also showed that evolutionary computation algorithms, particularly DE, are competitive for ensuring optimized LSTM hyperparameter values. Although the DE algorithm is computationally expensive, it is less complex and offers higher diversity in finding optimal solutions.
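The DE-based optimization the abstract describes can be sketched as a standard DE/rand/1/bin loop over a hyperparameter search space. The bounds, hyperparameter choices, and the objective function below are illustrative assumptions, not the paper's actual setup; in the paper's setting, the objective would be the validation error of an LSTM emotion classifier trained with the candidate hyperparameters.

```python
import random

# Illustrative search space: learning rate and hidden-unit count.
BOUNDS = [(1e-4, 1e-1),   # learning rate
          (16, 256)]      # hidden units (continuous here; round at use)

def objective(x):
    # Hypothetical smooth stand-in for validation error,
    # minimized near lr = 0.01 and 128 hidden units.
    lr, units = x
    return (lr - 0.01) ** 2 * 1e4 + ((units - 128) / 128) ** 2

def clip(x):
    # Keep each coordinate inside its bounds.
    return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, BOUNDS)]

def differential_evolution(pop_size=20, F=0.8, CR=0.9, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(pop_size)]
    fit = [objective(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation: base vector plus scaled difference.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = clip([pop[a][d] + F * (pop[b][d] - pop[c][d])
                           for d in range(len(BOUNDS))])
            # Binomial crossover; j_rand guarantees one mutant coordinate.
            j_rand = rng.randrange(len(BOUNDS))
            trial = [mutant[d] if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(len(BOUNDS))]
            # Greedy selection: keep the trial only if it is no worse.
            f_trial = objective(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

Each candidate evaluation corresponds to one full training run of the LSTM, which is why the abstract notes that DE is computationally expensive despite its simplicity.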
Citations
Journal Article
Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review
TL;DR: The emotion recognition methods based on multi-channel EEG signals as well as multi-modal physiological signals are reviewed and the correlation between different brain areas and emotions is discussed.
Journal Article
Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges
Sen Qiu, Hongkai Zhao, Nan Jiang, Zhelong Wang, Long Liu, Yi An, Hongyu Zhao, Xin Miao, Ruichen Liu, Giancarlo Fortino, et al.
TL;DR: In this paper, a comprehensive survey of the most important aspects of multi-sensor applications for human activity recognition, including those recently added to the field for unsupervised learning and transfer learning, is presented.
Journal Article
Automatic Driver Stress Level Classification Using Multimodal Deep Learning
TL;DR: Proposes a multimodal fusion model based on convolutional neural networks and long short-term memory that fuses ECG, vehicle data, and contextual data, first learning each modality and then jointly learning the highly correlated representation across modalities with a single deep network.
Journal Article
Feature selection using regularized neighbourhood component analysis to enhance the classification performance of motor imagery signals.
Nitesh Singh Malan, Shiru Sharma, et al.
TL;DR: The present study proposes a feature selection algorithm based on neighbourhood component analysis (NCA) with a modified regularization parameter, which selects the fewest features and increases computational efficiency, making it a promising feature selection tool for an MI-based BCI system.
References
Journal Article
Long short-term memory
TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
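The gated design this TL;DR describes can be sketched as a single LSTM forward step. The scalar weights and the `lstm_step` helper below are illustrative, not a trained model: the point is that the cell state `c` is updated additively (the "constant error carousel"), which is what lets gradients survive long time lags.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    # w maps each gate to an (input weight, recurrent weight, bias) triple.
    i = sigmoid(w["i"][0] * x + w["i"][1] * h + w["i"][2])    # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h + w["f"][2])    # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h + w["g"][2])  # candidate value
    c_new = f * c + i * g          # additive cell-state update (the carousel)
    h_new = o * math.tanh(c_new)   # gated output
    return h_new, c_new
```

Because `c_new` depends on `c` through a plain multiply-and-add rather than a squashing nonlinearity, the error signal flowing through the cell state is not repeatedly attenuated at every step.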
Proceedings Article
Speech recognition with deep recurrent neural networks
TL;DR: This paper investigates deep recurrent neural networks, which combine the multiple levels of representation that have proved so effective in deep networks with the flexible use of long range context that empowers RNNs.
Journal Article
Random search for hyper-parameter optimization
James Bergstra, Yoshua Bengio
TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
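The idea in this TL;DR can be sketched in a few lines: sample each hyperparameter independently from its own distribution instead of walking a fixed grid. The hyperparameter names, ranges, and the toy objective below are illustrative assumptions; in practice `toy_validation_error` would be replaced by training and validating a model with the sampled configuration.

```python
import random

def sample_config(rng):
    # Independent draws per hyperparameter; log-uniform for the
    # learning rate so all orders of magnitude are covered evenly.
    return {
        "learning_rate": 10 ** rng.uniform(-4, -1),
        "hidden_units": rng.randint(16, 256),
        "dropout": rng.uniform(0.0, 0.5),
    }

def toy_validation_error(cfg):
    # Hypothetical stand-in for a train-and-validate run,
    # minimized near lr = 0.01, 128 units, dropout 0.2.
    return (abs(cfg["learning_rate"] - 0.01)
            + abs(cfg["hidden_units"] - 128) / 256
            + abs(cfg["dropout"] - 0.2))

def random_search(trials=200, seed=0):
    rng = random.Random(seed)
    best_cfg, best_err = None, float("inf")
    for _ in range(trials):
        cfg = sample_config(rng)
        err = toy_validation_error(cfg)
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err
```

Unlike grid search, every trial explores a fresh value of every hyperparameter, which is why random search wins when only a few dimensions actually matter.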
Book
Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series)
TL;DR: This volume explores the differential evolution (DE) algorithm in both principle and practice and is a valuable resource for professionals needing a proven optimizer and for students wanting an evolutionary perspective on global numerical optimization.