Proceedings ArticleDOI

Evaluating the Stressful Commutes Using Physiological Signals and Machine Learning Techniques

TLDR
In this paper, several machine learning algorithms are applied to design a model that can predict the effect of a commute. The results obtained from the employed machine learning algorithms indicate that the difference in heart rate before and after the commute correlates with EEG signals in participants who self-reported being stressed after the commute.
Abstract
Stress can be described as an alteration in our body that can cause emotional, physical, or psychological strain. It is a reaction from our body to something that demands attention or exertion, and it can be caused by various factors depending on the physical or mental activity of the body. Commuting on a regular basis also acts as a source of stress. This research aims to explore the physiological effects of the commute through the application of machine-learning algorithms. The data used in this research were collected from 45 healthy participants who commute to work on a regular basis. A multimodal dataset was assembled, containing biosignals (heart rate, blood pressure, and EEG) together with responses to the PANAS questionnaire. Evaluation is based on performance metrics that include the confusion matrix, ROC/AUC, and classification accuracy of the model. In this research, several machine learning algorithms are applied to design a model that can predict the effect of a commute. The results suggest that whether the commute interval was short or long, there was a significant rise in stress levels, reflected in the bio-signals (electroencephalogram, blood pressure, and heart rate), after the commute. The results obtained from the employed machine learning algorithms indicate that the difference in heart rate before and after the commute correlates with EEG signals in participants who self-reported being stressed after the commute. The random forest algorithm gave a very promising result with an accuracy of 91%, while KNN and SVM achieved accuracies of 78% and 80%, respectively.
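As a rough illustration of the classification pipeline the abstract describes, the sketch below trains the three classifiers mentioned (random forest, KNN, SVM) on a multimodal feature table and reports the same metrics (accuracy, confusion matrix, ROC/AUC). It is a minimal sketch, not the authors' released code: the file name, column names, and label encoding are assumptions for illustration only.

# Minimal sketch (assumed data layout, not the authors' code):
# each row is one participant, with pre/post-commute biosignal features,
# PANAS scores, and a binary self-reported stress label.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

df = pd.read_csv("commute_stress.csv")  # hypothetical file name
feature_cols = ["hr_pre", "hr_post", "bp_sys_pre", "bp_sys_post",
                "eeg_alpha", "eeg_beta", "panas_negative"]  # hypothetical column names
X = df[feature_cols]
y = df["stressed"]  # 1 = self-reported stress after commute, 0 = not stressed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=42),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", probability=True, random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    y_prob = model.predict_proba(X_test)[:, 1]  # probability of the "stressed" class
    print(name)
    print("  accuracy :", accuracy_score(y_test, y_pred))
    print("  ROC AUC  :", roc_auc_score(y_test, y_prob))
    print("  confusion matrix:\n", confusion_matrix(y_test, y_pred))

The reported figures (91% for random forest, 78% for KNN, 80% for SVM) would correspond to the accuracy values printed by such a loop; the exact preprocessing and train/test split used in the paper are not specified in the abstract.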



References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Journal ArticleDOI

Development and validation of brief measures of positive and negative affect: The PANAS scales.

TL;DR: Two 10-item mood scales that comprise the Positive and Negative Affect Schedule (PANAS) are developed and are shown to be highly internally consistent, largely uncorrelated, and stable at appropriate levels over a 2-month time period.
Journal Article

Dropout: a simple way to prevent neural networks from overfitting

TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
Journal ArticleDOI

ImageNet classification with deep convolutional neural networks

TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.