Journal ArticleDOI

EEG-Based Sleep Stage Classification via Neural Architecture Search

TLDR
A novel neural architecture search (NAS) framework based on bilevel optimization approximation is proposed for EEG-based sleep stage classification; the model is optimized by search space approximation and search space regularization, with parameters shared among cells.
Abstract
As quality of life improves, people are increasingly concerned about the quality of their sleep. Electroencephalogram (EEG)-based sleep stage classification is a useful guide for assessing sleep quality and diagnosing sleep disorders. At present, most automatic staging neural networks are designed by human experts, a process that is time-consuming and laborious. In this paper, we propose a novel neural architecture search (NAS) framework based on bilevel optimization approximation for EEG-based sleep stage classification. The proposed NAS architecture performs the architecture search through a bilevel optimization approximation, and the model is optimized by search space approximation and search space regularization, with parameters shared among cells. Finally, we evaluated the performance of the model searched by NAS on the Sleep-EDF-20, Sleep-EDF-78, and SHHS datasets, achieving average accuracies of 82.7%, 80.0%, and 81.9%, respectively. The experimental results show that the proposed NAS algorithm provides a useful reference for the subsequent automatic design of networks for sleep stage classification.
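The bilevel structure the abstract describes (architecture parameters optimized on validation loss, network weights on training loss) can be illustrated with a toy scalar problem. This is a minimal sketch of the first-order alternating-update scheme popularized by differentiable NAS methods, not the paper's actual implementation; the losses, learning rates, and variable names are all illustrative assumptions.

```python
# Toy bilevel optimization sketch in the spirit of differentiable NAS:
#   inner problem:  w* = argmin_w  L_train(w, alpha)
#   outer problem:  min_alpha      L_val(w*, alpha)
# The one-step unrolled approximation replaces w* with a single SGD
# step on w, then differentiates the validation loss through that step.

def grad_w_train(w, alpha):
    # d/dw of the toy training loss L_train = (w - alpha)^2
    return 2.0 * (w - alpha)

def grad_alpha_val(w, alpha, eta=0.1):
    # One-step unrolled estimate: w' = w - eta * dL_train/dw,
    # then back-propagate L_val = (w' - 3.0)^2 through that step.
    w_prime = w - eta * grad_w_train(w, alpha)
    dLval_dwprime = 2.0 * (w_prime - 3.0)
    dwprime_dalpha = eta * 2.0  # d/dalpha of [w - eta*2*(w - alpha)]
    return dLval_dwprime * dwprime_dalpha

w, alpha = 0.0, 0.0
eta_w, eta_a = 0.1, 0.5
for _ in range(500):
    alpha -= eta_a * grad_alpha_val(w, alpha, eta_w)  # outer step (val loss)
    w -= eta_w * grad_w_train(w, alpha)               # inner step (train loss)

print(round(alpha, 2), round(w, 2))  # both converge toward 3.0
```

In a real NAS setting, `alpha` would be a vector of mixing weights over candidate operations in each cell and `w` the shared network weights; the alternating update pattern is the same.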


Citations
Journal ArticleDOI

ProductGraphSleepNet: Sleep Staging using Product Spatio-Temporal Graph Learning with Attentive Temporal Aggregation

TL;DR: In this paper, an adaptive product graph learning-based graph convolutional network is proposed for learning joint spatio-temporal graphs, along with a bidirectional gated recurrent unit and a modified graph attention network to capture the attentive dynamics of sleep stage transitions.
References
Proceedings Article

Very Deep Convolutional Networks for Large-Scale Image Recognition

TL;DR: This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
Posted Content

Deep Residual Learning for Image Recognition

TL;DR: This work presents a residual learning framework to ease the training of networks that are substantially deeper than those used previously, and provides comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth.
Journal ArticleDOI

ImageNet classification with deep convolutional neural networks

TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Proceedings ArticleDOI

Fully convolutional networks for semantic segmentation

TL;DR: The key insight is to build “fully convolutional” networks that take input of arbitrary size and produce correspondingly-sized output with efficient inference and learning.
Proceedings ArticleDOI

Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation

TL;DR: R-CNN combines CNNs with bottom-up region proposals to localize and segment objects; when labeled training data is scarce, supervised pre-training on an auxiliary task, followed by domain-specific fine-tuning, yields a significant performance boost.