Open Access Proceedings Article

Long Short Term Memory Networks for Anomaly Detection in Time Series.

TLDR
The efficacy of stacked LSTM networks for anomaly/fault detection in time series is demonstrated on ECG, space shuttle, power demand, and multi-sensor engine datasets.
Abstract
Long Short Term Memory (LSTM) networks have been demonstrated to be particularly useful for learning sequences containing longer term patterns of unknown length, due to their ability to maintain long term memory. Stacking recurrent hidden layers in such networks also enables the learning of higher level temporal features, for faster learning with sparser representations. In this paper, we use stacked LSTM networks for anomaly/fault detection in time series. A network is trained on non-anomalous data and used as a predictor over a number of time steps. The resulting prediction errors are modeled as a multivariate Gaussian distribution, which is used to assess the likelihood of anomalous behavior. The efficacy of this approach is demonstrated on four datasets: ECG, space shuttle, power demand, and a multi-sensor engine dataset.
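Below is a minimal, self-contained sketch of this pipeline, assuming TensorFlow/Keras, a synthetic univariate sine-wave signal, a single-step prediction horizon, and the window length, layer sizes, and threshold rule chosen in the code; none of these settings are taken from the paper.

```python
# Sketch of the abstract's pipeline: a stacked LSTM predictor trained on
# normal data; prediction errors fit with a Gaussian; low likelihood => anomaly.
import numpy as np
import tensorflow as tf
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
WINDOW = 30  # hypothetical input window length (not from the paper)

def make_windows(series, window=WINDOW):
    """Slice a 1-D series into (window of past values -> next value) pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    return X[..., None].astype("float32"), series[window:]

# Synthetic "normal" signal; the test copy gets a few injected spikes.
t = np.arange(4000, dtype="float32")
normal = np.sin(0.1 * t) + 0.05 * rng.standard_normal(t.shape).astype("float32")
test = normal[3000:].copy()
test[::200] += 3.0  # injected anomalies

# Stacked LSTM predictor, trained only on non-anomalous data.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.LSTM(16),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
X_train, y_train = make_windows(normal[:3000])
model.fit(X_train, y_train, epochs=5, batch_size=64, verbose=0)

# Model prediction errors on held-out normal data as a Gaussian
# (one-dimensional here because the prediction horizon is a single step).
X_val, y_val = make_windows(normal[3000:])
err = model.predict(X_val, verbose=0).ravel() - y_val
error_dist = multivariate_normal(mean=err.mean(), cov=err.var())

# Score the test series: a low error log-likelihood flags an anomaly.
X_test, y_test = make_windows(test)
test_err = model.predict(X_test, verbose=0).ravel() - y_test
scores = error_dist.logpdf(test_err[:, None])
threshold = scores.mean() - 4 * scores.std()  # heuristic; tune on labeled data
print("flagged indices:", np.where(scores < threshold)[0])
```

In the paper the network predicts several future steps for each point and the resulting error vector is multivariate; the single-step horizon here simply keeps the Gaussian one-dimensional.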


Citations
Journal ArticleDOI

Deep learning for smart manufacturing: Methods and applications

TL;DR: A comprehensive survey of commonly used deep learning algorithms that discusses their applications toward making manufacturing “smart”, focusing on computational methods based on deep learning that aim to improve system performance in manufacturing.
Journal ArticleDOI

Unsupervised real-time anomaly detection for streaming data

TL;DR: A novel anomaly detection algorithm based on an online sequence memory algorithm called Hierarchical Temporal Memory (HTM) is proposed and evaluated using the Numenta Anomaly Benchmark (NAB), a benchmark containing real-world data streams with labeled anomalies.
Posted Content

LSTM-based Encoder-Decoder for Multi-sensor Anomaly Detection

TL;DR: This work proposes a Long Short Term Memory Networks based Encoder-Decoder scheme for Anomaly Detection (EncDec-AD) that learns to reconstruct 'normal' time-series behavior, and thereafter uses reconstruction error to detect anomalies.
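As a rough illustration of the reconstruction-error idea summarized above (not the authors' exact EncDec-AD architecture), the sketch below trains a Keras LSTM autoencoder, with an assumed RepeatVector-style decoder, on normal windows only and uses per-window reconstruction error as the anomaly score.

```python
# Rough sketch of reconstruction-error anomaly scoring with an LSTM
# autoencoder; the RepeatVector bridge is an assumed simplification and
# not the exact EncDec-AD decoder described by the authors.
import numpy as np
import tensorflow as tf

WINDOW = 30  # hypothetical window length

enc_dec = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(32),                        # encoder -> fixed-size state
    tf.keras.layers.RepeatVector(WINDOW),            # repeat state for each output step
    tf.keras.layers.LSTM(32, return_sequences=True), # decoder
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])
enc_dec.compile(optimizer="adam", loss="mse")
# Train on normal windows only, reconstructing the input itself, e.g.:
# enc_dec.fit(windows_normal, windows_normal, epochs=10)  # shape (n, WINDOW, 1)

def anomaly_scores(model, windows):
    """Per-window mean squared reconstruction error as the anomaly score."""
    recon = model.predict(windows, verbose=0)
    return np.mean((recon - windows) ** 2, axis=(1, 2))
```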
Posted Content

Deep Learning for Anomaly Detection: A Survey.

TL;DR: A structured and comprehensive overview of research methods in deep learning-based anomaly detection, grouping state-of-the-art techniques into categories based on their underlying assumptions and the approach adopted.
Journal ArticleDOI

A Multimodal Anomaly Detector for Robot-Assisted Feeding Using an LSTM-Based Variational Autoencoder

TL;DR: A long short-term memory-based variational autoencoder (LSTM-VAE) is proposed for multimodal anomaly detection in assistive manipulation.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
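For context, the "constant error carousel" refers to the cell state's linear self-connection, which lets error flow through time without attenuation; the now-standard cell equations are sketched below (the forget gate f_t was introduced in later work, and the original 1997 formulation effectively fixes f_t = 1).

```latex
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i), &
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f), \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o), &
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c), \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, &
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```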
Journal ArticleDOI

Detection of abrupt changes: theory and application

TL;DR: A unified framework for the design and the performance analysis of the algorithms for solving change detection problems and links with the analytical redundancy approach to fault detection in linear systems are established.
Proceedings Article

Training and Analysing Deep Recurrent Neural Networks

TL;DR: This work studies the effect of a hierarchy of recurrent neural networks on processing time series, and shows that they reach state-of-the-art performance for recurrent networks in character-level language modelling when trained with stochastic gradient descent.
Proceedings ArticleDOI

Online novelty detection on temporal sequences

TL;DR: A concrete online detection algorithm is developed by modeling the temporal sequence with an online support vector regression algorithm and associating each detection result with a confidence value.

How the brain might work: a hierarchical and temporal model for learning and recognition

TL;DR: Algorithms and networks that combine hierarchical and temporal learning with Bayesian inference for pattern recognition are developed, along with a generative model for HTMs that enables the generation of synthetic data from HTM networks.
Trending Questions (1)
What is short / long term memory task NRMSE?

The paper does not mention the NRMSE (Normalized Root Mean Squared Error) for short/long term memory tasks.