Journal ArticleDOI

Context Aware Energy Disaggregation Using Adaptive Bidirectional LSTM Models

TL;DR
A non-causal adaptive context-aware bidirectional deep learning model for energy disaggregation that harnesses the representational power of deep recurrent Long Short-Term Memory neural networks while fitting two basic properties of the NILM problem that state-of-the-art methods do not appropriately account for.
Abstract
Energy disaggregation, or Non-Intrusive Load Monitoring (NILM), describes various processes aiming to identify the individual contribution of appliances given the aggregate power signal. In this paper, a non-causal adaptive context-aware bidirectional deep learning model for energy disaggregation is introduced. The proposed model, CoBiLSTM, harnesses the representational power of deep recurrent Long Short-Term Memory (LSTM) neural networks while fitting two basic properties of the NILM problem that state-of-the-art methods do not appropriately account for: non-causality and adaptivity to contextual factors (e.g., seasonality). A Bayesian-optimized framework is introduced to select the best configuration of the proposed regression model, driven by a self-training adaptive mechanism. Furthermore, the proposed model is structured in a modular way to address the multi-dimensionality issues that arise as the number of appliances increases. Experimental results indicate the proposed method's superiority over the current state of the art.
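
No implementation is given on this page; the following is a minimal sketch of the kind of bidirectional seq2seq LSTM regressor the abstract describes, written against the Keras API as an assumption. Window length, depth, and layer widths are illustrative, and the Bayesian-optimized configuration search and self-training adaptive mechanism are omitted. Per the modular design, one such network would be trained per appliance.

```python
# Hedged sketch (not the authors' code): a bidirectional LSTM that maps a
# window of aggregate power readings to one appliance's power trace.
import numpy as np
import tensorflow as tf

WINDOW = 128  # aggregate samples per input window (assumed)

def build_appliance_model() -> tf.keras.Model:
    inputs = tf.keras.Input(shape=(WINDOW, 1))          # aggregate power
    x = tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, return_sequences=True))(inputs)
    x = tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(64, return_sequences=True))(x)
    outputs = tf.keras.layers.TimeDistributed(
        tf.keras.layers.Dense(1))(x)                    # appliance power
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

# Toy usage: regress a synthetic appliance signal from its aggregate.
appliance = np.random.rand(32, WINDOW, 1).astype("float32")
aggregate = appliance + np.random.rand(32, WINDOW, 1).astype("float32")
model = build_appliance_model()
model.fit(aggregate, appliance, epochs=1, batch_size=8, verbose=0)
```

The bidirectional layers are what make the model non-causal: each output sample is conditioned on both past and future aggregate readings within the window.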


Citations
Journal ArticleDOI

Toward Load Identification Based on the Hilbert Transform and Sequence to Sequence Long Short-Term Memory

TL;DR: A novel method, HT-LSTM (Hilbert Transform Long Short-Term Memory), which improves recognition of load types that differ in the transient time and transient shape of the load signal.
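
As a hedged illustration of the Hilbert-transform preprocessing this summary refers to (how HT-LSTM feeds these features to the seq2seq LSTM is an assumption here, not taken from the paper), scipy.signal.hilbert yields the analytic signal, from which a transient's amplitude envelope and instantaneous phase can be derived:

```python
# Toy decaying transient and its Hilbert-transform features.
import numpy as np
from scipy.signal import hilbert

t = np.linspace(0, 1, 1000)
load = np.sin(2 * np.pi * 50 * t) * np.exp(-3 * t)   # toy load transient

analytic = hilbert(load)                  # analytic signal via Hilbert transform
envelope = np.abs(analytic)               # transient shape (amplitude envelope)
phase = np.unwrap(np.angle(analytic))     # instantaneous phase
features = np.stack([envelope, phase], axis=-1)  # candidate LSTM input
```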
Journal ArticleDOI

NILM Applications: Literature review of learning approaches, recent developments and challenges

TL;DR: In this article, the authors present a critical approach to the non-intrusive load monitoring (NILM) problem by thoroughly reviewing the experimental frameworks of both legacy and state-of-the-art studies.
Journal ArticleDOI

Deep learning for pattern recognition of photovoltaic energy generation

TL;DR: Discriminative deep models, including autoencoders, Long Short-Term Memory networks, and Rectified Linear Units, are introduced as a class of models that estimate future solar energy directly from historical measurements, demonstrating the merit of spatiotemporal pattern recognition for PV generation prediction accuracy.
Journal ArticleDOI

Non-Intrusive Load Monitoring: A Review

TL;DR: In this paper, a generalized, up-to-date review of NILM approaches, including a high-level taxonomy of the different methods, is provided; previously published results are grouped by experimental setup, which allows direct comparison.
Journal ArticleDOI

Fully-Convolutional Denoising Auto-Encoders for NILM in Large Non-Residential Buildings

TL;DR: In this paper, a fully convolutional denoising auto-encoder architecture (FCN-dAE) is proposed for large non-residential buildings and compared, in terms of aspects particular to large buildings, to previous denoising auto-encoder (dAE) approaches.
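
A minimal sketch of a fully-convolutional denoising auto-encoder in the spirit of FCN-dAE, assuming a Keras implementation; the depths, kernel sizes, and strides below are illustrative, not taken from the paper:

```python
# Aggregate window in, denoised appliance window out; no dense layers,
# so the network is fully convolutional.
import tensorflow as tf

WINDOW = 256  # input window length (assumed)

inputs = tf.keras.Input(shape=(WINDOW, 1))
x = tf.keras.layers.Conv1D(16, 5, strides=2, padding="same", activation="relu")(inputs)
x = tf.keras.layers.Conv1D(32, 5, strides=2, padding="same", activation="relu")(x)
x = tf.keras.layers.Conv1DTranspose(32, 5, strides=2, padding="same", activation="relu")(x)
x = tf.keras.layers.Conv1DTranspose(16, 5, strides=2, padding="same", activation="relu")(x)
outputs = tf.keras.layers.Conv1D(1, 5, padding="same")(x)

fcn_dae = tf.keras.Model(inputs, outputs)
fcn_dae.compile(optimizer="adam", loss="mse")
```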
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
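
As a worked illustration of the constant error carousel this summary mentions, here is one forward step of an LSTM cell in its original, forget-gate-free form; the weights and sizes are placeholders, and biases are omitted for brevity:

```python
# One step of the 1997-style LSTM cell: the cell state c is updated purely
# additively, which is what lets error flow survive long time lags.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W):
    z = np.concatenate([x, h])
    i = sigmoid(W["i"] @ z)      # input gate
    o = sigmoid(W["o"] @ z)      # output gate
    g = np.tanh(W["g"] @ z)      # candidate cell input
    c = c + i * g                # constant error carousel: additive update
    h = o * np.tanh(c)           # gated hidden output
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
W = {k: rng.normal(scale=0.1, size=(n_hid, n_in + n_hid)) for k in "iog"}
h = np.zeros(n_hid)
c = np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W)
```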
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition; it can be used to synthesize a complex decision surface able to classify high-dimensional patterns such as handwritten characters.
Posted Content

Generating Sequences With Recurrent Neural Networks

TL;DR: This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.
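
The one-data-point-at-a-time generation scheme reduces to a simple feedback loop; in this sketch a stub stands in for the trained RNN predictor (the stub itself is an assumption for illustration, not Graves' model):

```python
# Autoregressive sampling: the network consumes its own previous
# prediction to emit the next point.
import numpy as np

def predict_next(history):
    # Stand-in for an RNN one-step predictor (hypothetical).
    return 0.9 * history[-1] + 0.1 * np.sin(len(history))

seq = [0.0]
for _ in range(50):
    seq.append(predict_next(seq))   # feed each prediction back as input
```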
Journal ArticleDOI

Learning to Forget: Continual Prediction with LSTM

TL;DR: This work identifies a weakness of LSTM networks processing continual input streams that are not a priori segmented into subsequences with explicitly marked ends at which the network's internal state could be reset, and proposes a novel, adaptive forget gate that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources.
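
In standard notation, the forget gate $f_t$ turns the purely additive cell update of the sketch above into a gated one:

```latex
f_t = \sigma(W_f [x_t; h_{t-1}] + b_f), \qquad
c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_g [x_t; h_{t-1}] + b_g)
```

Driving $f_t$ toward zero lets the cell discard its accumulated state, which is the learned reset the TL;DR describes.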
Proceedings ArticleDOI

Learning to forget: continual prediction with LSTM

TL;DR: This work identifies a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends and proposes an adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources.