Journal ArticleDOI

Deep learning model for home automation and energy reduction in a smart home environment platform

TLDR
A modular platform that uses the power of cloud services to collect, aggregate and store all the data gathered from the smart environment and uses the data to generate advanced neural network models to create energy awareness.
Abstract
The target of smart houses and enhanced living environments is to further increase the quality of life. In this context, several supporting platforms for smart houses have been developed, some of them using cloud systems for remote supervision, control and data storage. An important aspect, which remains an open issue for both industry and academia, is how to estimate and reduce the energy consumption of a smart house. In this paper, we propose a modular platform that uses the power of cloud services to collect, aggregate and store all the data gathered from the smart environment. We then use the data to generate advanced neural network models that create energy awareness by advising the smart environment occupants on how they can improve their daily habits while reducing energy consumption, and thus also costs.
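The article itself provides no code; as a rough illustration only, the sketch below shows how such a neural network model for energy awareness might look, assuming PyTorch and a hypothetical window of hourly consumption readings collected by the cloud platform. It is not the authors' implementation.

```python
# Hypothetical sketch: forecast the next hour's household energy
# consumption from a window of past readings with an LSTM (PyTorch).
# All names, shapes and hyperparameters are illustrative placeholders.
import torch
import torch.nn as nn

class EnergyForecaster(nn.Module):
    def __init__(self, n_features=1, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, window, n_features)
        out, _ = self.lstm(x)             # out: (batch, window, hidden_size)
        return self.head(out[:, -1, :])   # predict the next reading

model = EnergyForecaster()
window = torch.randn(8, 24, 1)            # 8 samples, 24 hourly readings each
prediction = model(window)                # shape: (8, 1)
```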


Citations
Journal ArticleDOI

IoT in Smart Cities: A Survey of Technologies, Practices and Challenges

TL;DR: This paper provides holistic coverage of the Internet of Things in smart cities, discussing the fundamental components that make up the IoT-based smart city landscape, followed by the technologies that enable these domains to exist, in terms of the architectures utilized, the networking technologies used and the artificial intelligence algorithms deployed in IoT-based smart city systems.
Journal ArticleDOI

Leveraging Deep Learning and IoT big data analytics to support the smart cities development: Review and future directions

TL;DR: This survey provides a review of the literature regarding the use of IoT and DL to develop smart cities and outlines the current challenges and issues faced during the development of smart city services.
Journal ArticleDOI

Review on Deep Neural Networks Applied to Low-Frequency NILM

TL;DR: This paper reviews non-intrusive load monitoring (NILM) approaches that employ deep neural networks to disaggregate appliances from low-frequency data, i.e., data with sampling rates below the AC base frequency, and compares their performance with respect to reported mean absolute error (MAE) and F1-scores.
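As a small illustration of the two metrics used in that comparison, the following sketch (NumPy and scikit-learn assumed, made-up placeholder values) computes MAE on a disaggregated power trace and F1 on the derived appliance on/off states.

```python
# Sketch of the two NILM evaluation metrics mentioned above:
# MAE on the predicted appliance power trace, F1 on on/off detection.
# The arrays and the on/off threshold are hypothetical placeholders.
import numpy as np
from sklearn.metrics import f1_score

true_power = np.array([0.0, 5.0, 120.0, 118.0, 0.0])    # watts, ground truth
pred_power = np.array([2.0, 15.0, 110.0, 125.0, 4.0])   # watts, model output

mae = np.mean(np.abs(true_power - pred_power))

on_threshold = 10.0                                      # watts; assumption
f1 = f1_score(true_power > on_threshold, pred_power > on_threshold)

print(f"MAE = {mae:.1f} W, F1 = {f1:.2f}")
```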
Journal ArticleDOI

Energy-efficient heating control for smart buildings with deep reinforcement learning

TL;DR: This research presents a Deep Reinforcement Learning (DRL)-based heating controller to improve thermal comfort and minimize energy costs in smart buildings, and observes that as the number of buildings and the differences in their setpoint temperatures increase, decentralized control performs better than a centralized controller.
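To make the reinforcement-learning framing concrete, here is a deliberately simplified tabular Q-learning sketch of a thermostat agent, not the deep controller described in that paper: states are discretized room temperatures, actions toggle the heater, and the reward trades comfort against energy use. Every number and the room model are assumptions.

```python
# Toy tabular Q-learning thermostat (illustration only, not DRL):
# the agent learns when to switch the heater on, balancing a comfort
# penalty around a setpoint against a small energy cost per heating step.
import random

TEMPS = list(range(15, 26))        # discretized indoor temperature, deg C
ACTIONS = [0, 1]                   # 0 = heater off, 1 = heater on
SETPOINT = 21                      # hypothetical comfort setpoint
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

Q = {(t, a): 0.0 for t in TEMPS for a in ACTIONS}

def step(temp, action):
    """Very rough room model: heating warms by 1 degree, otherwise it cools."""
    next_temp = min(max(temp + (1 if action else -1), TEMPS[0]), TEMPS[-1])
    comfort_penalty = -abs(next_temp - SETPOINT)
    energy_penalty = -0.5 * action
    return next_temp, comfort_penalty + energy_penalty

temp = random.choice(TEMPS)
for _ in range(10_000):
    a = random.choice(ACTIONS) if random.random() < EPS else \
        max(ACTIONS, key=lambda x: Q[(temp, x)])
    nxt, reward = step(temp, a)
    Q[(temp, a)] += ALPHA * (reward + GAMMA * max(Q[(nxt, x)] for x in ACTIONS)
                             - Q[(temp, a)])
    temp = nxt
```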
Journal ArticleDOI

Moving Deep Learning to the Edge

TL;DR: This paper reviews the main research directions for edge computing deep learning algorithms and suggests new resource and energy-oriented deep learning models, as well as new computing platforms.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
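To make the gating idea concrete, the following minimal NumPy sketch performs a single LSTM cell step; the additive cell-state update is the "constant error carousel" that lets error flow across long time lags. The weights are untrained random placeholders, and the gate ordering is one common convention, not taken from the original paper.

```python
# Minimal single-step LSTM cell in NumPy, illustrating the gated,
# additive cell-state update (the "constant error carousel").
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step. W: (4H, D), U: (4H, H), b: (4H,)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])              # input gate
    f = sigmoid(z[H:2*H])           # forget gate
    o = sigmoid(z[2*H:3*H])         # output gate
    g = np.tanh(z[3*H:])            # candidate cell input
    c = f * c_prev + i * g          # additive update: error flows through c
    h = o * np.tanh(c)
    return h, c

D, H = 3, 5
rng = np.random.default_rng(0)
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(10, D)):  # a short random input sequence
    h, c = lstm_step(x, h, c, W, U, b)
```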
Posted Content

Caffe: Convolutional Architecture for Fast Feature Embedding

TL;DR: Caffe as discussed by the authors is a BSD-licensed C++ library with Python and MATLAB bindings for training and deploying general-purpose convolutional neural networks and other deep models efficiently on commodity architectures.
Proceedings Article

Sequence to Sequence Learning with Neural Networks

TL;DR: The authors used a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector.
Posted Content

Sequence to Sequence Learning with Neural Networks

TL;DR: This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short term dependencies between the source and the target sentence which made the optimization problem easier.
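A compressed sketch of the encoder-decoder idea summarized in the two entries above, assuming PyTorch with toy vocabulary and dimensions; it is not the authors' implementation and omits training, word reversal and beam search.

```python
# Toy sequence-to-sequence sketch: one LSTM encodes the source sequence
# into its final (hidden, cell) state, a second LSTM decodes the target
# sequence conditioned on that state. All sizes are placeholders.
import torch
import torch.nn as nn

VOCAB, EMB, HID = 100, 32, 64

embed = nn.Embedding(VOCAB, EMB)
encoder = nn.LSTM(EMB, HID, batch_first=True)
decoder = nn.LSTM(EMB, HID, batch_first=True)
out_proj = nn.Linear(HID, VOCAB)

src = torch.randint(0, VOCAB, (4, 12))   # 4 source sequences, length 12
tgt = torch.randint(0, VOCAB, (4, 9))    # 4 target sequences, length 9

_, state = encoder(embed(src))           # fixed-size summary of the source
dec_out, _ = decoder(embed(tgt), state)  # decode conditioned on that summary
logits = out_proj(dec_out)               # (4, 9, VOCAB) next-token scores
```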
Proceedings ArticleDOI

Caffe: Convolutional Architecture for Fast Feature Embedding

TL;DR: Caffe provides multimedia scientists and practitioners with a clean and modifiable framework for state-of-the-art deep learning algorithms and a collection of reference models for training and deploying general-purpose convolutional neural networks and other deep models efficiently on commodity architectures.