Journal ArticleDOI

LSTM network: a deep learning approach for short-term traffic forecast

TLDR
A novel traffic forecast model based on a long short-term memory (LSTM) network is proposed; it captures temporal-spatial correlation in the traffic system through a two-dimensional network composed of many memory units.
Abstract
Short-term traffic forecasting is one of the essential issues in intelligent transportation systems. Accurate forecasts enable commuters to choose appropriate travel modes, routes, and departure times, which is valuable for traffic management. A feasible way to improve forecast accuracy is to develop a more effective approach to traffic data analysis. The abundant traffic data and computational power that have become available in recent years motivate us to improve the accuracy of short-term traffic forecasting via deep learning approaches. A novel traffic forecast model based on a long short-term memory (LSTM) network is proposed. Unlike conventional forecast models, the proposed LSTM network captures temporal-spatial correlation in the traffic system through a two-dimensional network composed of many memory units. A comparison with other representative forecast models validates that the proposed LSTM network achieves better performance.
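As a rough illustration of the kind of model the abstract describes, the sketch below runs a single LSTM layer over a window of multi-station traffic readings and predicts the next time step for every station. It is a minimal sketch, not the authors' implementation; the PyTorch framing, layer sizes, window length, and single linear read-out are assumptions made for the example.

```python
# Minimal sketch (not the paper's code): an LSTM that maps a window of
# recent traffic observations from several detector stations to the next
# time step's values. Input layout and layer sizes are illustrative.
import torch
import torch.nn as nn

class TrafficLSTM(nn.Module):
    def __init__(self, num_stations: int, hidden_size: int = 64):
        super().__init__()
        # One LSTM layer runs over the time dimension; each step's input
        # vector holds the readings of all stations, so spatial correlation
        # enters through the shared input and temporal correlation through
        # the recurrence.
        self.lstm = nn.LSTM(input_size=num_stations,
                            hidden_size=hidden_size,
                            batch_first=True)
        self.readout = nn.Linear(hidden_size, num_stations)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, num_stations)
        _, (h_n, _) = self.lstm(x)        # h_n: (1, batch, hidden_size)
        return self.readout(h_n[-1])      # (batch, num_stations)

# Example: forecast the next interval from the previous 12 observations.
model = TrafficLSTM(num_stations=30)
window = torch.randn(8, 12, 30)           # dummy batch of 8 windows
next_step = model(window)                 # (8, 30)
```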


Citations
Journal ArticleDOI

Traffic Graph Convolutional Recurrent Neural Network: A Deep Learning Framework for Network-Scale Traffic Learning and Forecasting

TL;DR: A novel deep learning framework, the Traffic Graph Convolutional Long Short-Term Memory Neural Network (TGC-LSTM), is proposed to learn the interactions between roadways in the traffic network and to forecast the network-wide traffic state; the proposed model outperforms baseline methods on two real-world traffic state datasets.
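The sketch below illustrates the general graph-convolution-plus-recurrence idea behind models of this kind, not the TGC-LSTM reference implementation: detector readings are first mixed with their graph neighbours through a normalized adjacency matrix, and an LSTM then models the temporal dynamics of the mixed features. The normalization scheme, layer sizes, and single mixing step are assumptions for the example.

```python
# Rough sketch of a graph-convolutional recurrent forecaster
# (not the TGC-LSTM implementation from the cited paper).
import torch
import torch.nn as nn

class GraphConvLSTM(nn.Module):
    def __init__(self, adjacency: torch.Tensor, hidden_size: int = 64):
        super().__init__()
        n = adjacency.shape[0]
        # Symmetric normalization D^-1/2 (A + I) D^-1/2, as in standard
        # graph convolution; the road-network adjacency is an input.
        a_hat = adjacency + torch.eye(n)
        d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
        self.register_buffer("a_norm", d_inv_sqrt @ a_hat @ d_inv_sqrt)
        self.lstm = nn.LSTM(input_size=n, hidden_size=hidden_size,
                            batch_first=True)
        self.readout = nn.Linear(hidden_size, n)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, num_nodes); one spatial mixing step,
        # then the LSTM handles the temporal dimension.
        x = x @ self.a_norm
        _, (h_n, _) = self.lstm(x)
        return self.readout(h_n[-1])
```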
Journal ArticleDOI

Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM

TL;DR: A novel scheme for hourly day-ahead solar irradiance prediction using weather forecast data is proposed, and the proposed algorithm is demonstrated to outperform competitive algorithms for single-output prediction.
Journal ArticleDOI

Short-term forecasting of passenger demand under on-demand ride services: A spatio-temporal deep learning approach

TL;DR: This paper is one of the first deep learning studies to forecast short-term passenger demand on an on-demand ride service platform by examining spatio-temporal correlations; the proposed FCL-Net achieves better predictive performance than traditional approaches.
Journal ArticleDOI

Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks.

TL;DR: Wang et al. propose spatiotemporal recurrent convolutional networks (SRCNs) for traffic forecasting, which inherit the advantages of deep CNNs and LSTM neural networks.
Posted Content

Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks

TL;DR: A network grid representation method is proposed that can retain the fine-scale structure of a transportation network and outperforms other deep learning-based algorithms in both short-term and long-term traffic prediction.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
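To make the constant error carousel concrete, the snippet below implements one step of a standard LSTM cell in plain NumPy (the common formulation with a forget gate, which was added after the original paper): the cell state is updated additively, which is what lets error flow across long time lags. Parameter layout and naming are illustrative assumptions.

```python
# One LSTM step in NumPy; a teaching sketch, not the paper's formulation.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # W: (4H, D), U: (4H, H), b: (4H,) hold the stacked parameters of the
    # input, forget, and output gates and the candidate update.
    z = W @ x + U @ h_prev + b
    i, f, o, g = np.split(z, 4)
    # Additive cell-state update: the "constant error carousel".
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c
```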
Journal ArticleDOI

A fast learning algorithm for deep belief nets

TL;DR: A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
Proceedings ArticleDOI

Speech recognition with deep recurrent neural networks

TL;DR: This paper investigates deep recurrent neural networks, which combine the multiple levels of representation that have proved so effective in deep networks with the flexible use of long range context that empowers RNNs.
Posted Content

Speech Recognition with Deep Recurrent Neural Networks

TL;DR: In this paper, deep recurrent neural networks (RNNs) are used to combine the multiple levels of representation that have proved so effective in deep networks with the flexible use of long range context that empowers RNNs.
Proceedings Article

Greedy Layer-Wise Training of Deep Networks

TL;DR: These experiments confirm the hypothesis that the greedy layer-wise unsupervised training strategy mostly helps the optimization, by initializing weights in a region near a good local minimum, giving rise to internal distributed representations that are high-level abstractions of the input, bringing better generalization.