Journal ArticleDOI
Accurate freeway travel time prediction with state-space neural networks under missing data
TL;DR: This article proposes a freeway travel time prediction framework that exploits a recurrent neural network topology, the so-called state-space neural network (SSNN), together with imputation-based preprocessing strategies; the SSNN appears to be robust to the "damage" done by these imputation schemes.
Abstract:
Accuracy and robustness with respect to missing or corrupt input data are two key characteristics for any travel time prediction model that is to be applied in a real-time environment (e.g. for display on variable message signs on freeways). This article proposes a freeway travel time prediction framework that exhibits both qualities. The framework exploits a recurrent neural network topology, the so-called state-space neural network (SSNN), with preprocessing strategies based on imputation. Although the SSNN model is a neural network, its design (in terms of input and model selection) is neither "black box" nor location-specific. Instead, it is based on the layout of the freeway stretch of interest. In this sense, the SSNN model combines the generality of neural network approaches with traffic-related ("white-box") design. Robustness to missing data is tackled by means of simple imputation (data replacement) schemes, such as exponential forecasts and spatial interpolation. Although there are clear theoretical shortcomings to "simple" imputation schemes as a remedy for input failure, our results indicate that their use is justified in this particular application. The SSNN model appears to be robust to the "damage" done by these imputation schemes. This is true for both incidental (random) and structural input failure. We demonstrate that the SSNN travel time prediction framework yields accurate and robust travel time predictions on both synthetic and real data.
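As an illustration of the two imputation schemes named in the abstract (exponential forecasts over a detector's recent history, and spatial interpolation between neighboring detectors), here is a minimal sketch. The function names, the smoothing constant, and the data layout (a 1-D array of detector speeds with NaN marking failures, plus a time-by-detector history matrix) are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def exponential_forecast(history, alpha=0.3):
    """Replace a missing reading with an exponentially smoothed
    forecast of that detector's own recent history."""
    s = history[0]
    for x in history[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

def spatial_interpolation(upstream, downstream):
    """Replace a missing reading with the mean of the nearest
    working upstream and downstream detectors."""
    return 0.5 * (upstream + downstream)

def impute(speeds, history):
    """speeds: 1-D array of current detector speeds, NaN = failure.
    history: 2-D array (time x detector) of past valid readings."""
    out = speeds.copy()
    for i, v in enumerate(speeds):
        if np.isnan(v):
            # Nearest valid neighbors in space, if any.
            left = next((out[j] for j in range(i - 1, -1, -1)
                         if not np.isnan(out[j])), None)
            right = next((speeds[j] for j in range(i + 1, len(speeds))
                          if not np.isnan(speeds[j])), None)
            if left is not None and right is not None:
                out[i] = spatial_interpolation(left, right)
            else:
                # Fall back to a temporal forecast for edge detectors.
                out[i] = exponential_forecast(history[:, i])
    return out
```

The spatial scheme handles incidental (random) failures of a single detector; the temporal fallback covers structural failures where no working neighbor exists on one side.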
Citations
Journal ArticleDOI
Long short-term memory neural network for traffic speed prediction using remote microwave sensor data
TL;DR: A comparison with different topologies of dynamic neural networks as well as other prevailing parametric and nonparametric algorithms suggests that LSTM NN can achieve the best prediction performance in terms of both accuracy and stability.
Journal ArticleDOI
Data-Driven Intelligent Transportation Systems: A Survey
TL;DR: A survey on the development of D2ITS is provided, discussing the functionality of its key components and some deployment issues associated with D2ITS. Future research directions for the developed system are presented.
Journal ArticleDOI
Short-term traffic forecasting: Where we are and where we’re going
TL;DR: In this article, the authors present a review of the existing literature on short-term traffic forecasting and offer suggestions for future work, focusing on 10 challenging, yet relatively under-researched, directions.
Journal ArticleDOI
Deep learning for short-term traffic flow prediction
Nicholas G. Polson, Vadim Sokolov +1 more
TL;DR: A deep learning model is developed that combines a linear model fitted using l1 regularization with a sequence of tanh layers to predict traffic flows; the linear model identifies spatio-temporal relations among predictors, while the other layers model nonlinear relations.
Journal ArticleDOI
Big Data Analytics in Intelligent Transportation Systems: A Survey
TL;DR: Several case studies of big data analytics applications in intelligent transportation systems, including road traffic accidents analysis, road traffic flow prediction, public transportation service plan, personal travel route plan, rail transportation management and control, and assets maintenance are introduced.
References
Journal ArticleDOI
Long short-term memory
TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
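The "constant error carousel" mentioned in this summary refers to the additive cell-state update that lets gradients flow over long time lags. A minimal NumPy sketch of a single LSTM time step (parameter names and the stacked-gate layout are illustrative assumptions, not from the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step for hidden size n.
    W (4n x d), U (4n x n), b (4n,) stack the input, forget,
    candidate, and output gate parameters."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])           # input gate
    f = sigmoid(z[n:2 * n])      # forget gate
    g = np.tanh(z[2 * n:3 * n])  # candidate cell state
    o = sigmoid(z[3 * n:])       # output gate
    c_new = f * c + i * g        # additive update: the "carousel"
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Because `c_new` depends on `c` through multiplication by a gate rather than through a squashing nonlinearity, the error signal along the cell state is not forced to vanish or explode at every step.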
Book
Neural networks for pattern recognition
TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Book ChapterDOI
Neural Networks for Pattern Recognition
Suresh Kothari, Heekuck Oh +1 more
TL;DR: The chapter discusses two important directions of research to improve learning algorithms: the dynamic node generation, which is used by the cascade correlation algorithm; and designing learning algorithms where the choice of parameters is not an issue.
Journal ArticleDOI
Finding Structure in Time
TL;DR: A proposal along these lines, first described by Jordan (1986), is developed; it involves the use of recurrent links to provide networks with a dynamic memory, and a method for representing lexical categories and the type/token distinction is suggested.
Journal ArticleDOI
Training feedforward networks with the Marquardt algorithm
TL;DR: The Marquardt algorithm for nonlinear least squares is presented and incorporated into the backpropagation algorithm for training feedforward neural networks, and is found to be much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
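For reference, the weight update of the Levenberg-Marquardt scheme described above takes the standard form (notation assumed for illustration, not taken from the paper):

\[
\Delta \mathbf{w} = -\left( \mathbf{J}^{\mathsf{T}} \mathbf{J} + \mu \mathbf{I} \right)^{-1} \mathbf{J}^{\mathsf{T}} \mathbf{e},
\]

where \(\mathbf{J}\) is the Jacobian of the network errors with respect to the weights, \(\mathbf{e}\) is the error vector, and \(\mu\) is a damping parameter that interpolates between Gauss-Newton (\(\mu \to 0\)) and gradient descent (\(\mu\) large).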