Open Access · Journal Article · DOI

How to Build a Graph-Based Deep Learning Architecture in Traffic Domain: A Survey

TLDR
This survey carefully examines various graph-based deep learning architectures across many traffic applications to discuss their shared deep learning techniques, clarifying how each technique is used in traffic tasks.
Abstract
In recent years, various deep learning architectures have been proposed to solve complex challenges in the traffic domain (e.g., spatial dependency, temporal dependency) and have achieved satisfactory performance. These architectures combine multiple deep learning techniques to tackle the diverse challenges in traffic tasks. Traditionally, convolutional neural networks (CNNs) are used to model spatial dependency by decomposing the traffic network into grids. However, many traffic networks are graph-structured in nature, so to exploit such spatial information fully, it is more appropriate to formulate traffic networks mathematically as graphs. Recently, novel deep learning techniques known as graph neural networks (GNNs) have been developed to process graph data. A growing number of works combine GNNs with other deep learning techniques to build architectures that address the various challenges in a complex traffic task, with the GNNs responsible for extracting spatial correlations in the traffic network. These graph-based architectures have achieved state-of-the-art performance. To provide a comprehensive and clear picture of this emerging trend, this survey carefully examines various graph-based deep learning architectures across many traffic applications. We first give guidelines for formulating a traffic problem on graphs and for constructing graphs from various kinds of traffic datasets. We then decompose these graph-based architectures to discuss their shared deep learning techniques, clarifying how each technique is used in traffic tasks. Moreover, we summarize common traffic challenges and the corresponding graph-based deep learning solutions to each. Finally, we provide benchmark datasets, open-source code, and future research directions in this rapidly growing field.
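The graph-construction step the abstract mentions can be illustrated with a minimal sketch. One common recipe in this literature builds a weighted adjacency matrix from pairwise road-network distances between sensors via a thresholded Gaussian kernel; the function name, the `sigma` heuristic, the `eps` cutoff, and the toy distances below are illustrative assumptions, not a prescription from the survey.

```python
import numpy as np

def build_traffic_graph(dist, sigma=None, eps=0.5):
    """Weighted adjacency from pairwise road distances via a
    thresholded Gaussian kernel (a common recipe, not the only one)."""
    if sigma is None:
        # heuristic bandwidth: std of the nonzero distances
        sigma = dist[dist > 0].std()
    W = np.exp(-(dist ** 2) / (sigma ** 2))
    W[W < eps] = 0.0        # sparsify weak links
    np.fill_diagonal(W, 0)  # drop self-loops
    return W

# Toy example: road distances (km) between three traffic sensors
dist = np.array([[0., 1., 5.],
                 [1., 0., 2.],
                 [5., 2., 0.]])
W = build_traffic_graph(dist)
```

Nearby sensors end up strongly connected while distant pairs are pruned, which is exactly the spatial structure the GNN layers then exploit.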


Citations
Posted Content

Adaptive Graph Convolutional Recurrent Network for Traffic Forecasting

TL;DR: It is argued that learning node-specific patterns is essential for traffic forecasting while the pre-defined graph can be avoided, and two adaptive modules are proposed to enhance the Graph Convolutional Network (GCN) with new capabilities.
Posted Content · DOI

Deep Learning on Traffic Prediction: Methods, Analysis and Future Directions

TL;DR: A comprehensive survey on deep learning-based approaches in traffic prediction from multiple perspectives is provided, and the state-of-the-art approaches in different traffic prediction applications are listed.
Posted Content

Graph Neural Network for Traffic Forecasting: A Survey.

TL;DR: In this paper, the authors present a comprehensive survey of graph neural networks for traffic forecasting problems, including graph convolutional and graph attention networks, and a comprehensive list of open data and source resources.
Journal Article · DOI

Deep reinforcement learning for transportation network combinatorial optimization: A survey

TL;DR: In this paper, the authors present an overview of how to apply deep reinforcement learning to combinatorial optimization problems, emphasizing the treatment of general optimization problems as data points and exploring the relevant distribution of data used for learning in a given task.
Journal Article · DOI

Short-Term Traffic Prediction With Deep Neural Networks: A Survey

TL;DR: In this article, the authors survey recent short-term traffic prediction (STTP) studies using deep neural networks from four perspectives: input data representation methods according to the number and type of spatial and temporal dependencies involved, application area, dataset and code availability, and the type of the represented spatiotemporal dependencies.
References
Journal Article · DOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
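The "constant error carousel" in this summary refers to the gated, nearly linear cell-state update that lets gradients survive long time lags. A minimal single-step sketch in NumPy (the stacked-gate layout and random toy parameters below are illustrative assumptions, not the paper's original formulation, which predates the now-standard forget gate):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: the cell state c is updated almost linearly
    (f * c + i * g), which is what preserves error flow over time."""
    n = h.shape[0]
    z = W @ x + U @ h + b        # all four gate pre-activations, stacked
    i = sigmoid(z[0:n])          # input gate
    f = sigmoid(z[n:2*n])        # forget gate
    o = sigmoid(z[2*n:3*n])      # output gate
    g = np.tanh(z[3*n:4*n])      # candidate cell update
    c_new = f * c + i * g        # the 'constant error carousel'
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy usage with random parameters
rng = np.random.default_rng(0)
n, m = 3, 2                      # hidden size, input size
W = rng.normal(size=(4 * n, m))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
x = rng.normal(size=m)
h, c = lstm_step(x, h, c, W, U, b)
```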
Journal Article · DOI

Generative Adversarial Nets

TL;DR: A new framework for estimating generative models via an adversarial process, in which two models are simultaneously trained: a generative model G that captures the data distribution and a discriminative model D that estimates the probability that a sample came from the training data rather than G.
Proceedings Article

Neural Machine Translation by Jointly Learning to Align and Translate

TL;DR: It is conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and it is proposed to extend it by allowing a model to automatically (soft-)search for the parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
Posted Content

Semi-Supervised Classification with Graph Convolutional Networks

TL;DR: A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs which outperforms related methods by a significant margin.
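The "efficient variant of convolutional neural networks which operate directly on graphs" in this summary is the renormalized propagation rule H' = σ(D^-1/2 (A + I) D^-1/2 H W). A minimal NumPy rendering applied to a toy traffic graph (the sensor layout, speed values, and weight matrix below are made-up illustrations, not the authors' implementation):

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step:
    H' = ReLU(D^-1/2 (A + I) D^-1/2 X W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    # symmetric normalization via broadcasting
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU

# Toy traffic graph: 4 sensors on a line, one speed feature each
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
X = np.array([[60.], [55.], [40.], [45.]])    # speeds (km/h)
W = np.array([[1.]])                          # 1-in, 1-out weights
H = gcn_layer(A, X, W)
```

Each output row mixes a sensor's own reading with its neighbors', which is how graph-based traffic models extract spatial correlations.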