Open Access · Posted Content
Temporal Graph Networks for Deep Learning on Dynamic Graphs.
Emanuele Rossi, Ben Chamberlain, Fabrizio Frasca, Davide Eynard, Federico Monti, Michael M. Bronstein
TLDR
This paper presents Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events; TGNs significantly outperform previous approaches while being more computationally efficient.

Abstract
Graph Neural Networks (GNNs) have recently become increasingly popular due to their ability to learn complex systems of relations or interactions arising in a broad spectrum of problems, ranging from biology and particle physics to social networks and recommendation systems. Despite the plethora of different models for deep learning on graphs, few approaches have been proposed thus far for dealing with graphs that present some sort of dynamic nature (e.g., evolving features or connectivity over time). In this paper, we present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events. Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while being more computationally efficient. We furthermore show that several previous models for learning on dynamic graphs can be cast as specific instances of our framework. We perform a detailed ablation study of different components of our framework and devise the best configuration, which achieves state-of-the-art performance on several transductive and inductive prediction tasks for dynamic graphs.
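The abstract's pipeline — timed interaction events arrive, a message is computed for each endpoint, a per-node memory is updated, and embeddings are read out over the temporal neighborhood — can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the concatenation-based message, the averaging memory update (standing in for the paper's GRU), and the mean-over-neighbors readout (standing in for temporal graph attention) are all simplifying assumptions.

```python
from collections import defaultdict

DIM = 4  # toy memory/feature dimensionality

memory = defaultdict(lambda: [0.0] * DIM)  # node -> memory vector
last_update = defaultdict(float)           # node -> time of last event
neighbors = defaultdict(list)              # node -> temporal neighbors seen so far

def message(src, dst, t, feat):
    """Raw message for src: both endpoint memories, the time since
    src's last update, and the event features (concatenated)."""
    dt = t - last_update[src]
    return memory[src] + memory[dst] + [dt] + feat

def update_memory(node, msg):
    """Stand-in for TGN's learnable (GRU) memory updater: fold the
    variable-length message into DIM slots and blend with old memory."""
    folded = [0.0] * DIM
    for i, x in enumerate(msg):
        folded[i % DIM] += x
    n_chunks = (len(msg) + DIM - 1) // DIM
    memory[node] = [0.5 * m + 0.5 * f / n_chunks
                    for m, f in zip(memory[node], folded)]

def process_event(src, dst, t, feat):
    """Consume one timed interaction event (src, dst, t, feat)."""
    m_src = message(src, dst, t, feat)  # compute both messages from
    m_dst = message(dst, src, t, feat)  # the *pre-update* memories
    update_memory(src, m_src)
    update_memory(dst, m_dst)
    last_update[src] = last_update[dst] = t
    neighbors[src].append(dst)
    neighbors[dst].append(src)

def embed(node):
    """Graph-based readout: mean over the node's memory and its
    temporal neighbors' memories (the paper uses temporal attention)."""
    vecs = [memory[node]] + [memory[n] for n in neighbors[node]]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

process_event("u", "v", 1.0, [1.0, 0.0, 0.0, 0.0])
process_event("v", "w", 2.0, [0.0, 1.0, 0.0, 0.0])
```

After two events, `embed("v")` mixes v's memory with those of its temporal neighbors u and w.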
Citations
Journal Article
Foundations and Modeling of Dynamic Networks Using Dynamic Graph Neural Networks: A Survey
TL;DR: This work establishes a foundation for dynamic networks, with consistent, detailed terminology and notation, and presents a comprehensive survey of dynamic graph neural network models using the proposed terminology.
Posted Content
ETA Prediction with Graph Neural Networks in Google Maps
Austin Derrow-Pinion, Jennifer She, David Wong, Oliver Fritz Lange, Todd Hester, Luis Perez, Marc Nunkesser, Seongjae Lee, Xueying Guo, Brett Wiltshire, Peter W. Battaglia, Vishal Gupta, Ang Li, Zhongwen Xu, Alvaro Sanchez-Gonzalez, Yujia Li, Petar Veličković +16 more
TL;DR: In this article, a graph neural network estimator for estimated time of arrival (ETA) is presented, which has been deployed in production at Google Maps and has shown promising results.
Posted Content
Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges
TL;DR: In this paper, the authors attempt a geometric unification of a broad class of deep learning architectures, exposing their underlying regularities through common geometric principles applicable across a wide spectrum of applications, such as computer vision, playing Go, or protein folding.
Proceedings Article
APAN: Asynchronous Propagation Attention Network for Real-time Temporal Graph Embedding
Xuhong Wang, Ding Lyu, Mengjian Li, Yang Xia, Qi Yang, Xinwen Wang, Xinguang Wang, Ping Cui, Yupu Yang, Bowen Sun, Zhenyu Guo +10 more
TL;DR: This work proposes the Asynchronous Propagation Attention Network, an asynchronous continuous-time dynamic graph algorithm for real-time temporal graph embedding, which decouples model inference from graph computation to prevent heavy graph query operations from slowing down inference.
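The decoupling idea in this TL;DR can be illustrated with a mailbox pattern: graph propagation pushes event messages into fixed-size per-node mailboxes, and inference reads only a node's own mailbox, never querying the graph. This is a hedged sketch of the general pattern, not APAN's actual architecture; the mailbox size, the scalar event feature, and the mean-based readout are illustrative assumptions.

```python
from collections import defaultdict, deque

MAILBOX_SIZE = 3  # assumed fixed mailbox capacity per node

mailbox = defaultdict(lambda: deque(maxlen=MAILBOX_SIZE))

def propagate(src, dst, t, feat):
    """Asynchronous graph step: push the event onto each endpoint's
    bounded mailbox. This is the only place the graph is touched."""
    mailbox[dst].append((src, t, feat))
    mailbox[src].append((dst, t, feat))

def infer(node):
    """Inference reads only the node's own mailbox (constant-time in
    graph size), so heavy neighborhood queries never block it."""
    msgs = mailbox[node]
    if not msgs:
        return 0.0
    return sum(f for _, _, f in msgs) / len(msgs)

propagate("u", "v", 0.0, 2.0)
propagate("v", "w", 1.0, 4.0)
```

Because `propagate` can run on a separate worker, `infer` stays fast even when the graph (and its query cost) grows.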
References
Journal ArticleDOI
Long short-term memory
TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
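The "constant error carousel" mentioned above refers to the additive update of the cell state, which lets gradients flow across long time lags. A minimal scalar LSTM step can be sketched as follows; the single shared toy weight `w` is an illustrative assumption (a real LSTM learns separate weight matrices per gate).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w=0.5):
    """One step of a scalar LSTM cell with a shared toy weight w."""
    i = sigmoid(w * x + w * h)    # input gate
    f = sigmoid(w * x + w * h)    # forget gate
    o = sigmoid(w * x + w * h)    # output gate
    g = math.tanh(w * x + w * h)  # candidate cell input
    c_new = f * c + i * g         # additive update: the error carousel
    h_new = o * math.tanh(c_new)  # gated hidden output
    return h_new, c_new

h, c = 0.0, 0.0
for x in [1.0, 0.0, 0.0, 1.0]:
    h, c = lstm_step(x, h, c)
```

The key line is `c_new = f * c + i * g`: because the cell state is carried forward additively rather than repeatedly squashed, backpropagated error is not forced to vanish at every step.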
Proceedings Article
Attention is All you Need
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin
TL;DR: This paper proposes a simple network architecture based solely on an attention mechanism, dispensing with recurrence and convolutions entirely, and achieves state-of-the-art performance on English-to-French translation.
Proceedings Article
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Kyunghyun Cho, Bart van Merriënboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, Yoshua Bengio
TL;DR: In this paper, the encoder and decoder of the RNN Encoder-Decoder model are jointly trained to maximize the conditional probability of a target sequence given a source sequence.
Posted Content
Semi-Supervised Classification with Graph Convolutional Networks
Thomas Kipf, Max Welling
TL;DR: A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
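The graph convolution this TL;DR refers to is the propagation rule H' = relu(D^{-1/2}(A + I)D^{-1/2} H W): add self-loops, symmetrically normalize by degree, and average neighbor features. A sketch of one layer, with the learnable weight matrix W taken as the identity for illustration:

```python
import math

def gcn_layer(adj, feats):
    """One GCN propagation step with identity weights and ReLU:
    H' = relu(D^-1/2 (A + I) D^-1/2 H)."""
    n = len(adj)
    # add self-loops: A_hat = A + I
    a_hat = [[adj[i][j] + (1 if i == j else 0) for j in range(n)]
             for i in range(n)]
    deg = [sum(row) for row in a_hat]
    out = []
    for i in range(n):
        row = [0.0] * len(feats[0])
        for j in range(n):
            # symmetric degree normalization
            norm = a_hat[i][j] / math.sqrt(deg[i] * deg[j])
            for k in range(len(feats[0])):
                row[k] += norm * feats[j][k]
        out.append([max(0.0, v) for v in row])
    return out

out = gcn_layer([[0, 1], [1, 0]], [[1.0, 0.0], [0.0, 1.0]])
```

On a two-node graph with one edge, each node ends up with the average of its own and its neighbor's features, which is exactly the smoothing behavior the normalized propagation rule is designed to give.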
Proceedings Article
DeepWalk: online learning of social representations
TL;DR: DeepWalk uses local information obtained from truncated random walks to learn latent representations, treating walks as the equivalent of sentences; the learned representations encode social relations in a continuous vector space that is easily exploited by statistical models.
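The "walks as sentences" idea means: generate short random walks over the graph and feed them, as token sequences, to a word-embedding model such as skip-gram. A sketch of the walk-generation half (the embedding step, which DeepWalk delegates to a skip-gram model, is omitted here):

```python
import random

def random_walks(adj, num_walks, walk_len, seed=0):
    """Generate truncated random walks; each walk is a 'sentence'
    of node tokens that a skip-gram model would then embed."""
    rng = random.Random(seed)  # seeded for reproducibility
    walks = []
    for _ in range(num_walks):
        for start in adj:  # one walk per node per pass
            walk = [start]
            while len(walk) < walk_len:
                nbrs = adj[walk[-1]]
                if not nbrs:  # dead end: truncate the walk early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
walks = random_walks(graph, num_walks=2, walk_len=4)
```

Nodes that co-occur frequently within walks end up near each other in the embedding space, the same way words sharing contexts do in a language model.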