Open Access · Posted Content

Temporal Graph Networks for Deep Learning on Dynamic Graphs.

TLDR
This paper presents Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events, which significantly outperforms previous approaches while being more computationally efficient.
Abstract
Graph Neural Networks (GNNs) have recently become increasingly popular due to their ability to learn complex systems of relations or interactions arising in a broad spectrum of problems ranging from biology and particle physics to social networks and recommendation systems. Despite the plethora of different models for deep learning on graphs, few approaches have been proposed thus far for dealing with graphs that present some sort of dynamic nature (e.g., evolving features or connectivity over time). In this paper, we present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events. Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while being more computationally efficient. We furthermore show that several previous models for learning on dynamic graphs can be cast as specific instances of our framework. We perform a detailed ablation study of different components of our framework and devise the best configuration that achieves state-of-the-art performance on several transductive and inductive prediction tasks for dynamic graphs.
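The two ingredients named in the abstract — per-node memory updated by timed events, plus a graph-based operator over neighbors — can be sketched as follows. This is a hypothetical simplification, not the authors' implementation: the learned memory updater (the paper uses e.g. a GRU) is replaced by a fixed random projection with `tanh`, and the graph operator is a plain mean over temporal neighbors.

```python
import numpy as np

DIM = 4
rng = np.random.default_rng(0)
# Stand-in for a learned memory-update function (the paper would use
# a trainable recurrent unit such as a GRU).
W = rng.standard_normal((DIM, 2 * DIM + 1)) * 0.1

memory = {}      # node id -> memory vector
neighbors = {}   # node id -> list of past interaction partners

def mem(v):
    return memory.setdefault(v, np.zeros(DIM))

def process_event(src, dst, t):
    """One timed interaction (src, dst, t): build a message from both
    endpoints' memories plus the timestamp, then update each memory."""
    m_src, m_dst = mem(src).copy(), mem(dst).copy()
    memory[src] = np.tanh(W @ np.concatenate([m_src, m_dst, [t]]))
    memory[dst] = np.tanh(W @ np.concatenate([m_dst, m_src, [t]]))
    neighbors.setdefault(src, []).append(dst)
    neighbors.setdefault(dst, []).append(src)

def embed(v):
    """Graph-based operator: aggregate a node's memory with the
    memories of its temporal neighbors (here a simple mean)."""
    vecs = [mem(v)] + [mem(u) for u in neighbors.get(v, [])]
    return np.mean(vecs, axis=0)

# Replay a toy event stream in time order.
for (s, d, t) in [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0)]:
    process_event(s, d, t)

print(embed(0).shape)  # (4,)
```

The key structural point the sketch preserves is that memory is updated sequentially as events arrive, while embeddings are computed on demand from the current memories of a node and its past interaction partners.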


Citations
Journal ArticleDOI

Foundations and Modeling of Dynamic Networks Using Dynamic Graph Neural Networks: A Survey

TL;DR: This work establishes a foundation of dynamic networks with consistent, detailed terminology and notation and presents a comprehensive survey of dynamic graph neural network models using the proposed terminology.
Posted ContentDOI

ETA Prediction with Graph Neural Networks in Google Maps

TL;DR: In this article, a graph neural network estimator for estimated time of arrival (ETA) is presented, which has been deployed in production at Google Maps and has shown promising results.
Posted Content

Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

TL;DR: In this paper, the authors propose a geometric unification of a broad class of deep learning architectures through common principles of symmetry and invariance, applicable across a wide spectrum of applications such as computer vision, playing Go, or protein folding.
Proceedings ArticleDOI

APAN: Asynchronous Propagation Attention Network for Real-time Temporal Graph Embedding

TL;DR: This work proposes the Asynchronous Propagation Attention Network (APAN), an asynchronous continuous-time dynamic graph algorithm for real-time temporal graph embedding that decouples model inference from graph computation, so that heavy graph query operations no longer slow down inference.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
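The "constant error carousel" in the summary above refers to the LSTM cell state being updated additively, which lets gradients flow across long time lags. A minimal forward step, for illustration only (weights here are random, not learned):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM step; W maps [x, h] to the four gate pre-activations."""
    z = W @ np.concatenate([x, h])
    n = h.size
    i, f, o = sigmoid(z[:n]), sigmoid(z[n:2*n]), sigmoid(z[2*n:3*n])
    g = np.tanh(z[3*n:])
    c = f * c + i * g      # additive cell update: the error carousel
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(1)
n, d = 3, 2                               # hidden size, input size
W = rng.standard_normal((4 * n, d + n)) * 0.1
h, c = np.zeros(n), np.zeros(n)
for _ in range(5):                        # unroll over a short sequence
    h, c = lstm_step(rng.standard_normal(d), h, c, W)
print(h.shape)  # (3,)
```

Because `c` is updated by `f * c + i * g` rather than being overwritten, the gradient through `c` contains a term scaled only by the forget gate, which is what allows learning across thousands of discrete time steps.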
Proceedings Article

Attention is All you Need

TL;DR: This paper proposes the Transformer, a simple network architecture based solely on attention mechanisms, dispensing with recurrence and convolutions entirely, and achieves state-of-the-art performance on English-to-French translation.
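The core operation of that architecture is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A self-contained NumPy sketch (random inputs, single head, no masking):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((5, 8)) for _ in range(3))
out = attention(Q, K, V)
print(out.shape)  # (5, 8)
```

Each output row is a convex combination of the value rows, with weights determined by query–key similarity; the full model stacks multiple such heads with learned projections.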
Proceedings ArticleDOI

Learning Phrase Representations using RNN Encoder--Decoder for Statistical Machine Translation

TL;DR: In this paper, the encoder and decoder of the RNN Encoder-Decoder model are jointly trained to maximize the conditional probability of a target sequence given a source sequence.
Posted Content

Semi-Supervised Classification with Graph Convolutional Networks

TL;DR: A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
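The layer-wise propagation rule of that model is H' = σ(D̂^{-1/2} Â D̂^{-1/2} H W), where Â = A + I adds self-loops and D̂ is its degree matrix. A small sketch with random (untrained) weights:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # symmetric normalization
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# A 3-node path graph, 4-dim features, 2-dim output.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Stacking two such layers and training W with a cross-entropy loss on the labeled nodes gives the semi-supervised classifier described in the summary.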
Proceedings ArticleDOI

DeepWalk: online learning of social representations

TL;DR: DeepWalk as mentioned in this paper uses local information obtained from truncated random walks to learn latent representations by treating walks as the equivalent of sentences, which encode social relations in a continuous vector space, which is easily exploited by statistical models.
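The first stage of that pipeline — generating truncated random walks to serve as "sentences" — can be sketched directly; the subsequent skip-gram training over those sequences is omitted here:

```python
import random

def random_walk(adj, start, length, rng):
    """Truncated random walk: repeatedly hop to a uniformly chosen
    neighbor until the walk reaches the target length."""
    walk = [start]
    while len(walk) < length and adj[walk[-1]]:
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

# A toy undirected graph as an adjacency list.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
rng = random.Random(42)
walks = [random_walk(adj, v, 5, rng) for v in adj for _ in range(2)]
print(len(walks))  # 8
```

Treating each walk as a sentence and each node as a word, a word2vec-style model then learns node embeddings in which co-occurring nodes on walks end up close in the vector space.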