Open Access · Proceedings Article
Inductive Representation Learning on Large Graphs
William L. Hamilton, Zhitao Ying, Jure Leskovec
Vol. 30, pp. 1024–1034
TL;DR: GraphSAGE is a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings, rather than training an individual embedding for each node.
Abstract:
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. However, most existing approaches require that all nodes in the graph are present during training of the embeddings; these previous approaches are inherently transductive and do not naturally generalize to unseen nodes. Here we present GraphSAGE, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings. Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving information graphs based on citation and Reddit post data, and we show that our algorithm generalizes to completely unseen graphs using a multi-graph dataset of protein-protein interactions.
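The sample-and-aggregate idea described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration of one layer with uniform neighbor sampling and a mean aggregator, not the paper's reference implementation; all names here (`sage_layer`, `W_self`, `W_neigh`, the toy graph) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_neighbors(adj, node, k):
    """Uniformly sample (with replacement) k neighbors of `node`."""
    return rng.choice(adj[node], size=k, replace=True)

def sage_layer(features, adj, W_self, W_neigh, k=5):
    """One GraphSAGE-style layer with mean aggregation:
    h_v = ReLU(x_v @ W_self + mean({x_u : u in sampled N(v)}) @ W_neigh)."""
    out = []
    for v in range(features.shape[0]):
        neigh_feats = features[sample_neighbors(adj, v, k)]
        agg = neigh_feats.mean(axis=0)
        h = features[v] @ W_self + agg @ W_neigh
        out.append(np.maximum(h, 0.0))  # ReLU nonlinearity
    H = np.asarray(out)
    # L2-normalize the output embeddings, as the paper does between layers
    return H / np.clip(np.linalg.norm(H, axis=1, keepdims=True), 1e-12, None)

# Toy graph: 4 nodes, 3 input features, 2-dim output embeddings
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
X = rng.normal(size=(4, 3))
W_self = rng.normal(size=(3, 2))
W_neigh = rng.normal(size=(3, 2))
Z = sage_layer(X, adj, W_self, W_neigh, k=3)
print(Z.shape)  # (4, 2)
```

Because the layer is a function of node features and sampled neighborhoods rather than a per-node lookup table, the same trained weights can embed nodes never seen during training, which is what makes the approach inductive.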
Citations
Journal Article (DOI)
t-PINE: tensor-based predictable and interpretable node embeddings
TL;DR: It is argued that the implicit and explicit mappings from a higher-dimensional to a lower-dimensional vector space are the key to learning more useful, highly predictable, and gracefully interpretable representations.
Posted Content
Bitcoin Transaction Forecasting with Deep Network Representation Learning
Wenqi Wei, Qi Zhang, Ling Liu +2 more
TL;DR: DLForecast uses deep neural networks to learn Bitcoin transaction network representations and builds a forecasting system for transactions between user accounts from historical transactions, with a built-in time-decaying factor.
Posted Content
Deep Demixing: Reconstructing the Evolution of Epidemics Using Graph Neural Networks
TL;DR: This work proposes DDmix, a graph conditional variational autoencoder that can be trained from past epidemic spreads and whose latent space seeks to capture key aspects of the underlying (unknown) spreading dynamics.
Posted Content
Temporal Knowledge Graph Forecasting with Neural ODE
TL;DR: A graph ODE model built on graph neural networks (GNNs) is proposed for temporal knowledge graph forecasting; it models node representations continuously in time and learns them efficiently for future prediction.
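The core idea of a graph ODE, as summarized above, is to let node states evolve continuously according to a derivative defined by a GNN over the graph. A rough NumPy sketch under simplifying assumptions (a single tanh message-passing layer as the derivative, forward-Euler integration) is shown below; this is a generic illustration of the technique, not the cited paper's model, and `gnn_derivative` and `integrate` are hypothetical names.

```python
import numpy as np

def gnn_derivative(Z, A, W):
    """dZ/dt = tanh(A_norm @ Z @ W): each node's state is pulled toward a
    transformed average of its neighbors' states."""
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.clip(deg, 1, None)  # row-normalized adjacency
    return np.tanh(A_norm @ Z @ W)

def integrate(Z0, A, W, t_end=1.0, steps=20):
    """Forward-Euler integration of the graph ODE from t=0 to t_end."""
    Z, dt = Z0.copy(), t_end / steps
    for _ in range(steps):
        Z = Z + dt * gnn_derivative(Z, A, W)
    return Z

# Toy graph: 3 nodes, 4-dim node states
rng = np.random.default_rng(1)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
Z0 = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
Z1 = integrate(Z0, A, W)
print(Z1.shape)  # (3, 4)
```

Because time enters as the integration variable rather than as discrete layer indices, representations can be queried at arbitrary future timestamps, which is the appeal of ODE-based models for temporal forecasting.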
Posted Content
CopulaGNN: Towards Integrating Representational and Correlational Roles of Graphs in Graph Neural Networks
TL;DR: The proposed Copula Graph Neural Network (CopulaGNN) can take a wide range of GNN models as base models and utilize both representational and correlational information stored in the graphs.