Open Access Proceedings Article

Inductive Representation Learning on Large Graphs

TLDR
GraphSAGE is a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings, learning an embedding function instead of training individual embeddings for each node.
Abstract
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. However, most existing approaches require that all nodes in the graph are present during training of the embeddings; these previous approaches are inherently transductive and do not naturally generalize to unseen nodes. Here we present GraphSAGE, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings. Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving information graphs based on citation and Reddit post data, and we show that our algorithm generalizes to completely unseen graphs using a multi-graph dataset of protein-protein interactions.
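The sample-and-aggregate idea in the abstract can be illustrated with a minimal NumPy sketch of one layer using a mean aggregator. This is an assumption-laden illustration, not the authors' implementation: the function names, the fixed sample size `k`, and the untrained weight matrices are hypothetical, and only a single layer is shown.

```python
import numpy as np

def sample_neighbors(adj, node, k, rng):
    # Uniformly sample a fixed-size set of k neighbors
    # (sampling with replacement when the node has fewer than k neighbors).
    neigh = adj[node]
    return rng.choice(neigh, size=k, replace=len(neigh) < k)

def graphsage_mean_layer(features, adj, W_self, W_neigh, k=5, seed=0):
    """One sample-and-aggregate step: each node's new embedding combines its
    own features with the mean of a sampled set of neighbor features.

    features : (num_nodes, d_in) array of input node features
    adj      : dict mapping node id -> list of neighbor ids
    W_self, W_neigh : (d_in, d_out) weight matrices (here untrained)
    """
    rng = np.random.default_rng(seed)
    out = np.empty((features.shape[0], W_self.shape[1]))
    for v in range(features.shape[0]):
        sampled = sample_neighbors(adj, v, k, rng)
        neigh_mean = features[sampled].mean(axis=0)
        # Combine self and aggregated neighbor information, then apply ReLU.
        out[v] = np.maximum(features[v] @ W_self + neigh_mean @ W_neigh, 0)
    # L2-normalize the output embeddings.
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)
```

Because the layer is a function of features and sampled neighborhoods rather than a per-node lookup table, it can produce embeddings for nodes that were never seen during training, which is what makes the approach inductive.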



Citations
Journal ArticleDOI

Predicting Emergency Medical Service Demand With Bipartite Graph Convolutional Networks

TL;DR: In this paper, a bipartite graph convolutional neural network model is proposed to predict EMS demand between hospital-region pairs, achieving 77.3% and 87.7% accuracy in the binary demand label prediction task.
Posted Content

Graph Generation with Variational Recurrent Neural Network

TL;DR: This paper introduces Graph Variational Recurrent Neural Network (GraphVRNN), a probabilistic autoregressive model for graph generation that can capture the joint distributions of graph structures and the underlying node attributes.
Posted Content

A Degeneracy Framework for Scalable Graph Autoencoders

TL;DR: This framework leverages graph degeneracy concepts to train models only on a dense subset of nodes instead of the entire graph, which significantly improves scalability and training speed while preserving performance.
Posted Content

SceneGraphNet: Neural Message Passing for 3D Indoor Scene Augmentation

TL;DR: In this paper, the authors propose a neural message passing approach to augment an input 3D indoor scene with new objects matching their surroundings. Given an input, potentially incomplete, 3D scene and a query location, their method predicts a probability distribution over object types that fit well at that location.
Proceedings ArticleDOI

Event Time Extraction and Propagation via Graph Attention Networks

TL;DR: This paper first formulates the problem using a 4-tuple temporal representation from entity slot filling, which allows fuzzy time spans to be represented more conveniently, and then proposes a graph attention network-based approach to propagate temporal information over document-level event graphs constructed from shared entity arguments and temporal relations.