Open Access Proceedings Article

Inductive Representation Learning on Large Graphs

TLDR
GraphSAGE is a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings; instead of training an individual embedding for each node, it learns a function that generates embeddings by sampling and aggregating features from a node's local neighborhood.
Abstract
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. However, most existing approaches require that all nodes in the graph are present during training of the embeddings; these previous approaches are inherently transductive and do not naturally generalize to unseen nodes. Here we present GraphSAGE, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings. Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving information graphs based on citation and Reddit post data, and we show that our algorithm generalizes to completely unseen graphs using a multi-graph dataset of protein-protein interactions.
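The "sample and aggregate" idea in the abstract can be made concrete with a short sketch. Below is a minimal NumPy illustration of one GraphSAGE layer with a mean aggregator followed by L2 normalization; the weight matrices (`W_self`, `W_neigh`), the toy graph, and the fixed sample size `k` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_neighbors(adj_list, node, k, rng):
    """Uniformly sample up to k neighbors of `node` (with replacement if fewer exist)."""
    neigh = adj_list[node]
    if len(neigh) == 0:
        return [node]  # fall back to a self-loop for isolated nodes
    return list(rng.choice(neigh, size=k, replace=len(neigh) < k))

def graphsage_mean_layer(X, adj_list, W_self, W_neigh, k, rng):
    """One sample-and-aggregate layer: combine a node's own features with the
    mean of a fixed-size sample of its neighbors' features, apply a ReLU,
    then L2-normalize the result."""
    H = np.empty((X.shape[0], W_self.shape[0]))
    for v in range(X.shape[0]):
        sampled = sample_neighbors(adj_list, v, k, rng)
        agg = X[sampled].mean(axis=0)                   # aggregate sampled neighbors
        h = np.maximum(W_self @ X[v] + W_neigh @ agg, 0.0)
        H[v] = h / (np.linalg.norm(h) + 1e-12)          # L2 normalization
    return H

# Toy graph: 4 nodes, 3-dim features, 2 sampled neighbors per node.
X = rng.normal(size=(4, 3))
adj_list = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
W_self = rng.normal(size=(8, 3))
W_neigh = rng.normal(size=(8, 3))
print(graphsage_mean_layer(X, adj_list, W_self, W_neigh, k=2, rng=rng).shape)  # (4, 8)
```

Because the learned weights operate on features rather than on node identities, the same layer can be applied to nodes, or entire graphs, never seen during training, which is what makes the approach inductive.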



Citations
Posted Content

How Powerful are Graph Neural Networks?

TL;DR: This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among this class of GNNs.
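As a rough illustration of the architecture this summary refers to, here is a minimal NumPy sketch of a GIN-style layer; the parameter names (`W1`, `W2`, `eps`) are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def gin_layer(X, A, W1, W2, eps=0.0):
    """One GIN-style layer: h_v = MLP((1 + eps) * x_v + sum of neighbor features).
    Sum aggregation (rather than mean or max) can represent the full multiset of
    neighbor features, which is the source of the expressiveness result."""
    agg = (1.0 + eps) * X + A @ X          # sum over neighbors via adjacency A
    return np.maximum(agg @ W1, 0.0) @ W2  # two-layer MLP with ReLU
```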
Posted Content

Revisiting Graph Neural Networks: All We Have is Low-Pass Filters

TL;DR: The results indicate that graph neural networks perform only low-pass filtering on feature vectors and do not have a non-linear manifold-learning property; based on this observation, the paper proposes insights for GCN-based graph neural network design.
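The low-pass claim can be seen directly in the usual propagation rule: repeatedly multiplying features by a normalized adjacency attenuates rapidly varying (high-frequency) components of the feature signal. A minimal sketch, assuming a dense NumPy adjacency matrix:

```python
import numpy as np

def low_pass_propagate(X, A, steps=2):
    """Repeatedly multiply features by the symmetrically normalized adjacency
    with self-loops, D^{-1/2} (A + I) D^{-1/2}. Its eigenvalues lie in [-1, 1],
    so repeated application damps high-frequency components of X while keeping
    the smooth (low-frequency) ones."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    for _ in range(steps):
        X = S @ X
    return X
```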
Posted Content

Predict then Propagate: Graph Neural Networks meet Personalized PageRank

TL;DR: In this article, the relationship between graph convolutional networks (GCNs) and PageRank is used to derive an improved propagation scheme based on personalized PageRank, which leverages a large, adjustable neighborhood for classification and can easily be combined with any neural network.
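The propagation scheme described here can be approximated with a simple power iteration that mixes propagated predictions with the original ones. The sketch below assumes a pre-normalized adjacency `S` and illustrative values for the teleport probability `alpha` and the iteration count:

```python
import numpy as np

def ppr_propagate(H, S, alpha=0.1, num_iters=10):
    """Approximate personalized-PageRank propagation of local predictions H:
    Z <- (1 - alpha) * S @ Z + alpha * H, iterated num_iters times.
    alpha is the teleport (restart) probability; a larger alpha keeps Z closer
    to the local predictions, while a smaller alpha uses a larger neighborhood."""
    Z = H.copy()
    for _ in range(num_iters):
        Z = (1.0 - alpha) * (S @ Z) + alpha * H
    return Z
```

Separating prediction (any neural network producing `H`) from propagation is what lets the scheme be combined with arbitrary models.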
Proceedings Article

PairNorm: Tackling Oversmoothing in GNNs

TL;DR: PairNorm is a novel normalization layer, based on a careful analysis of the graph convolution operator, that prevents all node embeddings from becoming too similar.
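A minimal NumPy sketch of the kind of center-and-rescale normalization the TL;DR describes; the `scale` hyperparameter and the small epsilon are illustrative assumptions.

```python
import numpy as np

def pairnorm(X, scale=1.0):
    """PairNorm-style normalization: center the node embeddings, then rescale
    so the mean squared row norm equals scale**2. Keeping the total pairwise
    distance between embeddings roughly constant across layers counteracts
    oversmoothing in deep GNNs."""
    X_centered = X - X.mean(axis=0, keepdims=True)
    mean_row_norm_sq = (X_centered ** 2).sum(axis=1).mean()
    return scale * X_centered / np.sqrt(mean_row_norm_sq + 1e-12)
```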
Journal Article

Multiscale Dynamic Graph Convolutional Network for Hyperspectral Image Classification

TL;DR: The proposed multiscale dynamic GCN (MDGCN) allows the graph to be dynamically updated along with the graph convolution process, so that the two steps benefit from each other and gradually produce discriminative embedded features together with a refined graph.
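A generic sketch of the alternation the TL;DR describes (rebuild the graph from the current embeddings, then convolve on the refreshed graph); the Gaussian-kernel similarity and all parameter names are assumptions for illustration, not MDGCN's exact construction.

```python
import numpy as np

def dynamic_graph_layer(X, W, sigma=1.0):
    """One layer alternating two steps: (1) rebuild a graph from the current
    embeddings via a Gaussian similarity kernel, then (2) run one
    graph-convolution step on that refreshed graph."""
    # Step 1: dynamic graph from pairwise distances of current features.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.exp(-sq_dists / (2 * sigma ** 2))
    A /= A.sum(axis=1, keepdims=True)        # row-normalize the affinities
    # Step 2: propagate over the refreshed graph and transform.
    return np.maximum(A @ X @ W, 0.0)
```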