Open Access · Proceedings Article

Inductive Representation Learning on Large Graphs

TLDR
GraphSAGE is a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings; instead of training an individual embedding for each node, it learns a function that samples and aggregates features from a node's local neighborhood.
Abstract
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. However, most existing approaches require that all nodes in the graph are present during training of the embeddings; these previous approaches are inherently transductive and do not naturally generalize to unseen nodes. Here we present GraphSAGE, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings. Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving information graphs based on citation and Reddit post data, and we show that our algorithm generalizes to completely unseen graphs using a multi-graph dataset of protein-protein interactions.
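The "sampling and aggregating" step described above can be sketched in a few lines. This is a minimal illustration of one GraphSAGE layer with a mean aggregator, not the authors' implementation: the function names, weight shapes, and uniform sampling-with-replacement are assumptions chosen for brevity.

```python
import numpy as np

def sample_neighbors(adj, node, k, rng):
    """Uniformly sample k neighbors of a node (with replacement)."""
    neigh = adj[node]
    idx = rng.choice(len(neigh), size=k, replace=True)
    return [neigh[i] for i in idx]

def graphsage_mean_layer(features, adj, W_self, W_neigh, k, rng):
    """One GraphSAGE-style layer with a mean aggregator:
    h_v = ReLU(W_self @ x_v + W_neigh @ mean(x_u for sampled u in N(v))),
    followed by L2 normalization of each embedding."""
    out = []
    for v in range(len(adj)):
        sampled = sample_neighbors(adj, v, k, rng)
        neigh_mean = np.mean(features[sampled], axis=0)
        h = W_self @ features[v] + W_neigh @ neigh_mean
        out.append(np.maximum(h, 0.0))  # ReLU
    H = np.array(out)
    return H / np.maximum(np.linalg.norm(H, axis=1, keepdims=True), 1e-12)
```

Because the layer depends only on a node's features and its sampled neighborhood, it applies unchanged to nodes unseen during training, which is what makes the approach inductive.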



Citations
Proceedings ArticleDOI

High Performance Graph Convolutional Networks with Applications in Testability Analysis

TL;DR: Experimental results show that the proposed GCN model achieves superior accuracy to classical machine learning models in predicting difficult-to-observe nodes, and that, compared with commercial testability analysis tools, the proposed observation-point insertion flow achieves similar fault coverage.
Posted Content

GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training

TL;DR: A principled normalization method, Graph Normalization (GraphNorm), where the key idea is to normalize the feature values across all nodes for each individual graph with a learnable shift, which improves generalization of GNNs, achieving better performance on graph classification benchmarks.
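The normalization described in this TL;DR can be sketched compactly. This is a hedged illustration of the per-graph normalization with a learnable mean-subtraction weight, assuming per-feature parameters `alpha`, `gamma`, and `beta`; it is not the paper's reference code.

```python
import numpy as np

def graph_norm(H, alpha, gamma, beta, eps=1e-5):
    """GraphNorm sketch: for a single graph with node features H (nodes x dims),
    subtract alpha times the per-feature mean over all nodes, divide by the
    standard deviation of the shifted values, then apply affine gamma/beta."""
    mean = H.mean(axis=0)                          # mean over nodes, per feature
    shifted = H - alpha * mean                     # learnable shift weight alpha
    std = np.sqrt((shifted ** 2).mean(axis=0) + eps)
    return gamma * shifted / std + beta
```

With `alpha = 1` this reduces to ordinary per-graph standardization; learning `alpha` lets the network keep part of the mean signal when that helps.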
Posted Content

Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

TL;DR: In this paper, the authors expose the regularities of a wide range of deep learning architectures through unified geometric principles that can be applied throughout a broad spectrum of applications, such as computer vision, playing Go, or protein folding.
Proceedings ArticleDOI

GCNAX: A Flexible and Energy-efficient Accelerator for Graph Convolutional Neural Networks

TL;DR: This paper proposes a flexible and optimized dataflow for GCNs that simultaneously improves resource utilization and reduces data movement, and introduces a novel accelerator architecture called GCNAX, which tailors the compute engine, buffer structure and size based on the proposed dataflow.
Posted Content

PairNorm: Tackling Oversmoothing in GNNs

TL;DR: PairNorm is a novel normalization layer that is based on a careful analysis of the graph convolution operator, which prevents all node embeddings from becoming too similar and significantly boosts performance for a new problem setting that benefits from deeper GNNs.
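The idea of keeping node embeddings from collapsing toward one another can be sketched as a center-and-rescale step. This is a simplified illustration of such a pairwise-distance-preserving normalization, assuming a single global scale `s`; it should not be read as the paper's exact layer.

```python
import numpy as np

def pair_norm(H, s=1.0, eps=1e-6):
    """PairNorm-style sketch: center node embeddings H (nodes x dims) to remove
    the graph-wide mean, then rescale so the mean squared row norm equals s**2,
    which keeps the total pairwise squared distance roughly constant across layers."""
    Hc = H - H.mean(axis=0)                        # remove graph-wide mean
    scale = s / np.sqrt((Hc ** 2).sum(axis=1).mean() + eps)
    return scale * Hc
```

Applied after each graph-convolution layer, this counteracts oversmoothing: repeated neighborhood averaging shrinks distances between embeddings, and the rescaling restores them.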