Open Access · Proceedings Article
Inductive Representation Learning on Large Graphs
William L. Hamilton, Zhitao Ying, Jure Leskovec
Vol. 30, pp. 1024–1034
TLDR
GraphSAGE is a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings, rather than training an individual embedding for each node.
Abstract:
Low-dimensional embeddings of nodes in large graphs have proved extremely useful in a variety of prediction tasks, from content recommendation to identifying protein functions. However, most existing approaches require that all nodes in the graph are present during training of the embeddings; these previous approaches are inherently transductive and do not naturally generalize to unseen nodes. Here we present GraphSAGE, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings. Instead of training individual embeddings for each node, we learn a function that generates embeddings by sampling and aggregating features from a node's local neighborhood. Our algorithm outperforms strong baselines on three inductive node-classification benchmarks: we classify the category of unseen nodes in evolving information graphs based on citation and Reddit post data, and we show that our algorithm generalizes to completely unseen graphs using a multi-graph dataset of protein-protein interactions.
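The sample-and-aggregate idea from the abstract can be sketched as a single layer: for each node, sample a fixed-size neighbor set, average the sampled neighbors' features, and pass the concatenation of the node's own features and the aggregate through a learned weight with a nonlinearity. The sketch below uses a mean aggregator in plain NumPy; the graph, dimensions, sampling helper, and weight initialization are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

# Illustrative sketch of one GraphSAGE layer with a mean aggregator.
# The toy graph, dimensions, and random weights are assumptions for the demo.
rng = np.random.default_rng(0)

num_nodes, in_dim, out_dim = 6, 4, 3
features = rng.normal(size=(num_nodes, in_dim))  # node feature matrix X
neighbors = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4, 5], 4: [3], 5: [3]}

# Weight applied to the concatenation [h_v ; mean(h_u for u in N(v))]
W = rng.normal(size=(2 * in_dim, out_dim))

def sample_neighbors(v, k=2):
    """Uniformly sample up to k neighbors of v (a fixed-size sampled set)."""
    nbrs = neighbors[v]
    idx = rng.choice(len(nbrs), size=min(k, len(nbrs)), replace=False)
    return [nbrs[i] for i in idx]

def graphsage_layer(X):
    out = np.empty((num_nodes, out_dim))
    for v in range(num_nodes):
        sampled = sample_neighbors(v)
        agg = X[sampled].mean(axis=0)            # mean-aggregate neighbor features
        h = np.concatenate([X[v], agg]) @ W      # combine self + neighborhood
        out[v] = np.maximum(h, 0.0)              # ReLU nonlinearity
    # L2-normalize each embedding row
    norms = np.linalg.norm(out, axis=1, keepdims=True)
    return out / np.clip(norms, 1e-12, None)

embeddings = graphsage_layer(features)
print(embeddings.shape)  # (6, 3)
```

Because the layer is a function of features and sampled neighborhoods rather than a per-node lookup table, it can embed nodes that were never seen during training, which is the inductive property the abstract emphasizes. Stacking K such layers aggregates information from a node's K-hop neighborhood.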
Citations
Proceedings Article
Adaptive Structural Fingerprints for Graph Attention Networks
TL;DR: The ADSF model provides a platform for different subspaces of node features and different scales of graph structure to "cross-talk" with each other through learned multi-head attention, which is particularly useful for handling complex real-world data.
Posted Content
Marginalized Average Attentional Network for Weakly-Supervised Learning
TL;DR: It is proved that the MAA module, with learned latent discriminative probabilities, reduces the difference in responses between the most salient regions and the others; MAAN is therefore able to generate better class activation sequences and identify dense, integral action regions in videos.
Proceedings ArticleDOI
L2-GCN: Layer-Wise and Learned Efficient Training of Graph Convolutional Networks
TL;DR: A novel, efficient layer-wise training framework for GCNs (L-GCN) that disentangles feature aggregation from feature transformation during training, greatly reducing time and memory complexity; it is faster than state-of-the-art methods by at least an order of magnitude.
Journal ArticleDOI
A novel link prediction algorithm for protein-protein interaction networks by attributed graph embedding.
TL;DR: Presents a modified version of DeepWalk based on feature selection for link prediction in protein-protein interaction networks, benefiting from both network structure and protein features.
Posted Content
GraphFL: A Federated Learning Framework for Semi-Supervised Node Classification on Graphs.
TL;DR: This work proposes GraphFL, the first federated learning (FL) framework for semi-supervised node classification on graphs; it introduces two GraphFL methods to address the non-IID issue in graph data and to handle tasks with new label domains, and designs a self-training method to leverage unlabeled graph data.