Open Access
Posted Content
iPool -- Information-based Pooling in Hierarchical Graph Neural Networks
TL;DR
A parameter-free pooling operator, called iPool, is proposed that retains the most informative features in arbitrary graphs and achieves superior or competitive graph-classification performance on a collection of public graph benchmark datasets and superpixel-induced image graph datasets.

Abstract
With the advent of data science, the analysis of network or graph data has become a very timely research problem. A variety of recent works have been proposed to generalize neural networks to graphs, either from a spectral graph theory or a spatial perspective. The majority of these works, however, focus on adapting the convolution operator to graph representations. At the same time, the pooling operator also plays an important role in distilling multiscale and hierarchical representations, but it has been mostly overlooked so far. In this paper, we propose a parameter-free pooling operator, called iPool, that retains the most informative features in arbitrary graphs. With the argument that informative nodes dominantly characterize graph signals, we propose a criterion to evaluate the amount of information of each node given its neighbors, and theoretically demonstrate its relationship to neighborhood conditional entropy. This new criterion determines how nodes are selected and how coarsened graphs are constructed in the pooling layer. The resulting hierarchical structure yields an effective isomorphism-invariant representation of networked data with arbitrary topologies. The proposed strategy is evaluated in terms of graph classification on a collection of public graph datasets, including bioinformatics and social network datasets, and achieves state-of-the-art performance on most of them.
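The abstract describes the pooling pipeline — score each node by how much information it carries given its neighbors, keep the top scorers, and restrict the graph to them — without spelling out the criterion. As a toy illustration only, the sketch below scores each node by the distance between its feature and the mean of its neighbors' features (a simple stand-in for the paper's entropy-related criterion, not its exact formulation; the function name and ratio parameter are made up):

```python
import numpy as np

def ipool_sketch(adj, feats, ratio=0.5):
    """Toy information-based pooling: score each node by the distance
    between its feature and the mean of its neighbors' features, then
    keep the highest-scoring (most 'informative') nodes.

    adj:   (n, n) binary adjacency matrix without self-loops
    feats: (n, d) node feature matrix
    ratio: fraction of nodes to retain
    """
    deg = adj.sum(axis=1, keepdims=True)                 # node degrees
    deg[deg == 0] = 1                                    # avoid division by zero
    neigh_mean = (adj @ feats) / deg                     # neighborhood prediction
    scores = np.linalg.norm(feats - neigh_mean, axis=1)  # "information" score
    k = max(1, int(round(ratio * feats.shape[0])))
    keep = np.sort(np.argsort(-scores)[:k])              # indices of retained nodes
    # Coarsened graph: restrict adjacency and features to the kept nodes
    return adj[np.ix_(keep, keep)], feats[keep], keep
```

Stacking several such layers, each halving the node count, yields the kind of hierarchical, multiscale representation the abstract refers to.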
Citations
Posted Content
Hierarchical Graph Pooling with Structure Learning
TL;DR: A novel graph pooling operator, called Hierarchical Graph Pooling with Structure Learning (HGP-SL), is proposed that can be integrated into various graph neural network architectures and introduces a structure learning mechanism to learn a refined graph structure for the pooled graph at each layer.
Journal Article
Graph neural networks for materials science and chemistry
Patrick Reiser, Marlen Neubert, André Eberhard, Luca Torresi, Chen Zhou, Chen Shao, Houssam Metni, Clint van Hoesel, Henrik Schopmans, Timo Sommer, Pascal Friederich, et al.
TL;DR: Graph Neural Networks (GNNs) as mentioned in this paper are one of the fastest growing classes of machine learning models and play an increasingly important role in many areas of chemistry and materials science, being used to predict materials properties, accelerate simulations, design new structures, and predict synthesis routes of new materials.
Journal Article
Graph neural networks in TensorFlow-Keras with RaggedTensor representation (kgcnn)
TL;DR: This technical report presents an implementation of graph convolution and graph pooling layers for TensorFlow-Keras models, which allows a seamless and flexible integration into standard Keras layers to set up graph models in a functional way.
Journal Article
Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks
TL;DR: A comprehensive graph gradual pruning framework, termed CGP, is proposed, which designs a during-training graph pruning paradigm to dynamically prune GNNs within a single training process; it requires no re-training and significantly reduces computation costs.
Book Chapter
Masked Graph Auto-Encoder Constrained Graph Pooling
TL;DR: This work proposes a novel and accessible technique called Masked Graph Auto-encoder constrained Pooling (MGAP), which enables vanilla node drop pooling methods to retain sufficient effective graph information from both node-attribute and network-topology perspectives.
References
Posted Content
Semi-Supervised Classification with Graph Convolutional Networks
Thomas Kipf, Max Welling
TL;DR: A scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks operating directly on graphs; it outperforms related methods by a significant margin.
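The "efficient variant" referenced here is the well-known first-order GCN propagation rule, H^(l+1) = σ(D̂^(-1/2) Â D̂^(-1/2) H^(l) W^(l)) with Â = A + I. A minimal NumPy rendering of one such layer (function and variable names are ours, and a dense matrix is used purely for illustration; real implementations use sparse operations):

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    adj:    (n, n) adjacency matrix without self-loops
    feats:  (n, d_in) node feature matrix H
    weight: (d_in, d_out) learnable weight matrix W
    """
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))  # diagonal of D^-1/2
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm @ feats @ weight, 0.0)  # ReLU activation
```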
Automatic differentiation in PyTorch
Adam Paszke, Sam Gross, Soumith Chintala, Gregory Chanan, Edward Z. Yang, Zachary DeVito, Zeming Lin, Alban Desmaison, Luca Antiga, Adam Lerer, et al.
TL;DR: An automatic differentiation module of PyTorch is described — a library designed to enable rapid research on machine learning models that performs differentiation of purely imperative programs, with a focus on extensibility and low overhead.
Journal Article
A tutorial on spectral clustering
TL;DR: In this article, the authors present the most common spectral clustering algorithms, derive them from scratch via several different approaches, and discuss their advantages and disadvantages.
Posted Content
Inductive Representation Learning on Large Graphs
TL;DR: GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
Proceedings Article
Graph Attention Networks
TL;DR: Graph Attention Networks (GATs) as mentioned in this paper leverage masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
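The masked self-attention mechanism mentioned above computes, for each edge, a coefficient e_ij = LeakyReLU(a^T [W h_i ‖ W h_j]), normalizes it over each node's neighborhood with a softmax, and aggregates neighbor features with the resulting weights. A single-head sketch in NumPy (dense matrices and function names are ours, for illustration only):

```python
import numpy as np

def gat_layer(adj, feats, weight, attn):
    """Single-head graph attention: e_ij = LeakyReLU(a^T [W h_i || W h_j]),
    softmax-normalized over each node's neighborhood (self-loops included),
    followed by an attention-weighted sum of projected neighbor features.

    adj:    (n, n) adjacency matrix
    feats:  (n, d_in) node feature matrix
    weight: (d_in, d_out) projection matrix W
    attn:   (2 * d_out,) attention vector a
    """
    n = adj.shape[0]
    h = feats @ weight                            # project features: W h
    src = h @ attn[: h.shape[1]]                  # left half of a^T [.||.]
    dst = h @ attn[h.shape[1]:]                   # right half
    e = src[:, None] + dst[None, :]               # logits for all pairs (i, j)
    e = np.where(e > 0, e, 0.2 * e)               # LeakyReLU, slope 0.2
    mask = (adj + np.eye(n)) > 0                  # attend to neighbors + self
    e = np.where(mask, e, -np.inf)                # mask out non-edges
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)     # row-wise softmax
    return alpha @ h                              # weighted aggregation
```

Masking non-edges with -inf before the softmax is what makes the attention "masked": each node attends only to its own neighborhood, so the layer stays local like a graph convolution while learning per-edge weights.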