Open Access · Posted Content

Deep Graph Structure Learning for Robust Representations: A Survey

TLDR
In this article, the authors broadly review recent progress on Graph Structure Learning (GSL) methods for learning robust representations, point out open issues in current studies, and discuss future directions.
Abstract
Graph Neural Networks (GNNs) are widely used for analyzing graph-structured data. Most GNN methods are highly sensitive to the quality of graph structures and usually require a perfect graph structure for learning informative embeddings. However, the pervasiveness of noise in graphs necessitates learning robust representations for real-world problems. To improve the robustness of GNN models, many studies have been proposed around the central concept of Graph Structure Learning (GSL), which aims to jointly learn an optimized graph structure and corresponding representations. Towards this end, this survey broadly reviews recent progress on GSL methods for learning robust representations. Specifically, we first formulate a general paradigm of GSL, then review state-of-the-art methods classified by how they model graph structures, followed by applications that incorporate the idea of GSL in other graph tasks. Finally, we point out open issues in current studies and discuss future directions.
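The joint objective in this paradigm can be made concrete with a short sketch: a structure learner infers a normalized adjacency from node features, a GNN consumes it, and the task loss is combined with structure regularizers. This is a minimal illustration assuming a metric-based structure learner and GCN-style propagation; the names and hyperparameters (MetricGSL, alpha, beta, gamma) are invented for the example, not taken from the survey.

```python
# Minimal sketch of the general GSL paradigm: jointly learn a graph structure
# and node representations. All names here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetricGSL(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.embed = nn.Linear(in_dim, hid_dim)   # maps features to a metric space
        self.gcn1 = nn.Linear(in_dim, hid_dim)
        self.gcn2 = nn.Linear(hid_dim, n_classes)

    def learn_structure(self, x, a_orig, alpha=0.8):
        # Metric-based learner: cosine similarity between embeddings,
        # blended with the (possibly noisy) observed adjacency.
        z = F.normalize(self.embed(x), dim=-1)
        a_learned = torch.relu(z @ z.t())          # non-negative similarities
        a = alpha * a_orig + (1 - alpha) * a_learned
        deg = a.sum(-1).clamp(min=1e-6)
        return a / deg.unsqueeze(-1)               # row-normalized adjacency

    def forward(self, x, a_orig):
        a = self.learn_structure(x, a_orig)
        h = torch.relu(a @ self.gcn1(x))           # GCN-style propagation
        return a @ self.gcn2(h), a

# Joint objective: task loss plus structure regularizers (smoothness + sparsity).
def gsl_loss(logits, labels, a, x, beta=1e-3, gamma=1e-4):
    task = F.cross_entropy(logits, labels)
    smooth = (a * torch.cdist(x, x).pow(2)).mean()  # connected nodes stay similar
    sparse = a.abs().mean()
    return task + beta * smooth + gamma * sparse
```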


Citations
Proceedings ArticleDOI

Sequential Recommendation with Graph Neural Networks

TL;DR: Wang et al. propose SURGE (short for SeqUential Recommendation with Graph neural nEtworks), a graph neural network model that addresses two main challenges in sequential recommendation.
Posted ContentDOI

Attention-driven Graph Clustering Network

TL;DR: Zhang et al. propose an attention-driven graph clustering network (AGCN), which exploits a heterogeneity-wise fusion module to dynamically fuse node attribute features and topological graph features.
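As a rough illustration of what such a fusion module might compute, the sketch below learns per-node attention weights over an attribute embedding and a topological embedding and takes their weighted sum. The layer shapes and the AttentionFusion name are assumptions for the example, not AGCN's actual architecture.

```python
# Hedged sketch of attention-driven fusion of two feature sources.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFusion(nn.Module):
    """Learns per-node attention weights to fuse an attribute embedding
    (e.g. from an autoencoder) with a structural embedding (e.g. from a GCN)."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 2)  # one logit per source

    def forward(self, h_attr, h_topo):
        w = F.softmax(self.score(torch.cat([h_attr, h_topo], dim=-1)), dim=-1)
        return w[:, 0:1] * h_attr + w[:, 1:2] * h_topo

h_attr, h_topo = torch.randn(5, 16), torch.randn(5, 16)
fused = AttentionFusion(16)(h_attr, h_topo)  # shape (5, 16)
```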
Posted Content

SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks.

TL;DR: In this paper, the authors propose the Simultaneous Learning of Adjacency and GNN Parameters with Self-Supervision (SLAPS) method, which provides more supervision for inferring a graph structure through self-supervision.
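The sketch below illustrates the flavor of this idea: a graph generator is trained jointly with a GNN classifier and with a self-supervised task that reconstructs noised node features through the learned graph. Layer sizes, the 20% masking noise, and all class names are illustrative assumptions rather than SLAPS's exact design.

```python
# Hedged sketch of joint adjacency + GNN learning with a self-supervised
# feature-denoising task. Details are assumptions, not SLAPS's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gcn_layer(a, x, lin):
    return torch.relu(a @ lin(x))

class SLAPSSketch(nn.Module):
    def __init__(self, in_dim, hid, n_classes):
        super().__init__()
        self.gen = nn.Linear(in_dim, hid)          # graph generator
        self.cls1, self.cls2 = nn.Linear(in_dim, hid), nn.Linear(hid, n_classes)
        self.dae1, self.dae2 = nn.Linear(in_dim, hid), nn.Linear(hid, in_dim)

    def adjacency(self, x):
        z = F.normalize(self.gen(x), dim=-1)
        a = torch.relu(z @ z.t())
        return a / a.sum(-1, keepdim=True).clamp(min=1e-6)

    def forward(self, x):
        a = self.adjacency(x)
        logits = a @ self.cls2(gcn_layer(a, x, self.cls1))
        x_noisy = x * (torch.rand_like(x) > 0.2)   # mask ~20% of features
        x_rec = a @ self.dae2(gcn_layer(a, x_noisy, self.dae1))
        return logits, x_rec

# The denoising loss supervises the adjacency even where labels are scarce.
def slaps_loss(logits, y, x_rec, x, lam=1.0):
    return F.cross_entropy(logits, y) + lam * F.mse_loss(x_rec, x)
```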
Posted Content

An Empirical Study of Graph Contrastive Learning

TL;DR: In this paper, the authors identify several critical design considerations within a general GCL paradigm, including augmentation functions, contrasting modes, contrastive objectives, and negative mining techniques, and conduct extensive, controlled experiments over a set of benchmark tasks on datasets across various domains.
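Among these design dimensions, the contrastive objective is the easiest to show concretely: a common choice in GCL pipelines is the InfoNCE loss between two augmented views of the same nodes. This is a generic sketch, not the paper's benchmark code.

```python
# Generic InfoNCE contrastive objective between two augmented views.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, tau=0.5):
    """z1, z2: (N, d) embeddings of the same N nodes under two augmentations."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / tau                # (N, N) cosine similarities
    labels = torch.arange(z1.size(0))      # positives lie on the diagonal
    return F.cross_entropy(sim, labels)    # other nodes act as negatives

loss = info_nce(torch.randn(8, 32), torch.randn(8, 32))
```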
Posted Content

Learnt Sparsification for Interpretable Graph Neural Networks.

TL;DR: Kedge learns edge masks in a modular fashion; trained with any GNN, it allows for gradient-based end-to-end optimization and can prune a large proportion of the edges with only a minor effect on test accuracy.
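A minimal sketch of gradient-learned edge masks follows. Kedge's gates are stochastic; here a deterministic sigmoid gate with a mean sparsity penalty stands in as a simplifying assumption to keep the example short.

```python
# Hedged sketch of learnable edge masks; a deterministic sigmoid gate is a
# stand-in for Kedge's stochastic gating.
import torch
import torch.nn as nn

class EdgeMask(nn.Module):
    def __init__(self, n_edges):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(n_edges))  # one gate per edge

    def forward(self):
        return torch.sigmoid(self.logits)                 # soft mask in (0, 1)

    def sparsity_penalty(self):
        return torch.sigmoid(self.logits).mean()          # pushes gates toward 0

# During training: multiply each edge weight by its gate, add the penalty to
# the task loss, and afterwards drop edges whose gate falls below a threshold.
```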
References
Proceedings Article

Auto-Encoding Variational Bayes

TL;DR: Introduces a stochastic variational inference and learning algorithm that scales to large datasets and, under mild differentiability conditions, works even in the intractable case.
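The paper's key device is the reparameterization trick: a sample z ~ N(mu, sigma^2) is rewritten as z = mu + sigma * eps with eps ~ N(0, I), so the sample stays differentiable with respect to the variational parameters. A minimal VAE-style sketch:

```python
# Reparameterization trick and the closed-form Gaussian KL term of the ELBO.
import torch

def reparameterize(mu, logvar):
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)        # noise carries the stochasticity
    return mu + std * eps              # gradients flow through mu and std

def kl_to_standard_normal(mu, logvar):
    # KL(q(z|x) || N(0, I)) for a diagonal Gaussian q, in closed form.
    return -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
```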
Proceedings Article

Intriguing properties of neural networks

TL;DR: It is found that, under various methods of unit analysis, there is no distinction between individual high-level units and random linear combinations of high-level units, suggesting that it is the space, rather than the individual units, that contains the semantic information in the high layers of neural networks.
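A rough sketch of that unit-analysis comparison: retrieve the inputs that maximally excite a single unit (a natural-basis direction) versus those maximizing a random direction in the same activation space. The activations below are random placeholders standing in for a trained network.

```python
# Placeholder sketch of basis-direction vs random-direction unit analysis.
import torch

acts = torch.randn(1000, 256)                   # activations of 1000 inputs
unit_dir = torch.zeros(256); unit_dir[7] = 1.0  # natural basis: unit 7
rand_dir = torch.nn.functional.normalize(torch.randn(256), dim=0)

top_unit = torch.topk(acts @ unit_dir, k=5).indices  # inputs exciting the unit
top_rand = torch.topk(acts @ rand_dir, k=5).indices  # inputs exciting the direction
# The paper's finding: both input sets look equally semantically coherent.
```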
Proceedings ArticleDOI

Graph Attention Networks

TL;DR: Graph Attention Networks (GATs) leverage masked self-attention layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
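The attention mechanism itself is compact: e_ij = LeakyReLU(a^T [W h_i || W h_j]), softmax-normalized over each node's neighborhood, with non-edges masked out. A minimal single-head sketch (dimensions illustrative, adjacency assumed to include self-loops):

```python
# Minimal single-head GAT layer following the paper's attention formula.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Parameter(torch.randn(2 * out_dim))

    def forward(self, h, adj):                  # adj: (N, N) 0/1, self-loops set
        wh = self.W(h)                          # (N, out_dim)
        n = wh.size(0)
        pairs = torch.cat([wh.unsqueeze(1).expand(n, n, -1),
                           wh.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(pairs @ self.a, 0.2)   # (N, N) raw attention scores
        e = e.masked_fill(adj == 0, float('-inf'))  # mask non-neighbors
        alpha = torch.softmax(e, dim=-1)        # normalize over neighborhoods
        return alpha @ wh                       # attention-weighted aggregation
```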
Journal ArticleDOI

Dynamic Graph CNN for Learning on Point Clouds

TL;DR: This work proposes EdgeConv, a new neural network module suitable for CNN-based high-level tasks on point clouds, including classification and segmentation; EdgeConv acts on graphs dynamically computed in each layer of the network.
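The module's core operation: recompute a k-NN graph from the current features at every layer, then aggregate learned edge features over each point's neighbors. A minimal sketch with an illustrative MLP width and k:

```python
# Minimal EdgeConv sketch: per-layer k-NN graph plus edge-feature aggregation.
import torch
import torch.nn as nn

def knn(x, k):
    d = torch.cdist(x, x)                        # (N, N) pairwise distances
    return d.topk(k + 1, largest=False).indices[:, 1:]  # drop self-match

class EdgeConv(nn.Module):
    def __init__(self, in_dim, out_dim, k=4):
        super().__init__()
        self.k = k
        self.mlp = nn.Sequential(nn.Linear(2 * in_dim, out_dim), nn.ReLU())

    def forward(self, x):                        # x: (N, in_dim)
        idx = knn(x, self.k)                     # graph recomputed per layer
        neighbors = x[idx]                       # (N, k, in_dim)
        center = x.unsqueeze(1).expand_as(neighbors)
        edge = torch.cat([center, neighbors - center], dim=-1)
        return self.mlp(edge).max(dim=1).values  # max-aggregate over neighbors
```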
Journal ArticleDOI

Variational Inference: A Review for Statisticians

TL;DR: Variational inference approximates probability densities through optimization; mean-field variational inference, for instance, is used in many applications and tends to be faster than classical methods such as Markov chain Monte Carlo sampling.
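The optimization at the heart of the review is maximization of the evidence lower bound (ELBO) over a tractable family of densities; for the mean-field family it reads:

```latex
% The ELBO that mean-field variational inference maximizes over q:
\mathrm{ELBO}(q) = \mathbb{E}_{q(z)}[\log p(x, z)] - \mathbb{E}_{q(z)}[\log q(z)],
\qquad q(z) = \prod_{j} q_j(z_j)
% Maximizing ELBO(q) is equivalent to minimizing KL(q(z) \| p(z \mid x)).
```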