Open Access · Posted Content

CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning.

TLDR
A novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques, which gains over 7% improvement in node clustering accuracy over state-of-the-art methods.
Abstract
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision while preserving graph topological structures and node attributive features. Previous graph neural networks (GNNs) require a large number of labeled nodes, which may not be available in real-world graph data. In this paper, we present a novel cluster-aware graph neural network (CAGNN) model for unsupervised graph representation learning using self-supervised techniques. In CAGNN, we perform clustering on the node embeddings and update the model parameters by predicting the cluster assignments. Moreover, we observe that graphs often contain inter-class edges, which mislead the GNN model into aggregating noisy information from neighboring nodes. We further refine the graph topology by strengthening intra-class edges and reducing node connections between different classes based on the cluster labels, which better preserves cluster structures in the embedding space. We conduct comprehensive experiments on two benchmark tasks using real-world datasets. The results demonstrate the superior performance of the proposed model over existing baseline methods. Notably, our model achieves over 7% improvement in node clustering accuracy over state-of-the-art methods.
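As a rough illustration of the two ideas in the abstract (cluster-assignment prediction as a self-supervised task, and cluster-based edge refinement), the following is a minimal sketch assuming a two-layer GCN encoder, k-means clustering, and a dense adjacency matrix; the helper names (GCNEncoder, refine_edges, train_step) and all hyperparameters are ours, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): cluster current embeddings,
# use the assignments as pseudo-labels for a prediction loss, and drop
# inter-cluster edges so that aggregation stays within clusters.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class GCNEncoder(nn.Module):
    """Two-layer GCN over a normalized dense adjacency matrix A_hat."""
    def __init__(self, in_dim, hid_dim, emb_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, emb_dim)

    def forward(self, A_hat, X):
        h = torch.relu(A_hat @ self.w1(X))
        return A_hat @ self.w2(h)

def refine_edges(A, labels):
    """Keep intra-cluster edges plus self-loops; treat inter-cluster
    edges as noisy and remove them (renormalize A afterwards)."""
    same = (labels[:, None] == labels[None, :]).astype(A.dtype)
    return A * same + np.eye(A.shape[0], dtype=A.dtype)

def train_step(encoder, classifier, optimizer, A_hat, X, n_clusters):
    Z = encoder(A_hat, X)
    # Pseudo-labels are recomputed from the current embeddings each step.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
        Z.detach().cpu().numpy())
    loss = nn.functional.cross_entropy(
        classifier(Z), torch.as_tensor(labels, dtype=torch.long))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return labels, loss.item()
```

Here `classifier` would be, for example, a linear layer mapping embeddings to `n_clusters` logits; the pseudo-labels returned by `train_step` can then be fed into `refine_edges` before the next round of training.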


Citations
Posted Content

Graph Self-Supervised Learning: A Survey

TL;DR: In this paper, the authors present a comprehensive review of graph self-supervised learning (SSL) techniques for graph data and present a unified framework that mathematically formalizes the paradigm of graph SSL.
Posted Content

Graph Neural Networks: Taxonomy, Advances and Trends.

Yu Zhou, +2 more
16 Dec 2020
TL;DR: A novel taxonomy for graph neural networks is provided, and up to 327 relevant papers are surveyed to show the panorama of graph neural networks.
Journal Article

Graph Self-Supervised Learning: A Survey

TL;DR: In this paper, the authors present a comprehensive review of graph self-supervised learning (SSL) techniques for graph data and present a unified framework that mathematically formalizes the paradigm of graph SSL.
Posted Content

Self-supervised Learning on Graphs: Contrastive, Generative, or Predictive.

TL;DR: Self-supervised learning (SSL) is emerging as a new paradigm for extracting informative knowledge through well-designed pretext tasks without relying on manual labels; existing graph SSL methods can be classified into three categories: contrastive, generative, and predictive.
Journal Article

Low Complexity Recruitment for Collaborative Mobile Crowdsourcing Using Graph Neural Networks

TL;DR: In this paper, a low-complexity collaborative mobile crowdsourcing (CMCS) recruitment approach is proposed that relies on Graph Neural Networks (GNNs), specifically graph embedding and clustering techniques, to shrink the workers' search space, and then exploits a meta-heuristic genetic algorithm to select appropriate workers.
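As a hedged illustration of the pipeline summarized above (graph embedding, clustering to shrink the candidate pool, then a genetic algorithm for team selection), the sketch below uses spectral embedding and k-means as stand-ins; all function names, the fitness definition, and the skill-matrix inputs are assumptions, not the paper's method.

```python
# Illustrative pipeline only: embed the worker graph, keep the cluster
# best matching the task, then run a toy genetic algorithm over it.
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def shrink_search_space(affinity, skills, task_skills, n_clusters=4):
    """affinity: (N, N) worker-graph affinity matrix; skills: (N, S)."""
    emb = SpectralEmbedding(n_components=8,
                            affinity="precomputed").fit_transform(affinity)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(emb)
    match = [skills[labels == c].mean(axis=0) @ task_skills
             for c in range(n_clusters)]
    return np.where(labels == int(np.argmax(match)))[0]   # candidate pool

def genetic_select(candidates, skills, task_skills, team_size=5,
                   pop_size=30, generations=50):
    """Toy GA: an individual is a team (array of worker indices)."""
    def fitness(team):
        return skills[team].sum(axis=0) @ task_skills
    pop = [rng.choice(candidates, team_size, replace=False)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]                   # selection
        offspring = []
        for team in survivors:
            child = team.copy()                            # mutation: swap one worker
            child[rng.integers(team_size)] = rng.choice(
                np.setdiff1d(candidates, child))
            offspring.append(child)
        pop = survivors + offspring
    return max(pop, key=fitness)
```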
References
Journal Article

Visualizing Data using t-SNE

TL;DR: A new technique called t-SNE visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; it is a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
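For context, a minimal t-SNE usage sketch with scikit-learn (random data stands in for learned embeddings; the parameters are illustrative):

```python
# Project high-dimensional embeddings to 2-D with t-SNE and plot them.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

Z = np.random.rand(500, 64)                     # stand-in for node embeddings
Z_2d = TSNE(n_components=2, perplexity=30.0, init="pca",
            random_state=0).fit_transform(Z)
plt.scatter(Z_2d[:, 0], Z_2d[:, 1], s=5)
plt.title("t-SNE projection")
plt.show()
```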
Proceedings Article

Distributed Representations of Words and Phrases and their Compositionality

TL;DR: This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.
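A small gensim-based sketch of the same ingredients (phrase detection plus skip-gram with negative sampling); the toy sentences and hyperparameters are ours, assuming gensim >= 4:

```python
# Phrase detection followed by skip-gram with negative sampling (sg=1, negative=5).
from gensim.models import Word2Vec
from gensim.models.phrases import Phrases, Phraser

sentences = [["graph", "neural", "networks", "learn", "node", "embeddings"],
             ["random", "walks", "are", "treated", "as", "sentences"]]
bigram = Phraser(Phrases(sentences, min_count=1, threshold=1))
phrased = [bigram[s] for s in sentences]         # merge frequent bigrams
model = Word2Vec(phrased, vector_size=64, window=5, sg=1,
                 negative=5, min_count=1, epochs=50)
vec = model.wv["graph"]                          # learned word vector
```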
Posted Content

Semi-Supervised Classification with Graph Convolutional Networks

TL;DR: A scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operate directly on graphs, which outperforms related methods by a significant margin.
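The core propagation rule of this model, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), can be written compactly; the dense NumPy sketch below is for illustration only:

```python
# One GCN layer with the renormalization trick, in dense NumPy form.
import numpy as np

def gcn_layer(A, H, W):
    A_tilde = A + np.eye(A.shape[0])             # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_hat @ H @ W, 0.0)        # ReLU activation
```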
Proceedings Article

DeepWalk: online learning of social representations

TL;DR: DeepWalk uses local information obtained from truncated random walks to learn latent representations by treating walks as the equivalent of sentences; the learned representations encode social relations in a continuous vector space that is easily exploited by statistical models.
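A minimal sketch of the two stages (truncated random walks, then skip-gram over the walks treated as sentences); the hyperparameters and the adjacency-dict format are assumptions:

```python
# Generate truncated random walks and feed them to skip-gram as "sentences".
import random
from gensim.models import Word2Vec

def random_walk(adj, start, length=40):
    """adj: dict mapping each node to a list of its neighbors."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj[walk[-1]]
        if not nbrs:
            break
        walk.append(random.choice(nbrs))
    return [str(v) for v in walk]

def deepwalk_embeddings(adj, walks_per_node=10, length=40, dim=64):
    walks = [random_walk(adj, v, length)
             for _ in range(walks_per_node) for v in adj]
    model = Word2Vec(walks, vector_size=dim, window=5, sg=1,
                     negative=5, min_count=1)
    return {v: model.wv[str(v)] for v in adj}
```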
Posted Content

Inductive Representation Learning on Large Graphs

TL;DR: GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
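A rough sketch of one GraphSAGE-style layer with neighbor sampling and mean aggregation (not the reference implementation; node ids are assumed to be 0..N-1):

```python
# Sample a fixed number of neighbors, mean-aggregate their features, and
# combine with the node's own representation, then L2-normalize.
import numpy as np

def sage_mean_layer(adj, H, W_self, W_neigh, sample_size=10, seed=0):
    """adj: dict node -> list of neighbors; H: (N, d); W_*: (d, d_out)."""
    rng = np.random.default_rng(seed)
    out = np.zeros((H.shape[0], W_self.shape[1]))
    for v, nbrs in adj.items():
        if nbrs:
            sampled = rng.choice(nbrs, size=min(sample_size, len(nbrs)),
                                 replace=False)
            neigh = H[sampled].mean(axis=0)
        else:
            neigh = np.zeros(H.shape[1])
        out[v] = np.maximum(H[v] @ W_self + neigh @ W_neigh, 0.0)
    return out / np.maximum(np.linalg.norm(out, axis=1, keepdims=True), 1e-12)
```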