Journal ArticleDOI

Multi-label graph node classification with label attentive neighborhood convolution

TLDR
This paper proposes an intuitive yet effective graph convolution module that aggregates the local attribute information of a given node to obtain rational node feature representations, and builds a label-aware representation learning framework that measures the compatibility between pairs of node embeddings and label embeddings.
Abstract
Learning with graph-structured data is of great significance for many practical applications. A crucial and fundamental task in graph learning is node classification. In reality, graph nodes are often encoded with various attributes. In addition, the task is usually multi-labeled in nature. In this paper, we tackle the problem of multi-label graph node classification by leveraging structure, attribute and label information simultaneously. Specifically, to obtain rational node feature representations, we propose an intuitive yet effective graph convolution module to aggregate local attribute information of a given node. Moreover, the homophily hypothesis motivates us to build a label attention module. By exploiting both input and output contextual representations, we utilize the additive attention mechanism and build a label-aware representation learning framework to measure the compatibility between pairs of node embeddings and label embeddings. The proposed novel neural network-based, multi-label classification method has been verified by extensive experiments conducted on five publicly available benchmark datasets, including both attributed and non-attributed networks. The results demonstrate the effectiveness of the proposed model with respect to micro-F1, macro-F1 and Hamming loss, compared with several state-of-the-art methods, including two relational neighbor classifiers and several popular graph neural network models.
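The two ingredients described in the abstract — mean-style aggregation of neighborhood attributes, followed by additive attention between node embeddings and label embeddings — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation; all weight matrices, dimensions and names here are hypothetical stand-ins for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, adjacency with self-loops, 5-dim attributes, 3 labels.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
X = rng.normal(size=(4, 5))   # node attribute matrix
L = rng.normal(size=(3, 8))   # label embeddings (learned in the paper; random here)

# Step 1: neighborhood convolution -- mean-aggregate local attribute information.
deg = A.sum(axis=1, keepdims=True)
H = (A @ X) / deg             # (4, 5) aggregated node features

# Project node features into the label-embedding space (hypothetical linear map).
W = rng.normal(size=(5, 8))
Z = H @ W                     # (4, 8) node embeddings

# Step 2: additive attention measuring node-label compatibility:
# score[i, k] = v . tanh(Wz z_i + Wl l_k)
Wz = rng.normal(size=(8, 8))
Wl = rng.normal(size=(8, 8))
v = rng.normal(size=(8,))

zn = Z @ Wz.T                 # (4, 8)
ln = L @ Wl.T                 # (3, 8)
scores = np.tanh(zn[:, None, :] + ln[None, :, :]) @ v   # (4, 3) compatibilities

# Multi-label output: independent sigmoid per label, not a softmax over labels.
probs = 1.0 / (1.0 + np.exp(-scores))
```

The per-label sigmoid (rather than a softmax) is what makes the setup multi-label: each node can be assigned any subset of the three labels.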


Citations
Journal ArticleDOI

Sentiment classification using attention mechanism and bidirectional long short-term memory network

TL;DR: This study applies an attention mechanism within a deep learning network to analyze large-scale social media data and demonstrates the efficacy of the proposed sentiment classification method.
Proceedings ArticleDOI

Semi-supervised Multi-label Learning for Graph-structured Data

TL;DR: Zhang et al. incorporate label embeddings into the proposed model to capture both network topology and higher-order multi-label correlations, significantly outperforming state-of-the-art models for node-level classification.
Proceedings ArticleDOI

Capsule Graph Neural Networks with EM Routing

TL;DR: CapsGNNEM uses the EM routing mechanism to generate high-quality graph embeddings and outperforms nine state-of-the-art models on graph classification tasks.
Journal ArticleDOI

Multi-label Node Classification On Graph-Structured Data

TL;DR: In this paper, a multi-label graph generator is proposed to produce datasets with tunable properties for multi-label node classification, and a new approach dynamically fuses feature and label correlation information to learn label-informed representations.
References
Posted Content

Neural Machine Translation by Jointly Learning to Align and Translate

TL;DR: In this paper, the authors propose a model that soft-searches for the parts of a source sentence relevant to predicting a target word, without having to form these parts as explicit hard segments.
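The additive (Bahdanau-style) attention referenced here and reused in the main paper scores each encoder state against the current decoder state, then forms a context vector as the attention-weighted sum. A minimal sketch, with all weights randomly initialized as hypothetical stand-ins for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
T, dh, ds = 5, 4, 4          # source length, annotation dim, decoder-state dim

h = rng.normal(size=(T, dh))  # encoder annotations h_1..h_T
s = rng.normal(size=(ds,))    # current decoder state

# Additive attention: e_t = v . tanh(Wa s + Ua h_t)
Wa = rng.normal(size=(dh, ds))
Ua = rng.normal(size=(dh, dh))
v = rng.normal(size=(dh,))

e = np.tanh(h @ Ua.T + Wa @ s) @ v      # (T,) alignment energies
a = np.exp(e - e.max())
a /= a.sum()                            # softmax -> attention weights over positions
c = a @ h                               # context vector: weighted sum of annotations
```

The "soft search" is exactly the weight vector `a`: every source position contributes, but relevant positions dominate, so no hard segmentation is needed.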
Proceedings ArticleDOI

Convolutional Neural Networks for Sentence Classification

TL;DR: The proposed CNN models, which allow the use of both task-specific and static word vectors, improve upon the state of the art on 4 out of 7 tasks, including sentiment analysis and question classification.
Journal ArticleDOI

A Comprehensive Survey on Graph Neural Networks

TL;DR: This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy that divides state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Proceedings Article

Convolutional neural networks on graphs with fast localized spectral filtering

TL;DR: This work presents a formulation of CNNs in the context of spectral graph theory, which provides the necessary mathematical background and efficient numerical schemes to design fast localized convolutional filters on graphs.
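The fast localized filters referenced here approximate a spectral graph filter with a truncated Chebyshev polynomial of the rescaled Laplacian, which keeps each filter K-1-hop localized and avoids an eigendecomposition. A minimal sketch on a toy path graph; the filter coefficients `theta` are hypothetical (they would be learned):

```python
import numpy as np

# Toy path graph 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
Lap = np.eye(4) - A / np.sqrt(np.outer(d, d))   # normalized Laplacian D^-1/2 A D^-1/2

lmax = np.linalg.eigvalsh(Lap).max()
Lt = 2.0 * Lap / lmax - np.eye(4)               # rescale spectrum into [-1, 1]

x = np.array([1.0, 0.0, 0.0, 0.0])              # graph signal: impulse at node 0

# Chebyshev recursion: T_0(Lt)x = x, T_1(Lt)x = Lt x, T_k = 2 Lt T_{k-1} - T_{k-2}
K = 3
Tx = [x, Lt @ x]
for _ in range(2, K):
    Tx.append(2.0 * Lt @ Tx[-1] - Tx[-2])

theta = np.array([0.5, 0.3, 0.2])               # hypothetical learned coefficients
y = sum(c * t for c, t in zip(theta, Tx))       # filtered signal, (K-1)-hop support
```

With K = 3 the filter's support is 2 hops, so the impulse at node 0 cannot reach node 3 (3 hops away) — which is exactly the localization property the TL;DR describes.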