Enhancing Graph Neural Network-based Fraud Detectors against Camouflaged Fraudsters
Yingtong Dou, Zhiwei Liu, Li Sun, Yutong Deng, Hao Peng, Philip S. Yu
pp. 315-324
TL;DR: This paper proposes a new model named CAmouflage-REsistant GNN (CARE-GNN) to enhance the GNN aggregation process with three unique modules against camouflages.
Abstract:
Graph Neural Networks (GNNs) have been widely applied to fraud detection problems in recent years, revealing the suspiciousness of nodes by aggregating their neighborhood information via different relations. However, few prior works have noticed the camouflage behavior of fraudsters, which could hamper the performance of GNN-based fraud detectors during the aggregation process. In this paper, we introduce two types of camouflages based on recent empirical studies, i.e., the feature camouflage and the relation camouflage. Existing GNNs have not addressed these two camouflages, which results in their poor performance in fraud detection problems. Alternatively, we propose a new model named CAmouflage-REsistant GNN (CARE-GNN), to enhance the GNN aggregation process with three unique modules against camouflages. Concretely, we first devise a label-aware similarity measure to find informative neighboring nodes. Then, we leverage reinforcement learning (RL) to find the optimal amounts of neighbors to be selected. Finally, the selected neighbors across different relations are aggregated together. Comprehensive experiments on two real-world fraud datasets demonstrate the effectiveness of the RL algorithm. The proposed CARE-GNN also outperforms state-of-the-art GNNs and GNN-based fraud detectors. We integrate all GNN-based fraud detectors into an open-source toolbox: https://github.com/safe-graph/DGFraud. The CARE-GNN code and datasets are available at https://github.com/YingtongDou/CARE-GNN.
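The neighbor-filtering step described in the abstract can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: the similarity form (based on the distance between label-prediction scores) and the top-p selection with `select_neighbors` are assumptions inferred from the description above, and the RL tuning of the threshold p is omitted.

```python
import numpy as np

def label_aware_similarity(center_score, neighbor_scores):
    # Similarity between a center node and its neighbors, measured on
    # label-prediction scores: close scores => high similarity.
    return 1.0 - np.abs(neighbor_scores - center_score)

def select_neighbors(center_score, neighbor_scores, p):
    # Keep the top-p fraction of most similar neighbors under one
    # relation; in CARE-GNN this threshold is tuned per relation by RL.
    sims = label_aware_similarity(center_score, neighbor_scores)
    k = max(1, int(np.ceil(p * len(neighbor_scores))))
    return np.argsort(-sims)[:k]
```

The selected neighbors from each relation would then be fed to a standard aggregator; the key point is that dissimilar (likely camouflaged) neighbors never enter the aggregation.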
Citations
Posted Content
Graph Neural Networks with Heterophily
TL;DR: The proposed framework incorporates an interpretable compatibility matrix for modeling the heterophily or homophily level in the graph, which can be learned in an end-to-end fashion, enabling the model to go beyond the assumption of strong homophily.
Proceedings ArticleDOI
Pick and Choose: A GNN-based Imbalanced Learning Approach for Fraud Detection
TL;DR: Proposes a Pick and Choose Graph Neural Network (PC-GNN) for imbalanced supervised learning on graphs, in which nodes and edges are picked with a devised label-balanced sampler to construct sub-graphs for mini-batch training.
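A label-balanced sampler of the kind mentioned here could look like the sketch below. The inverse-frequency weighting is an assumed form chosen for illustration, and `label_balanced_sample` is a hypothetical helper name, not PC-GNN's actual sampler.

```python
import numpy as np

def label_balanced_sample(labels, n_samples, rng=None):
    # Pick each node with probability inversely proportional to its
    # class frequency, so minority (e.g., fraud) nodes are
    # over-represented in the resulting mini-batch sub-graphs.
    rng = rng or np.random.default_rng(0)
    classes, counts = np.unique(labels, return_counts=True)
    freq = dict(zip(classes, counts))
    w = np.array([1.0 / freq[y] for y in labels], dtype=float)
    w /= w.sum()
    return rng.choice(len(labels), size=n_samples, replace=True, p=w)
```

With a 9:1 class imbalance, this weighting draws roughly equal numbers from both classes.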
Proceedings ArticleDOI
Few-shot Network Anomaly Detection via Cross-network Meta-learning
TL;DR: Proposes a new family of graph neural networks (GDN) that leverage a small number of labeled anomalies to enforce statistically significant deviations between abnormal and normal nodes on a network, and equips the proposed GDN with a new cross-network meta-learning algorithm that realizes few-shot network anomaly detection by transferring meta-knowledge from multiple auxiliary networks.
Proceedings ArticleDOI
ConsisRec: Enhancing GNN for Social Recommendation via Consistent Neighbor Aggregation
TL;DR: Proposes to sample consistent neighbors by relating the sampling probability to consistency scores between neighbors, and employs a relation attention mechanism to assign consistent relations high importance factors during aggregation.
Proceedings ArticleDOI
Dynamic Graph Collaborative Filtering
TL;DR: Proposes Dynamic Graph Collaborative Filtering (DGCF), a framework that leverages dynamic graphs to capture collaborative and sequential relations of both items and users simultaneously.
References
Posted Content
Semi-Supervised Classification with Graph Convolutional Networks
Thomas Kipf, Max Welling
TL;DR: Presents a scalable approach for semi-supervised learning on graph-structured data, based on an efficient variant of convolutional neural networks that operates directly on graphs and outperforms related methods by a significant margin.
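The propagation rule behind this approach can be sketched in a few lines of NumPy: add self-loops, symmetrically normalize the adjacency matrix, then mix features and apply a nonlinearity. This is an illustrative single dense layer (ReLU assumed), not the authors' sparse implementation.

```python
import numpy as np

def gcn_layer(A, H, W):
    # Renormalization trick: A_hat = A + I, then D^{-1/2} A_hat D^{-1/2}
    # averages each node's features with its neighbors' before the
    # learned linear transform W.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)
```

Stacking two such layers gives each node a view of its two-hop neighborhood, which is the depth used for the paper's benchmarks.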
Posted Content
Inductive Representation Learning on Large Graphs
TL;DR: GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
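The core of the inductive framework is an aggregate-then-transform step over sampled neighbors. The sketch below shows the mean-aggregator variant; note the paper concatenates the self and neighborhood representations, while summation is used here for brevity, so this is an assumption-laden simplification.

```python
import numpy as np

def sage_mean_layer(h_self, h_neigh, W_self, W_neigh):
    # Average the sampled neighbors' features, transform self and
    # neighborhood separately, combine, and apply a nonlinearity.
    agg = h_neigh.mean(axis=0)
    return np.maximum(W_self @ h_self + W_neigh @ agg, 0.0)
```

Because the weights act on features rather than on a fixed graph, the same trained layer applies to nodes unseen at training time, which is what makes the framework inductive.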
Journal ArticleDOI
A Comprehensive Survey on Graph Neural Networks
TL;DR: This article provides a comprehensive overview of graph neural networks (GNNs) in the data mining and machine learning fields and proposes a new taxonomy dividing the state-of-the-art GNNs into four categories, namely, recurrent GNNs, convolutional GNNs, graph autoencoders, and spatial-temporal GNNs.
Book ChapterDOI
Modeling Relational Data with Graph Convolutional Networks
Michael Sejr Schlichtkrull, Thomas Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling
TL;DR: It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
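The encoder's message passing can be sketched as follows: each message is transformed by a weight matrix specific to the edge's relation, normalized by the per-relation in-degree, and summed with a self-loop transform. This is a minimal dense sketch with an assumed edge-list format, omitting the paper's basis-decomposition regularization.

```python
import numpy as np

def rgcn_layer(h, edges_by_rel, W_rel, W_self):
    # h: (num_nodes, dim) features; edges_by_rel: {relation: [(src, dst)]};
    # W_rel: {relation: weight matrix}; W_self: self-loop weight matrix.
    out = h @ W_self.T
    for rel, edges in edges_by_rel.items():
        W = W_rel[rel]
        # Normalization constant c_{i,r}: in-degree under this relation.
        deg = np.zeros(h.shape[0])
        for _, dst in edges:
            deg[dst] += 1.0
        for src, dst in edges:
            out[dst] += (h[src] @ W.T) / deg[dst]
    return np.maximum(out, 0.0)
```

Keeping a separate weight matrix per relation is what lets the encoder accumulate relation-specific evidence over multiple steps in a multi-relational graph.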
Posted Content
Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour
Priya Goyal, Piotr Dollár, Ross Girshick, Pieter Noordhuis, Lukasz Wesolowski, Aapo Kyrola, Andrew Tulloch, Yangqing Jia, Kaiming He
TL;DR: This paper empirically shows that on the ImageNet dataset large minibatches cause optimization difficulties, but that, once these are addressed, the trained networks exhibit good generalization, enabling visual recognition models to be trained on internet-scale data with high efficiency.
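The two remedies the paper proposes for large-minibatch optimization, the linear learning-rate scaling rule and gradual warmup, can be sketched as simple schedules. The reference batch size of 256 matches the paper's setup; the function names are illustrative.

```python
def scaled_lr(base_lr, batch_size, base_batch=256):
    # Linear scaling rule: when the minibatch grows by a factor k,
    # multiply the learning rate by k as well.
    return base_lr * batch_size / base_batch

def warmup_lr(target_lr, step, warmup_steps):
    # Gradual warmup: ramp the learning rate linearly from a small
    # value to the target over the first few epochs, avoiding early
    # instability at large batch sizes.
    if step >= warmup_steps:
        return target_lr
    return target_lr * (step + 1) / warmup_steps
```

For example, a base rate of 0.1 at batch size 256 scales to 3.2 at batch size 8192, the configuration used for the one-hour ImageNet result.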