Journal ArticleDOI

Federated Dynamic Graph Neural Networks with Secure Aggregation for Video-based Distributed Surveillance

Meng Jiang, +3 more
- 04 Feb 2022 - 
- Vol. 13, Iss: 4, pp 1-23
TLDR
This work introduces Federated Dynamic Graph Neural Network (Feddy), a distributed and secured framework that learns object representations from graph sequences by aggregating structural information from nearby objects in the current graph as well as dynamic information from those in the previous graph.
Abstract
Distributed surveillance systems have the ability to detect, track, and snapshot objects moving around in a certain space. The systems generate video data from multiple personal devices or street cameras. Intelligent video-analysis models are needed to learn dynamic representations of the objects for detection and tracking. Can we exploit the structural and dynamic information without storing the spatiotemporal video data at a central server, which would violate user privacy? In this work, we introduce Federated Dynamic Graph Neural Network (Feddy), a distributed and secured framework to learn object representations from graph sequences: (1) It aggregates structural information from nearby objects in the current graph as well as dynamic information from those in the previous graph, and it uses a self-supervised loss of predicting the trajectories of objects. (2) It is trained in a federated learning manner. The centrally located server sends the model to user devices. Local models on the respective user devices learn and periodically send their learning to the central server without ever exposing the user’s data to the server. (3) Studies showed that the aggregated parameters could be inspected once decrypted when broadcast to clients for model synchronization, after the server performs a weighted average. We design an appropriate aggregation mechanism built on secure aggregation primitives that protects security and privacy in federated learning with scalability. Experiments on four video camera datasets as well as simulation demonstrate that Feddy achieves strong effectiveness and security.
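As a rough illustration of the aggregation described in point (1), the sketch below combines structural messages from neighbors in the current graph with dynamic messages from neighbors in the previous graph, then scores a self-supervised trajectory-prediction loss. All names, shapes, and the tanh/MSE choices are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate(h_curr, h_prev, adj_curr, adj_prev, W_s, W_d):
    """One Feddy-style layer (sketch): each object's new representation
    combines structural messages from neighbors in the current graph
    and dynamic messages from neighbors in the previous graph."""
    structural = adj_curr @ h_curr @ W_s   # nearby objects, current frame
    dynamic    = adj_prev @ h_prev @ W_d   # objects in the previous frame
    return np.tanh(structural + dynamic)

def trajectory_loss(h, W_out, next_positions):
    """Self-supervised loss: predict each object's next (x, y) position
    from its representation, scored with mean squared error."""
    pred = h @ W_out
    return float(np.mean((pred - next_positions) ** 2))

n, d = 4, 8                                       # 4 objects, 8-dim features
h_prev = rng.normal(size=(n, d))
h_curr = rng.normal(size=(n, d))
adj = np.eye(n) + np.roll(np.eye(n), 1, axis=1)   # toy neighborhood graph
adj = adj / adj.sum(axis=1, keepdims=True)        # row-normalize
W_s, W_d = rng.normal(size=(d, d)), rng.normal(size=(d, d))
W_out = rng.normal(size=(d, 2))

h_new = aggregate(h_curr, h_prev, adj, adj, W_s, W_d)
loss = trajectory_loss(h_new, W_out, rng.normal(size=(n, 2)))
```

In the full system, one such layer would run per video frame, and the weight matrices would be the parameters exchanged (in protected form) during federated training.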



Citations
Journal ArticleDOI

Trustworthy Graph Neural Networks: Aspects, Methods and Trends

TL;DR: Proposes a comprehensive roadmap for building trustworthy GNNs from the view of the various computing technologies involved, covering robustness, explainability, privacy, fairness, accountability, and environmental well-being.
Proceedings ArticleDOI

Graph Rationalization with Environment-based Augmentations

TL;DR: Introduces a new augmentation operation, environment replacement, that automatically creates virtual data examples to improve rationale identification. Rationale-environment separation and representation learning are performed on the real and augmented examples in latent spaces, avoiding the high complexity of explicit graph decoding and encoding.
Journal Article

Federated Graph Neural Networks: Overview, Techniques and Challenges

Ruiqiang Li, +1 more
- 15 Feb 2022 - 
TL;DR: A unique 3-tiered taxonomy of the FedGNNs literature is proposed to provide a clear view into how GNNs work in the context of Federated Learning (FL), which puts existing works into perspective.
Journal ArticleDOI

Federated Learning on Non-IID Graphs via Structural Knowledge Sharing

TL;DR: FedStar is an FGL framework that extracts and shares the common underlying structural information for inter-graph federated learning tasks, capturing more structure-based domain-invariant information and avoiding feature misalignment issues.
Journal ArticleDOI

Federated Graph Machine Learning: A Survey of Concepts, Techniques, and Applications

TL;DR: This survey conducts a comprehensive review of the literature in Federated Graph Machine Learning and provides a new taxonomy to divide the existing problems in FGML into two settings, namely, FL with structured data and structured FL.
References
Posted Content

Inductive Representation Learning on Large Graphs

TL;DR: GraphSAGE is presented, a general, inductive framework that leverages node feature information (e.g., text attributes) to efficiently generate node embeddings for previously unseen data and outperforms strong baselines on three inductive node-classification benchmarks.
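To make this TL;DR concrete, here is a minimal sketch of a GraphSAGE-style mean-aggregator layer: sample a fixed number of neighbors per node, mean-pool their features, combine with the node's own features, then apply a nonlinearity and L2 normalization. Function and variable names are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(1)

def sage_mean_layer(h, neighbors, W_self, W_neigh, sample_size=2):
    """One GraphSAGE-style layer (sketch): sample a fixed-size neighbor
    set, mean-pool the sampled features, and combine with the node's own
    features. Inductive: applies to nodes unseen during training."""
    out = np.empty((h.shape[0], W_self.shape[1]))
    for v, nbrs in enumerate(neighbors):
        sampled = rng.choice(nbrs, size=min(sample_size, len(nbrs)),
                             replace=False)
        pooled = h[sampled].mean(axis=0)                   # mean aggregator
        z = np.maximum(h[v] @ W_self + pooled @ W_neigh, 0.0)  # ReLU
        out[v] = z / (np.linalg.norm(z) + 1e-8)            # L2-normalize
    return out

h = rng.normal(size=(5, 4))   # input features for 5 nodes
neighbors = [[1, 2], [0, 2, 3], [0, 1, 4], [1, 4], [2, 3]]
W_self = rng.normal(size=(4, 3))
W_neigh = rng.normal(size=(4, 3))
z = sage_mean_layer(h, neighbors, W_self, W_neigh)
```

Stacking K such layers lets each node aggregate information from its K-hop neighborhood; the fixed sample size keeps per-node cost bounded on large graphs.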
Posted Content

Communication-Efficient Learning of Deep Networks from Decentralized Data

TL;DR: This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation, considering five different model architectures and four datasets.
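The iterative model averaging described here (FedAvg) can be sketched in a few lines: each client runs several local gradient steps, and the server averages the returned models weighted by local data size. The least-squares objective and all hyperparameters below are toy assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def local_step(w, X, y, lr=0.1, epochs=5):
    """Local training on one client: a few epochs of gradient descent
    on a least-squares loss (stand-in for the client's model update)."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """One FedAvg round: each client trains locally from the broadcast
    model; the server takes a data-size-weighted average of the results."""
    total = sum(len(y) for _, y in clients)
    w_new = np.zeros_like(w_global)
    for X, y in clients:
        w_new += (len(y) / total) * local_step(w_global.copy(), X, y)
    return w_new

w_true = np.array([1.0, -2.0, 0.5])
clients = []
for n in (20, 30, 50):                 # three clients, unequal data sizes
    X = rng.normal(size=(n, 3))
    clients.append((X, X @ w_true))    # noiseless labels for the toy demo

w = np.zeros(3)
for _ in range(30):
    w = fedavg_round(w, clients)
```

In this noiseless toy setting all clients share one optimum, so the averaged model converges to it; with non-IID client data the averaging dynamics are more delicate, which is part of what the paper evaluates.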
Posted Content

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

TL;DR: In this article, a spectral graph theory formulation of convolutional neural networks (CNNs) was proposed to learn local, stationary, and compositional features on graphs, and the proposed technique offers the same linear computational complexity and constant learning complexity as classical CNNs while being universal to any graph structure.
Book ChapterDOI

Modeling Relational Data with Graph Convolutional Networks

TL;DR: It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Posted Content

Federated Learning: Strategies for Improving Communication Efficiency

TL;DR: Two ways to reduce the uplink communication costs are proposed: structured updates, where the user directly learns an update from a restricted space parametrized using a smaller number of variables, e.g. either low-rank or a random mask; and sketched updates, which learn a full model update and then compress it using a combination of quantization, random rotations, and subsampling.
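A toy sketch of the "sketched updates" idea: keep a random subset of the update's coordinates, then 1-bit sign-quantize the kept values with a shared scale, so the client uploads indices, bits, and one float instead of the full dense update. The subsampling ratio and quantizer below are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(3)

def sketch_update(update, keep_frac=0.25):
    """Compress a model update: random coordinate subsampling followed by
    1-bit sign quantization of the kept values around their mean magnitude.
    Returns what the client would upload."""
    k = max(1, int(keep_frac * update.size))
    idx = rng.choice(update.size, size=k, replace=False)  # random mask
    kept = update[idx]
    scale = np.abs(kept).mean()                # shared quantization scale
    bits = (kept >= 0).astype(np.int8)         # 1 bit per kept coordinate
    return idx, bits, scale

def unsketch(idx, bits, scale, size):
    """Server-side reconstruction of the (lossy) sparse update."""
    out = np.zeros(size)
    out[idx] = np.where(bits == 1, scale, -scale)
    return out

u = rng.normal(size=100)                 # a dense model update
idx, bits, scale = sketch_update(u)
u_hat = unsketch(idx, bits, scale, u.size)
```

Here the uplink payload shrinks from 100 floats to 25 indices, 25 bits, and one scale; the reconstruction is lossy, which training tolerates because many noisy rounds are averaged.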