Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation
Yang Chen, Xiaoyan Sun, Yaochu Jin
TLDR
The results demonstrate that the proposed asynchronous federated deep learning outperforms the baseline algorithm in terms of both communication cost and model accuracy.
Abstract
Federated learning obtains a central model on the server by aggregating models trained locally on clients. As a result, federated learning does not require clients to upload their data to the server, thereby preserving the data privacy of the clients. One challenge in federated learning is to reduce the client–server communication, since the end devices typically have very limited communication bandwidth. This article presents an enhanced federated learning technique by proposing an asynchronous learning strategy on the clients and a temporally weighted aggregation of the local models on the server. In the asynchronous learning strategy, different layers of the deep neural networks (DNNs) are categorized into shallow and deep layers, and the parameters of the deep layers are updated less frequently than those of the shallow layers. Furthermore, a temporally weighted aggregation strategy is introduced on the server to make use of the previously trained local models, thereby enhancing the accuracy and convergence of the central model. The proposed algorithm is empirically evaluated on two data sets with different DNNs. Our results demonstrate that the proposed asynchronous federated deep learning outperforms the baseline algorithm in terms of both communication cost and model accuracy.
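The temporally weighted aggregation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formula: the function name, the exponential-decay discounting, and the `decay` parameter are assumptions chosen to show the idea of down-weighting stale local models.

```python
import numpy as np

def temporally_weighted_aggregate(client_params, client_sizes,
                                  client_rounds, current_round, decay=0.5):
    # Weight each client's model by its data size, discounted
    # exponentially by how many rounds ago the model was uploaded,
    # so recently trained local models dominate the central model.
    staleness = current_round - np.asarray(client_rounds)
    weights = np.asarray(client_sizes, dtype=float) * decay ** staleness
    weights /= weights.sum()  # normalize so the weights sum to 1
    return sum(w * p for w, p in zip(weights, client_params))
```

After each aggregation, the new central model would be sent back to the participating clients for the next round of local training.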
Citations
Journal ArticleDOI
A survey on security and privacy of federated learning
Viraaji Mothukuri, Reza M. Parizi, Seyedamin Pouriyeh, Yan Huang, Ali Dehghantanha, Gautam Srivastava
TL;DR: This paper aims to provide a comprehensive study concerning FL’s security and privacy aspects that can help bridge the gap between the current state of federated AI and a future in which mass adoption is possible.
Journal ArticleDOI
Federated Learning for Healthcare Informatics
TL;DR: In this article, the authors provide a review of federated learning in the biomedical space, summarize the general solutions to the statistical challenges, system challenges, and privacy issues in federated learning, and point out the implications and potentials in healthcare.
Posted Content
Federated Learning for Healthcare Informatics
TL;DR: The goal of this survey is to provide a review of federated learning technologies, particularly within the biomedical space, summarize the general solutions to the statistical challenges, system challenges, and privacy issues in federated learning, and point out the implications and potentials in healthcare.
Journal ArticleDOI
Federated Learning: A Survey on Enabling Technologies, Protocols, and Applications.
TL;DR: A more thorough summary of the most relevant protocols, platforms, and real-life use-cases of FL is provided to enable data scientists to build better privacy-preserved solutions for industries in critical need of FL.
Journal ArticleDOI
Communication-Efficient Federated Learning for Wireless Edge Intelligence in IoT
Jed Mills, Jia Hu, Geyong Min
TL;DR: This work proposes adapting FedAvg to use a distributed form of Adam optimization, which, combined with novel compression techniques, yields communication-efficient FedAvg (CE-FedAvg): it greatly reduces the number of rounds to convergence to a target accuracy and is more robust to aggressive compression.
References
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: A deep convolutional neural network achieved state-of-the-art performance on ImageNet classification; the network consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully connected layers with a final 1000-way softmax.
Journal ArticleDOI
Deep learning
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Journal ArticleDOI
Gradient-based learning applied to document recognition
Yann LeCun, Léon Bottou, Yoshua Bengio, Patrick Haffner
TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition, which can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters.
Book
Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
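The ADMM iterations referenced above take a standard form. For a problem split as minimizing f(x) + g(z) subject to Ax + Bz = c, the scaled-dual updates (with penalty parameter ρ and scaled dual variable u) are:

```latex
\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x}\; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^{k} - c + u^{k}\rVert_2^2 \\
z^{k+1} &= \operatorname*{arg\,min}_{z}\; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + u^{k}\rVert_2^2 \\
u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c
\end{aligned}
```

The x- and z-updates decouple across the two blocks of variables, which is what makes the method suited to distributed optimization: each block can be minimized on a separate machine, with only the dual update requiring coordination.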
Posted Content
Communication-Efficient Learning of Deep Networks from Decentralized Data
TL;DR: This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation, considering five different model architectures and four datasets.
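The iterative model averaging at the core of this reference (FedAvg) can be sketched as follows. Names here are illustrative assumptions: `local_step` stands in for several epochs of local SGD on a client's private data, and parameters are flattened NumPy vectors rather than full network weights.

```python
import numpy as np

def fedavg_round(global_w, clients, local_step):
    # One communication round: each client refines a copy of the
    # global parameters on its own data, and the server averages
    # the results weighted by each client's number of samples.
    updates, sizes = [], []
    for data, n_samples in clients:
        updates.append(local_step(global_w.copy(), data))
        sizes.append(n_samples)
    total = float(sum(sizes))
    return sum((n / total) * w for n, w in zip(sizes, updates))
```

Running this round repeatedly, with the returned average becoming the next round's `global_w`, is the iterative model averaging the TL;DR describes.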