Journal Article
FLEE: A Hierarchical Federated Learning Framework for Distributed Deep Neural Network over Cloud, Edge, and End Device
TL;DR: This article comprehensively considers various data distributions on end devices and edges, proposing a hierarchical federated learning framework, FLEE, which can dynamically update models without redeploying them and can improve model performance under all kinds of data distributions.
Abstract: With the development of smart devices, the computing capabilities of portable end devices such as mobile phones have been greatly enhanced. Meanwhile, traditional cloud computing faces great challenges from privacy leakage and time delays, so there is a trend to push models down to edges and end devices. However, due to limited computing resources, it is difficult for end devices to complete complex computing tasks alone. Therefore, this article divides the model into two parts and deploys them on multiple end devices and edges, respectively. Meanwhile, an early exit is set to reduce computing resource overhead, forming a hierarchical distributed architecture. To enable the distributed model to continuously evolve using new data generated by end devices, we comprehensively consider various data distributions on end devices and edges, proposing a hierarchical federated learning framework, FLEE, which can dynamically update models without redeploying them. Through image and sentence classification experiments, we verify that FLEE improves model performance under all kinds of data distributions, and show that, compared with other frameworks, models trained by FLEE consume fewer global computing resources in the inference stage.
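The early-exit split described above can be sketched as follows: the front part of the model runs on the end device; if its exit branch is confident enough, inference stops there, otherwise the intermediate features are sent to the edge for the remaining layers. This is a minimal illustrative sketch, not FLEE's actual implementation — the function names, the stand-in computations, and the confidence threshold are all assumptions.

```python
# Illustrative sketch of split inference with an early exit.
# device_front, edge_back, and the threshold are hypothetical stand-ins,
# not FLEE's real sub-models or API.

def device_front(x):
    """Hypothetical on-device layers: produce features plus an exit confidence."""
    features = [v * 0.5 for v in x]  # stand-in for the front sub-model
    confidence = max(features) / (sum(abs(v) for v in features) + 1e-9)
    return features, confidence

def edge_back(features):
    """Hypothetical edge-side layers completing the computation."""
    return sum(features)  # stand-in for the back sub-model

def infer(x, threshold=0.5):
    """Exit on the device when confident; otherwise offload features to the edge."""
    features, confidence = device_front(x)
    if confidence >= threshold:
        return max(features), "exited on device"
    return edge_back(features), "offloaded to edge"
```

A peaked input exits early on the device, while a flat (low-confidence) input is offloaded, which is how the early exit saves computing resources on easy samples.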
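The hierarchical update pattern that a framework like FLEE builds on can be sketched as two-level federated averaging: device updates are aggregated at each edge, then edge models are aggregated at the cloud, weighted by sample counts (the FedAvg scheme of the decentralized-data reference below). Models are plain weight lists here; all names are illustrative assumptions, not FLEE's API.

```python
# Illustrative two-level (device -> edge -> cloud) federated averaging.
# Weight vectors stand in for real model parameters.

def weighted_average(models, counts):
    """FedAvg-style aggregation: average weight vectors weighted by data size."""
    total = sum(counts)
    dim = len(models[0])
    return [sum(m[i] * n for m, n in zip(models, counts)) / total
            for i in range(dim)]

def hierarchical_round(edges):
    """edges: list of (device_models, device_counts) pairs, one per edge.

    Each edge first averages its own devices' models, then the cloud
    averages the edge models, weighted by total samples per edge.
    """
    edge_models, edge_counts = [], []
    for device_models, device_counts in edges:
        edge_models.append(weighted_average(device_models, device_counts))
        edge_counts.append(sum(device_counts))
    return weighted_average(edge_models, edge_counts)  # cloud-level aggregate
```

With sample-count weighting at both levels, the two-stage average equals the flat average over all devices, so the hierarchy reduces communication to the cloud without changing the aggregate.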
Citations
Journal Article
Advancements in Federated Learning: Models, Methods, and Privacy
TL;DR: In this paper, the authors conduct a thorough review of related work, following the development context and deeply mining the key technologies behind FL from both theoretical and practical perspectives, and propose solutions for model training via FL.
Proceedings Article
Async-HFL: Efficient and Robust Asynchronous Federated Learning in Hierarchical IoT Networks
Xiaofan Yu, Ludmila Cherkasova, Hars Vardhan, Quanling Zhao, Emily Ekaireb, Xiyuan Zhang, Arya Mazumdar, Tajana Rosing, et al.
TL;DR: Async-HFL, as discussed by the authors, is an asynchronous and hierarchical framework for federated learning in a common three-tier IoT network architecture; it employs asynchronous aggregations at both the gateway and cloud levels to avoid long waiting times.
Journal Article
Federated Momentum Contrastive Clustering
Runxuan Miao, Erdem Koyuncu, et al.
TL;DR: In this paper, a transformed data pair passes through both the online and target networks, resulting in four representations over which the losses are determined; the resulting high-quality representations generated by FedMCC can outperform several existing self-supervised learning methods on linear-evaluation and semi-supervised learning tasks.
Journal Article
Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes
TL;DR: This work aims to mitigate the performance bottleneck of federated learning by dynamically forming sub-models for stragglers based on their performance and accuracy feedback, and offers Invariant Dropout, a dynamic technique that forms a sub-model based on a neuron-update threshold.
Journal Article
FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout
TL;DR: In this article, the authors propose an adaptive training framework, Federated Learning using Invariant Dropout (FLuID), which leverages neuron updates from non-straggler devices to construct a tailored sub-model for each straggler based on client performance profiling.
References
Journal Article
ImageNet classification with deep convolutional neural networks
TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images of the ImageNet LSVRC-2010 contest into 1000 different classes, employing a recently developed regularization method called "dropout" that proved to be very effective.
Posted Content
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam, et al.
TL;DR: This work introduces two simple global hyperparameters that efficiently trade off between latency and accuracy, and demonstrates the effectiveness of MobileNets across a wide range of applications and use cases, including object detection, fine-grained classification, face attributes, and large-scale geo-localization.
Proceedings Article
Convolutional Neural Networks for Sentence Classification
TL;DR: The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, including sentiment analysis and question classification; a simple modification to the architecture is proposed to allow the use of both task-specific and static vectors.
Posted Content
Communication-Efficient Learning of Deep Networks from Decentralized Data
TL;DR: This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation, considering five different model architectures and four datasets.
Book Chapter
XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
TL;DR: The Binary-Weight-Network version of AlexNet is compared with recent network binarization methods, BinaryConnect and BinaryNets, and outperforms them by large margins on ImageNet, more than 16% in top-1 accuracy.