Adaptive Federated Learning in Resource Constrained Edge Computing Systems
Shiqiang Wang, Tiffany Tuor, Theodoros Salonidis, Kin K. Leung, Christian Makaya, Ting He, Kevin S. Chan +6 more
TLDR
In this paper, the authors consider the problem of learning model parameters from data distributed across multiple edge nodes, without sending raw data to a centralized place, and propose a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget.
Abstract:
Emerging technologies and applications including Internet of Things, social networking, and crowd-sourcing generate large amounts of data at the network edge. Machine learning models are often built from the collected data, to enable the detection, classification, and prediction of future events. Due to bandwidth, storage, and privacy concerns, it is often impractical to send all the data to a centralized location. In this paper, we consider the problem of learning model parameters from data distributed across multiple edge nodes, without sending raw data to a centralized place. Our focus is on a generic class of machine learning models that are trained using gradient-descent-based approaches. We analyze the convergence bound of distributed gradient descent from a theoretical point of view, based on which we propose a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget. The performance of the proposed algorithm is evaluated via extensive experiments with real datasets, both on a networked prototype system and in a larger-scale simulated environment. The experimentation results show that our proposed approach performs near to the optimum with various machine learning models and different data distributions.
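The local-update / global-aggregation pattern the abstract describes can be illustrated with a minimal sketch. This is not the paper's control algorithm — it is a toy setup (hypothetical synthetic linear-regression data split across a few nodes; the names `tau`, `lr`, `federated_gd` are illustrative) showing the tradeoff parameter the paper's controller tunes: how many local gradient steps `tau` to run between global averages.

```python
import numpy as np

# Toy setup: synthetic linear-regression data partitioned across nodes.
# Raw data stays on each node; only model parameters are exchanged.
rng = np.random.default_rng(0)
num_nodes, n_per_node, dim = 4, 50, 3
w_true = rng.normal(size=dim)

data = []
for _ in range(num_nodes):
    X = rng.normal(size=(n_per_node, dim))
    y = X @ w_true + 0.1 * rng.normal(size=n_per_node)
    data.append((X, y))

def local_steps(w, X, y, tau, lr):
    """Run tau gradient-descent steps on one node's local MSE loss."""
    for _ in range(tau):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_gd(tau, rounds, lr=0.05):
    """Alternate tau local updates with a global parameter average."""
    w = np.zeros(dim)
    for _ in range(rounds):
        local_ws = [local_steps(w.copy(), X, y, tau, lr) for X, y in data]
        w = np.mean(local_ws, axis=0)  # global aggregation step
    return w

def global_loss(w):
    return np.mean([np.mean((X @ w - y) ** 2) for X, y in data])

# Same total number of local steps, different aggregation frequency:
# larger tau spends less on communication but can drift from the
# global optimum -- the tradeoff the paper's controller optimizes.
w_freq = federated_gd(tau=1, rounds=100)   # aggregate every step
w_rare = federated_gd(tau=10, rounds=10)   # aggregate every 10 steps
print(global_loss(w_freq), global_loss(w_rare))
```

With IID data, as here, both schedules converge to a similar loss; the paper's analysis covers the non-IID case, where the gap between frequent and infrequent aggregation becomes significant.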
Citations
Journal Article
Communication-Efficient Federated Learning for Digital Twin Edge Networks in Industrial IoT
TL;DR: Digital twin edge networks (DITENs) are proposed, incorporating digital twins into edge networks to bridge physical systems and digital spaces; federated learning is leveraged to construct digital twin models of IoT devices based on their running data.
Journal Article
Learning in the Air: Secure Federated Learning for UAV-Assisted Crowdsensing
TL;DR: The results demonstrate that the proposed SFAC can effectively improve utilities for UAVs, promote high-quality model sharing, and ensure privacy protection in federated learning, compared with existing schemes.
Journal Article
Client Selection and Bandwidth Allocation in Wireless Federated Learning Networks: A Long-Term Perspective
Jie Xu, Heqiang Wang +1 more
TL;DR: In this paper, a stochastic optimization problem for joint client selection and bandwidth allocation under long-term client energy constraints is formulated, and a new algorithm is proposed that uses only currently available wireless channel information yet achieves a long-term performance guarantee.
Journal Article
Mime: Mimicking Centralized Stochastic Algorithms in Federated Learning
Sai Praneeth Karimireddy, Martin Jaggi, Satyen Kale, Mehryar Mohri, Sashank J. Reddi, Sebastian U. Stich, Ananda Theertha Suresh +6 more
TL;DR: This work proposes a general framework Mime which mitigates client-drift and adapts arbitrary centralized optimization algorithms to federated learning and strongly establishes Mime's superiority over other baselines.
Posted Content
Overcoming Forgetting in Federated Learning on Non-IID Data
Neta Shoham, Tomer Avidor, Aviv Keren, Nadav Israel, Daniel Benditkis, Liron Mor-Yosef, Itai Zeitak +6 more
TL;DR: This work adds a penalty term to the loss function, compelling all local models to converge to a shared optimum, and shows that this can be done communication-efficiently (adding no further privacy risk) while scaling with the number of nodes in the distributed setting.