Open Access Journal Article

Adaptive Federated Learning in Resource Constrained Edge Computing Systems

TL;DR
In this paper, the authors consider the problem of learning model parameters from data distributed across multiple edge nodes, without sending raw data to a centralized place, and propose a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget.
Abstract
Emerging technologies and applications including Internet of Things, social networking, and crowd-sourcing generate large amounts of data at the network edge. Machine learning models are often built from the collected data, to enable the detection, classification, and prediction of future events. Due to bandwidth, storage, and privacy concerns, it is often impractical to send all the data to a centralized location. In this paper, we consider the problem of learning model parameters from data distributed across multiple edge nodes, without sending raw data to a centralized place. Our focus is on a generic class of machine learning models that are trained using gradient-descent-based approaches. We analyze the convergence bound of distributed gradient descent from a theoretical point of view, based on which we propose a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget. The performance of the proposed algorithm is evaluated via extensive experiments with real datasets, both on a networked prototype system and in a larger-scale simulated environment. The experimentation results show that our proposed approach performs near to the optimum with various machine learning models and different data distributions.
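For intuition, the following is a minimal sketch of the local-update versus global-aggregation tradeoff described above: each edge node takes `tau` local gradient steps, the local models are then averaged into a global model, and the loop stops once a resource budget is exhausted. The quadratic loss, the resource-cost accounting, and the fixed `tau` are illustrative assumptions; the paper's contribution, a control algorithm that adapts this tradeoff, is not reproduced here.

```python
# Minimal sketch of distributed gradient descent with periodic aggregation.
# The loss, cost model, and fixed aggregation interval `tau` are assumptions
# for illustration, not the paper's adaptive control algorithm.
import numpy as np

def local_loss_grad(w, X, y):
    """Gradient of a simple least-squares loss on one node's local data."""
    return X.T @ (X @ w - y) / len(y)

def distributed_gd(datasets, tau, budget, lr=0.1, dim=5):
    """Run local gradient steps on each node; aggregate globally every `tau` steps.

    `budget` caps total resource units: each local step costs 1 unit per node,
    and each global aggregation costs `dim` units (a stand-in for communication).
    """
    w_global = np.zeros(dim)
    w_local = [w_global.copy() for _ in datasets]
    used = 0
    while used < budget:
        for _ in range(tau):                       # tau local updates per node
            for i, (X, y) in enumerate(datasets):
                w_local[i] -= lr * local_loss_grad(w_local[i], X, y)
            used += len(datasets)
        # Global aggregation: average local models weighted by local data size.
        sizes = np.array([len(y) for _, y in datasets], dtype=float)
        w_global = sum(s * w for s, w in zip(sizes / sizes.sum(), w_local))
        w_local = [w_global.copy() for _ in datasets]
        used += dim                                # communication cost of aggregation
    return w_global

# Toy usage: three "edge nodes" holding different local data.
rng = np.random.default_rng(0)
datasets = [(rng.normal(size=(40, 5)), rng.normal(size=40)) for _ in range(3)]
print(distributed_gd(datasets, tau=4, budget=1_000))
```

Choosing a larger `tau` spends more of the budget on computation and less on communication; the paper's control algorithm adapts this choice to minimize the loss under the given budget.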

read more

Citations
Journal Article

LSFL: A Lightweight and Secure Federated Learning Scheme for Edge Computing

TL;DR: Li et al. propose LSFL, a lightweight and secure federated learning scheme that combines privacy preservation with Byzantine robustness, using two servers to enable secure, Byzantine-robust model aggregation.
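The summary names a two-server design. As a rough illustration of that general pattern only (not LSFL's actual protocol or its Byzantine-robustness rule), the sketch below shows additive masking across two non-colluding servers: each client splits its update into two random shares, each server sums only the shares it receives, and the aggregate is recovered only when the two server-side sums are combined.

```python
# Illustrative sketch of a generic two-server additive-masking aggregation
# pattern; the cited LSFL protocol and its security analysis are not reproduced.
import numpy as np

rng = np.random.default_rng(2)

def split_update(update):
    """Split a client update into two additive shares, one per server."""
    mask = rng.normal(size=update.shape)
    return update - mask, mask            # share_a + share_b == update

# Each client holds a local model update (random stand-ins here).
client_updates = [rng.normal(size=4) for _ in range(5)]
shares_a, shares_b = zip(*(split_update(u) for u in client_updates))

# Each server sees and sums only its own shares.
sum_a = np.sum(shares_a, axis=0)
sum_b = np.sum(shares_b, axis=0)

# Combining the two sums recovers the aggregate update without either
# server observing any individual client update.
aggregate = (sum_a + sum_b) / len(client_updates)
print(np.allclose(aggregate, np.mean(client_updates, axis=0)))  # True
```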
Journal Article

Performance Enhancement in Federated Learning by Reducing Class Imbalance of Non-IID Data

Mihye Seol, +1 more · 19 Jan 2023
TL;DR: In this article, the authors propose an efficient algorithm that enhances federated learning performance by mitigating the negative effects of non-independent and identically distributed (non-IID) data.
Journal Article

A UAV-Aided Vehicular Integrated Platooning Network for Heterogeneous Resource Management

TL;DR: In this paper, a UAV-aided vehicular integrated platooning (VIP) network for energy-consumption-minimizing resource management is proposed, consisting of two phases: dynamic parameter server selection and federated resource management.
Proceedings Article

Heterogeneity-Aware Adaptive Federated Learning Scheduling

TL;DR: In this paper, the authors propose a heterogeneity-aware federated learning scheduler that adapts to accuracy trends and dynamically changing resource usage patterns, mitigating the effects of resource and data heterogeneity.
Proceedings Article

Federated Learning Using Variance Reduced Stochastic Gradient for Probabilistically Activated Agents

TL;DR: In this article, the authors propose a two-layer federated learning structure with variance reduction that achieves a faster convergence rate to an optimal solution when each agent has an arbitrary probability of being selected in each iteration.
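For intuition about the ingredients named here, the sketch below combines an SVRG-style variance-reduced gradient with Bernoulli agent activation: an agent participates in an iteration only with its own probability, and its contribution is reweighted by that probability so the update stays unbiased. The probabilities, the quadratic loss, and the single loop are illustrative assumptions and do not reproduce the cited paper's two-layer structure or convergence analysis.

```python
# Illustrative sketch: variance-reduced stochastic gradients (SVRG-style)
# with probabilistically activated agents. Not the cited paper's algorithm.
import numpy as np

def grad(w, X, y):
    """Least-squares gradient on a batch of samples."""
    return X.T @ (X @ w - y) / len(y)

def vr_probabilistic_fl(agents, p_active, rounds=30, inner=20, lr=0.05, dim=5, seed=1):
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    for _ in range(rounds):
        w_snap = w.copy()
        # Snapshot full gradients at the reference point, as in SVRG.
        snap_grads = [grad(w_snap, X, y) for X, y in agents]
        for _ in range(inner):
            g_total = np.zeros(dim)
            for (X, y), p, g_snap in zip(agents, p_active, snap_grads):
                if rng.random() > p:              # agent inactive this iteration
                    continue
                j = rng.integers(len(y))          # one local sample
                Xj, yj = X[j:j + 1], y[j:j + 1]
                # Variance-reduced gradient, scaled by 1/p so the contribution
                # of a probabilistically activated agent is unbiased.
                g_total += (grad(w, Xj, yj) - grad(w_snap, Xj, yj) + g_snap) / p
            w -= lr * g_total / len(agents)
    return w

# Toy usage: three agents with different activation probabilities.
rng = np.random.default_rng(0)
agents = [(rng.normal(size=(50, 5)), rng.normal(size=50)) for _ in range(3)]
print(vr_probabilistic_fl(agents, p_active=[0.9, 0.5, 0.2]))
```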