scispace - formally typeset

Anit Kumar Sahu

Researcher at Bosch

Publications: 67
Citations: 5,305

Anit Kumar Sahu is an academic researcher at Bosch. He has contributed to research on the topics of computer science and independent and identically distributed random variables. He has an h-index of 17 and has co-authored 55 publications receiving 2,353 citations. His previous affiliations include Carnegie Mellon University and Tufts University.

Papers
Journal ArticleDOI

Federated Learning: Challenges, Methods, and Future Directions

TL;DR: In this paper, the authors discuss the unique characteristics and challenges of federated learning, provide a broad overview of current approaches, and outline several directions of future work that are relevant to a wide range of research communities.

Federated Optimization in Heterogeneous Networks

TL;DR: This work introduces a framework, FedProx, to tackle heterogeneity in federated networks. It provides convergence guarantees for the framework when learning over data from non-identical distributions (statistical heterogeneity) while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work.
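The two key ingredients described above, a proximal term that tethers each device's local iterate to the current global model, and a variable amount of local work per device, can be sketched in a few lines of NumPy. This is an illustrative toy under assumed settings (the learning rate, the default mu, and the helper names fedprox_local_update / fedprox_round are all inventions here), not the authors' implementation:

```python
import numpy as np

def fedprox_local_update(w_global, grad_fn, mu=0.1, lr=0.05, steps=10):
    """A device's local solve in FedProx: a few SGD steps on its local
    loss plus the proximal term (mu/2)||w - w_global||^2, which keeps
    the local iterate close to the current global model."""
    w = w_global.copy()
    for _ in range(steps):
        # gradient of the proximal objective: local grad + mu*(w - w_global)
        w = w - lr * (grad_fn(w) + mu * (w - w_global))
    return w

def fedprox_round(w_global, device_grad_fns, local_steps, mu=0.1):
    """One communication round: each device may perform a different
    (variable) amount of local work; the server averages the results."""
    updates = [
        fedprox_local_update(w_global, grad_fn, mu=mu, steps=steps)
        for grad_fn, steps in zip(device_grad_fns, local_steps)
    ]
    return np.mean(updates, axis=0)
```

With mu = 0 and every device doing the same number of local steps, this reduces to FedAvg-style averaging; a larger mu damps how far local models can drift apart when the devices' data distributions differ.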
Posted Content

Federated Optimization in Heterogeneous Networks

TL;DR: FedProx is presented as a generalization and re-parametrization of FedAvg, the state-of-the-art method for federated learning.
Posted Content

On the Convergence of Federated Optimization in Heterogeneous Networks.

TL;DR: This work proposes FedProx, which is similar in spirit to FedAvg but more amenable to theoretical analysis, and describes the convergence of FedProx under a novel device similarity assumption.
Posted Content

MATCHA: Speeding Up Decentralized SGD via Matching Decomposition Sampling

TL;DR: A novel algorithm, MATCHA, is proposed that uses matching decomposition sampling of the base topology to parallelize inter-worker information exchange, significantly reducing communication delay; it communicates more frequently over critical links so that it can maintain the same convergence rate as vanilla decentralized SGD.
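The matching-decomposition idea can be sketched as follows. This is a toy illustration, not the paper's algorithm in full: the 4-worker ring, its hand-written two-matching decomposition, and the fixed activation probabilities are assumptions made here (in MATCHA, the per-matching activation probabilities come from an optimization that spends the communication budget on connectivity-critical links):

```python
import numpy as np

# Hypothetical 4-worker ring, its edges partitioned into two matchings.
# Edges inside one matching are disjoint, so they can all run in parallel.
MATCHINGS = [
    [(0, 1), (2, 3)],
    [(1, 2), (3, 0)],
]

def sample_mixing_matrix(matchings, probs, rng):
    """Sample this round's mixing matrix: each matching is activated
    independently with its own probability, and every active edge
    (i, j) averages the models of workers i and j."""
    n = 1 + max(max(i, j) for m in matchings for (i, j) in m)
    W = np.eye(n)
    for m, p in zip(matchings, probs):
        if rng.random() < p:
            A = np.eye(n)
            for i, j in m:
                A[i, i] = A[j, j] = 0.5
                A[i, j] = A[j, i] = 0.5
            W = A @ W
    return W

def matcha_step(models, grads, matchings, probs, lr, rng):
    """One MATCHA-style round of decentralized SGD: every worker takes
    a local gradient step, then gossips only over the sampled matchings.
    models is an (n_workers, dim) array of per-worker parameters."""
    W = sample_mixing_matrix(matchings, probs, rng)
    return W @ (models - lr * grads)
```

Because an activated matching costs one parallel communication slot, lowering a matching's activation probability trades connectivity for communication delay; the paper's contribution is choosing these probabilities so the spectral mixing properties, and hence the convergence rate, match vanilla decentralized SGD.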