Open Access Journal ArticleDOI

Federated Learning: A Survey on Enabling Technologies, Protocols, and Applications.

TLDR
A more thorough summary of the most relevant protocols, platforms, and real-life use-cases of FL is provided to enable data scientists to build better privacy-preserving solutions for industries in critical need of FL.
Abstract
This paper provides a comprehensive study of Federated Learning (FL) with an emphasis on enabling software and hardware platforms, protocols, real-life applications, and use-cases. FL is applicable to multiple domains, but applying it to different industries presents its own set of obstacles. FL, also known as collaborative learning, trains algorithm(s) across multiple devices or servers holding decentralized data samples without exchanging the actual data. This approach is radically different from more established techniques such as uploading data samples to central servers or holding data in some form of distributed infrastructure. FL, on the other hand, produces robust models without sharing raw data, leading to privacy-preserving solutions with stronger security and stricter data-access controls. This paper starts by providing an overview of FL. Then, it covers the technical details that pertain to FL enabling technologies, protocols, and applications. Compared to other survey papers in the field, our objective is to provide a more thorough summary of the most relevant protocols, platforms, and real-life use-cases of FL to enable data scientists to build better privacy-preserving solutions for industries in critical need of FL. We also give an overview of key challenges identified in the recent literature and summarize related research work. Moreover, we explore both the challenges and advantages of FL and present detailed service use-cases to illustrate how different architectures and protocols that use FL can fit together to deliver desired results.
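To make the training pattern described above concrete, the sketch below is a hypothetical illustration (not the paper's implementation) of a federated-averaging-style round: each client fits a simple model on its private data, and the server only aggregates the resulting weights, so raw samples never leave the clients.

```python
import numpy as np

# Minimal federated-averaging sketch (hypothetical example).
# Each client holds private (X, y) data and trains a linear model locally;
# only model weights, never raw data, are sent to the server.

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training pass (plain gradient descent on MSE)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, client_data):
    """Server aggregates client weights, weighted by local sample counts."""
    updates, sizes = [], []
    for X, y in client_data:
        updates.append(local_update(global_weights, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Toy usage: three clients, each with its own locally generated data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)
print(w)  # approaches true_w without any client sharing its raw data
```

Weighting the aggregation by local sample count mirrors the common FedAvg rule; real deployments layer secure aggregation, client sampling, and communication compression on top of this skeleton.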



Citations
Book ChapterDOI

Towards Blockchain-Based Fair and Trustworthy Federated Learning Systems

TL;DR: In this paper, the key trust requirements for decentralized federated learning (DFL) are reviewed and compared in terms of fairness, trust, and privacy, with the aim of developing fair and trustworthy FL systems.

Bulletin of Electrical Engineering and Informatics

TL;DR: This study proposed a new method called robust stochastic DEA (RSDEA) to assess performance efficiency while tackling uncertainty problems (i.e., stochastic and robust optimization) and demonstrates the efficiency of the proposed formulation method.
Journal ArticleDOI

ADFL: Defending backdoor attacks in federated learning via adversarial distillation

TL;DR: ADFL, as discussed by the authors, deploys a generative adversarial network (GAN) on the server side to generate fake samples containing backdoor features and relabels these samples to obtain the distillation dataset.

Cognitive Health Assessment of Decentralized Smart Home Activities using Federated Learning

TL;DR: In this article, a federated learning approach based on deep neural networks is proposed to address the concern that cognitive health deteriorates over time and can often go unnoticed until it is too late.
Proceedings ArticleDOI

Multi-center Federated Learning with Model Decoupling

TL;DR: In this article, the authors propose FedMDC, a simple and efficient personalized federated learning framework that tackles the data heterogeneity problem by combining clustering with model decoupling, providing each participant with a personalized model suited to its own local data distribution.
References
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
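For quick reference, Adam maintains exponential moving averages of the gradient and its elementwise square, corrects their initialization bias, and scales the step accordingly (notation follows the paper; typical defaults are \(\alpha = 0.001\), \(\beta_1 = 0.9\), \(\beta_2 = 0.999\), \(\epsilon = 10^{-8}\)):

```latex
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t, &
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2, \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^t}, &
\hat{v}_t &= \frac{v_t}{1-\beta_2^t}, \\
\theta_t &= \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}. &&
\end{aligned}
```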
Posted Content

An overview of gradient descent optimization algorithms

Sebastian Ruder, 15 Sep 2016
TL;DR: This article looks at different variants of gradient descent, summarizes their challenges, introduces the most common optimization algorithms, reviews architectures in a parallel and distributed setting, and investigates additional strategies for optimizing gradient descent.
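As a reminder of the variants that article compares, the update rules differ only in how much data is used to compute each gradient step (\(\theta\) are the parameters, \(\eta\) the learning rate, \(J\) the objective):

```latex
\begin{aligned}
\text{Batch gradient descent:} \quad & \theta \leftarrow \theta - \eta \, \nabla_\theta J(\theta) \\
\text{Stochastic gradient descent:} \quad & \theta \leftarrow \theta - \eta \, \nabla_\theta J\big(\theta;\, x^{(i)}, y^{(i)}\big) \\
\text{Mini-batch gradient descent:} \quad & \theta \leftarrow \theta - \eta \, \nabla_\theta J\big(\theta;\, x^{(i:i+n)}, y^{(i:i+n)}\big)
\end{aligned}
```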
Journal ArticleDOI

Federated Machine Learning: Concept and Applications

TL;DR: This work introduces a comprehensive secure federated learning framework, which includes horizontal federated learning, vertical federated learning, and federated transfer learning, and surveys existing works on the subject.
Journal ArticleDOI

Big data analytics in healthcare: promise and potential

TL;DR: Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs; its potential is great, but challenges remain to overcome.
Journal ArticleDOI

Federated Learning: Challenges, Methods, and Future Directions

TL;DR: In this paper, the authors discuss the unique characteristics and challenges of federated learning, provide a broad overview of current approaches, and outline several directions of future work that are relevant to a wide range of research communities.