
Rongfei Jia

Researcher at Alibaba Group

Publications - 5
Citations - 125

Rongfei Jia is an academic researcher at Alibaba Group. The author has contributed to research on topics including the MNIST database and scalability, has an h-index of 2, and has co-authored 5 publications receiving 41 citations.

Papers
Proceedings Article

Billion-scale federated learning on mobile clients: a submodel design with tunable privacy

TL;DR: This work designs a secure federated submodel learning scheme built on a private set union protocol as its cornerstone. The design combines randomized response, secure aggregation, and Bloom filters, and endows each client with customized plausible deniability about the positions of its desired submodel, thereby protecting its private data.
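The abstract names three building blocks: randomized response, secure aggregation, and Bloom filters. The Python sketch below shows only how the first and third might combine into a simple private set union over clients' submodel index sets; it is a minimal illustration of the idea rather than the paper's protocol. The filter size, hash count, flip probability, and all function names are assumptions, and secure aggregation is omitted entirely.

```python
# Hedged sketch: Bloom filter over each client's submodel index set plus
# randomized response on the filter bits. NOT the paper's full protocol;
# all parameters (M_BITS, K_HASHES, FLIP_PROB) are illustrative.
import hashlib
import random

M_BITS = 1024          # Bloom filter length (assumption)
K_HASHES = 3           # number of hash functions (assumption)
FLIP_PROB = 0.1        # randomized-response flip probability (assumption)

def _positions(item: str) -> list[int]:
    """Map an item to K_HASHES bit positions via salted SHA-256."""
    return [
        int(hashlib.sha256(f"{i}:{item}".encode()).hexdigest(), 16) % M_BITS
        for i in range(K_HASHES)
    ]

def bloom_encode(index_set: set[str]) -> list[int]:
    """Encode a client's submodel index set as a Bloom filter bit vector."""
    bits = [0] * M_BITS
    for item in index_set:
        for pos in _positions(item):
            bits[pos] = 1
    return bits

def randomize(bits: list[int]) -> list[int]:
    """Randomized response: flip each bit with probability FLIP_PROB,
    giving the client plausible deniability about any single position."""
    return [b ^ (random.random() < FLIP_PROB) for b in bits]

def union(filters: list[list[int]]) -> list[int]:
    """Server-side union of perturbed filters by bitwise OR.
    In the real scheme this step would run under secure aggregation."""
    out = [0] * M_BITS
    for f in filters:
        out = [a | b for a, b in zip(out, f)]
    return out

def maybe_in_union(item: str, union_bits: list[int]) -> bool:
    """Bloom-filter membership test: false positives possible by design."""
    return all(union_bits[p] for p in _positions(item))

# Toy usage: two clients contribute the item IDs touching their submodels.
client_a = randomize(bloom_encode({"item_17", "item_42"}))
client_b = randomize(bloom_encode({"item_42", "item_99"}))
u = union([client_a, client_b])
print(maybe_in_union("item_42", u), maybe_in_union("item_7", u))
```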
Posted Content

Secure Federated Submodel Learning.

TL;DR: This work designs a secure federated submodel learning scheme built on a private set union protocol as its cornerstone. The design combines randomized response, secure aggregation, and Bloom filters, and endows each client with customized plausible deniability about the positions of her desired submodel, thus protecting her private data.
Posted Content

Distributed Optimization over Block-Cyclic Data.

TL;DR: Two new distributed optimization algorithms called multi-model parallel SGD (MM-PSGD) and multi-chain parallel SGD (MC-PSGD) are proposed with a convergence rate of $O(1/\sqrt{NT})$, achieving a linear speedup with respect to the total number of clients.
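As a rough illustration of the "multi-model" idea in the abstract, the sketch below keeps one model per block of the cycle, updates each model only with data drawn from its own block, and predicts with the average of the per-block models. It is a toy linear-regression simulation under assumed hyperparameters, not the authors' MM-PSGD or MC-PSGD algorithms, and it makes no claim about the $O(1/\sqrt{NT})$ rate.

```python
# Hedged sketch of a multi-model approach to block-cyclic data: clients
# arrive in cyclic blocks (e.g., day vs. night populations), one model is
# maintained per block, and the final predictor averages the per-block
# models. Illustrative only; all hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
NUM_BLOCKS = 2          # blocks in the cycle (assumption)
DIM, LR, ROUNDS = 10, 0.1, 200

true_w = rng.normal(size=DIM)

def sample_block(block: int, n: int = 32):
    """Synthetic regression data with a block-dependent distribution shift."""
    x = rng.normal(size=(n, DIM)) + 0.5 * block
    y = x @ true_w + 0.01 * rng.normal(size=n)
    return x, y

models = [np.zeros(DIM) for _ in range(NUM_BLOCKS)]
for t in range(ROUNDS):
    block = t % NUM_BLOCKS                     # data arrives block-cyclically
    x, y = sample_block(block)
    grad = 2 * x.T @ (x @ models[block] - y) / len(y)
    models[block] -= LR * grad                 # SGD step on this block's model only

w_avg = np.mean(models, axis=0)                # predict with the averaged model
print("error of averaged model:", np.linalg.norm(w_avg - true_w))
```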
Proceedings Article

Data-Free Evaluation of User Contributions in Federated Learning

TL;DR: This paper proposes Pairwise Correlated Agreement (PCA), a method based on the idea of peer prediction that evaluates user contributions in FL without a test dataset, using the statistical correlation of the model parameters uploaded by users.
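To make the correlation idea concrete, the sketch below scores each user by the mean Pearson correlation of its flattened parameter update with those of the other users. The scoring rule and all names here are assumptions for illustration; the paper's Pairwise Correlated Agreement statistic may differ.

```python
# Hedged sketch of the core idea in the abstract: score each user by how
# strongly its uploaded parameter update correlates with its peers' updates,
# with no test dataset needed. Mean pairwise Pearson correlation is used as
# a stand-in scoring rule, not necessarily the paper's exact PCA formula.
import numpy as np

def contribution_scores(updates: np.ndarray) -> np.ndarray:
    """updates: (num_users, num_params) array of flattened model updates.
    Returns one score per user: its average correlation with all peers."""
    corr = np.corrcoef(updates)                # (num_users, num_users) Pearson matrix
    np.fill_diagonal(corr, 0.0)                # exclude self-correlation
    return corr.sum(axis=1) / (len(updates) - 1)

# Toy usage: honest users share a common signal; a free-rider uploads noise.
rng = np.random.default_rng(1)
signal = rng.normal(size=1000)
honest = [signal + 0.3 * rng.normal(size=1000) for _ in range(4)]
free_rider = rng.normal(size=1000)
scores = contribution_scores(np.stack(honest + [free_rider]))
print(scores)   # the last (free-rider) score should be near zero
```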
Posted Content

Data-Free Evaluation of User Contributions in Federated Learning

TL;DR: In this article, the authors propose Pairwise Correlated Agreement (PCA), based on the idea of peer prediction, to evaluate user contributions in federated learning without a test dataset; the method uses the statistical correlation of the model parameters uploaded by users.