
Tao Fan

Researcher at Tencent

Publications -  6
Citations -  512

Tao Fan is an academic researcher from Tencent. The author has contributed to research in the topics of Computer science and Encryption. The author has an h-index of 3 and has co-authored 4 publications receiving 234 citations.

Papers
Journal Article

SecureBoost: A Lossless Federated Learning Framework

TL;DR: The SecureBoost framework is shown to be as accurate as non-federated gradient tree-boosting algorithms that require centralized data, and it is highly scalable and practical for industrial applications such as credit risk analysis.
Posted Content

SecureBoost: A Lossless Federated Learning Framework

TL;DR: SecureBoost is a lossless, privacy-preserving tree-boosting system for federated learning that allows training to be conducted jointly across multiple parties holding common user samples but different feature sets, i.e., a vertically partitioned data set.
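
As a concrete picture of the vertically partitioned setting the summary describes, the toy Python sketch below (party names, features, and values are hypothetical; it is not the SecureBoost protocol itself) shows two parties that share the same user IDs but hold disjoint feature columns, with labels held only by one of them.

```python
import pandas as pd

# Hypothetical vertically partitioned data: both parties know the same user IDs,
# but each holds a disjoint set of feature columns. Only the "active" party holds labels.
active_party = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "credit_history_len": [5, 2, 9, 1],    # feature held by the active party
    "label": [0, 1, 0, 1],                 # labels exist only at the active party
})
passive_party = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "monthly_spend": [320.0, 80.5, 150.0, 999.9],   # feature held by the passive party
})

# In SecureBoost-style training the raw feature values are never exchanged;
# parties share only encrypted gradient statistics. The merge below merely
# shows that the two tables align on user_id, i.e., "common user samples but
# different feature sets".
joined_view = active_party.merge(passive_party, on="user_id")
print(joined_view)
```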
Posted Content

A Quasi-Newton Method Based Vertical Federated Learning Framework for Logistic Regression

TL;DR: This paper proposes a quasi-Newton-method-based vertical federated learning framework for logistic regression under an additively homomorphic encryption scheme, which considerably reduces the number of communication rounds at a small additional communication cost per round.
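
To illustrate why a quasi-Newton update tends to need far fewer iterations than plain gradient descent, and hence fewer communication rounds in the federated setting, here is a minimal centralized, unencrypted logistic-regression sketch using SciPy's BFGS solver; the data are synthetic and the paper's encrypted, vertically federated protocol is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # synthetic design matrix
true_w = np.array([1.5, -2.0, 0.5, 0.0, 3.0])
y = (1 / (1 + np.exp(-X @ true_w)) > rng.uniform(size=200)).astype(float)

def nll(w):
    """Negative log-likelihood of logistic regression (the objective being minimized)."""
    z = X @ w
    return np.sum(np.logaddexp(0.0, z) - y * z)

def grad(w):
    """Gradient of the negative log-likelihood."""
    p = 1 / (1 + np.exp(-(X @ w)))
    return X.T @ (p - y)

# A quasi-Newton method (BFGS here) uses curvature estimates to converge in
# far fewer iterations than first-order gradient descent; in vertical federated
# learning, fewer iterations translate into fewer communication rounds.
result = minimize(nll, np.zeros(5), jac=grad, method="BFGS")
print("BFGS iterations:", result.nit, "estimated weights:", np.round(result.x, 2))
```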
Journal Article

Accelerating Vertical Federated Learning

TL;DR: This paper proposes a straggler-resilient and computation-efficient acceleration system that reduces communication overhead in heterogeneous scenarios by up to 65.26% and reduces the computation overhead caused by homomorphic encryption by up to 40.66%.
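
The summary does not spell out the mechanism, so the sketch below only illustrates the generic idea of straggler resilience (an assumption for illustration, not the system proposed in the paper): the coordinator aggregates once a quorum of parties has responded instead of blocking on the slowest party.

```python
import concurrent.futures
import random
import time

def party_partial_result(party_id: int) -> float:
    """Stand-in for one party's local (possibly encrypted) partial computation."""
    time.sleep(random.uniform(0.01, 0.5))   # heterogeneous compute / network speed
    return float(party_id)

NUM_PARTIES, QUORUM = 8, 6
pool = concurrent.futures.ThreadPoolExecutor(max_workers=NUM_PARTIES)
futures = [pool.submit(party_partial_result, p) for p in range(NUM_PARTIES)]

quorum_results = []
for f in concurrent.futures.as_completed(futures):
    quorum_results.append(f.result())
    if len(quorum_results) == QUORUM:
        break                                # stop waiting; the slowest parties are ignored
pool.shutdown(wait=False, cancel_futures=True)   # Python 3.9+: discard pending work

print("aggregated over", len(quorum_results), "of", NUM_PARTIES, "parties")
```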

SecureBoost Hyperparameter Tuning via Multi-Objective Federated Learning

TL;DR: Wang et al. proposed a constrained multi-objective SecureBoost (CMOSB) algorithm to find Pareto-optimal solutions, where each solution is a set of hyperparameters achieving an optimal trade-off between utility loss, training cost, and privacy leakage.
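
The notion of Pareto-optimal hyperparameter sets in the summary can be made concrete with a small dominance check over (utility loss, training cost, privacy leakage) tuples; the candidate names and scores below are made up for illustration, and CMOSB itself is not implemented here.

```python
# Hypothetical hyperparameter candidates scored on the three objectives the
# summary mentions (all to be minimized); the numbers are illustrative only.
candidates = {
    "A": (0.10, 120.0, 0.30),   # (utility loss, training cost, privacy leakage)
    "B": (0.08, 200.0, 0.25),
    "C": (0.12, 100.0, 0.40),
    "D": (0.10, 150.0, 0.35),   # dominated by A: worse or equal on every objective
}

def dominates(a, b):
    """True if a is no worse than b on every objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto_front = [
    name for name, score in candidates.items()
    if not any(dominates(other, score) for other in candidates.values() if other is not score)
]
print("Pareto-optimal hyperparameter sets:", pareto_front)   # A, B, C
```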