
Wonyong Jeong

Researcher at KAIST

Publications: 13
Citations: 184

Wonyong Jeong is an academic researcher from KAIST. The author has contributed to research in the topics of computer science and semi-supervised learning. The author has an h-index of 5 and has co-authored 10 publications receiving 60 citations. Previous affiliations of Wonyong Jeong include Brookhaven National Laboratory and Stony Brook University.

Papers
Posted Content

Federated Semi-Supervised Learning with Inter-Client Consistency.

TL;DR: FedMatch improves upon naive federated semi-supervised learning approaches with a new inter-client consistency loss and a decomposition of the model parameters into separate parameters for labeled and unlabeled data.
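
The TL;DR above describes two mechanisms: an inter-client consistency loss computed against helper clients' models on unlabeled data, and a per-weight decomposition into a labeled-data part and an unlabeled-data part. The following is a minimal PyTorch sketch of those two ideas only; the single linear layer, the KL-based consistency term, and names such as helper_models are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


class DecomposedLinear(torch.nn.Module):
    """Linear layer whose weight is the sum of a part trained on labeled
    data (psi) and a part trained on unlabeled data (sigma)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.psi = torch.nn.Parameter(torch.randn(out_dim, in_dim) * 0.01)
        self.sigma = torch.nn.Parameter(torch.zeros(out_dim, in_dim))
        self.bias = torch.nn.Parameter(torch.zeros(out_dim))

    def forward(self, x):
        # The effective weight is the sum of the two decomposed parts.
        return F.linear(x, self.psi + self.sigma, self.bias)


def inter_client_consistency(model, helper_models, x_unlabeled):
    """Average KL divergence between the local model's predictions and each
    helper client model's predictions on the same unlabeled batch."""
    p_local = F.log_softmax(model(x_unlabeled), dim=1)
    loss = 0.0
    for helper in helper_models:
        with torch.no_grad():
            p_helper = F.softmax(helper(x_unlabeled), dim=1)
        loss = loss + F.kl_div(p_local, p_helper, reduction="batchmean")
    return loss / max(len(helper_models), 1)

In this reading, only psi would be updated on labeled batches and only sigma on unlabeled batches, with the consistency term added to the unlabeled objective.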
Posted Content

Federated Continual Learning with Weighted Inter-client Transfer.

TL;DR: A novel federated continual learning framework, Federated Weighted Inter-client Transfer (FedWeIT), is proposed, which decomposes the network weights into global federated parameters and sparse task-specific parameters, and each client receives selective knowledge from other clients by taking a weighted combination of their task-specific parameters.
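
The FedWeIT summary above centers on one composition rule: a client's effective weight combines a globally shared base, its own sparse task-specific parameters, and a learned weighted mixture of task-specific parameters received from other clients. The sketch below illustrates that composition for a single layer in PyTorch; the shapes, the scalar attention weights, and the absence of sparsity regularization are simplifying assumptions rather than the paper's full method.

import torch
import torch.nn.functional as F


class WeITLayer(torch.nn.Module):
    def __init__(self, in_dim, out_dim, num_foreign):
        super().__init__()
        # Globally shared (federated) base parameters.
        self.base = torch.nn.Parameter(torch.randn(out_dim, in_dim) * 0.01)
        # Sparse task-specific parameters for this client's current task.
        self.task = torch.nn.Parameter(torch.zeros(out_dim, in_dim))
        # Attention weights over task-specific parameters received from
        # other clients (one scalar per foreign parameter set).
        self.alpha = torch.nn.Parameter(torch.zeros(num_foreign))

    def compose(self, foreign_tasks):
        """foreign_tasks: list of other clients' task-specific tensors,
        received from the server and treated as fixed here."""
        weight = self.base + self.task
        for a, ft in zip(self.alpha, foreign_tasks):
            weight = weight + a * ft
        return weight

    def forward(self, x, foreign_tasks):
        return F.linear(x, self.compose(foreign_tasks))

Because only the sparse task-specific tensors are exchanged, each client can select how much of every other client's knowledge to absorb by learning its own alpha values.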
Posted Content

Federated Continual Learning with Adaptive Parameter Communication.

TL;DR: This work proposes a novel federated continual learning framework, Federated Continual Learning with Adaptive Parameter Communication, which additively decomposes the network weights into global shared parameters and sparse task-specific parameters, and allows inter-client knowledge transfer by communicating the sparse task-specific parameters.
Proceedings Article

Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning

TL;DR: In this paper, a federated semi-supervised learning (FSSL) approach is proposed to address data deficiency in real-world federated learning, where the private data at each client may be either partially labeled or completely unlabeled, with labeled data available only at the server.
Proceedings Article

Federated Continual Learning with Weighted Inter-client Transfer

TL;DR: In this paper, the authors propose a federated weighted inter-client transfer (FedWeIT) framework, which decomposes the network weights into global federated parameters and sparse task-specific parameters, and each client receives selective knowledge from other clients by taking a weighted combination of their task-specific parameters.