Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation
Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu +3 more
pp. 2505-2514
TL;DR: The authors propose a cross-domain adaptive clustering loss to group features of unlabeled target data into clusters and perform cluster-wise feature alignment across the source and target domains.
Abstract:
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them. However, the trained model cannot produce a highly discriminative feature representation for the target domain, because the training data is dominated by labeled samples from the source domain. This can lead to disconnection between the labeled and unlabeled target samples, as well as misalignment between unlabeled target samples and the source domain. In this paper, we propose a novel approach called Cross-domain Adaptive Clustering to address this problem. To achieve both inter-domain and intra-domain adaptation, we first introduce an adversarial adaptive clustering loss to group features of unlabeled target data into clusters and perform cluster-wise feature alignment across the source and target domains. We further apply pseudo labeling to unlabeled samples in the target domain and retain pseudo-labels with high confidence. Pseudo labeling expands the number of "labeled" samples in each class in the target domain, and thus produces a more robust and powerful cluster core for each class to facilitate adversarial learning. Extensive experiments on benchmark datasets, including DomainNet, Office-Home and Office, demonstrate that our proposed approach achieves state-of-the-art performance in semi-supervised domain adaptation.
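As a rough illustration of the pseudo-labeling step described in the abstract, here is a minimal pure-Python sketch that keeps only predictions whose top class probability clears a confidence threshold; the 0.95 threshold, function name, and toy probabilities are illustrative, not taken from the paper:

```python
def select_pseudo_labels(probs, threshold=0.95):
    """Return (sample index, pseudo-label) pairs for unlabeled samples
    whose top class probability meets the confidence threshold;
    low-confidence predictions are discarded rather than trained on."""
    kept = []
    for i, p in enumerate(probs):
        conf = max(p)
        if conf >= threshold:
            kept.append((i, p.index(conf)))
    return kept

# Toy softmax outputs for four unlabeled target samples over three classes.
probs = [
    [0.97, 0.02, 0.01],  # confident -> kept as class 0
    [0.40, 0.35, 0.25],  # uncertain -> discarded
    [0.03, 0.96, 0.01],  # confident -> kept as class 1
    [0.30, 0.30, 0.40],  # uncertain -> discarded
]
print(select_pseudo_labels(probs))  # [(0, 0), (2, 1)]
```

The retained pairs then augment the labeled target set, strengthening each class's cluster core as the abstract describes.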
Citations
Journal ArticleDOI
Semi-supervised bidirectional alignment for Remote Sensing cross-domain scene classification
TL;DR: The authors propose a bidirectional sample-class alignment (BSCA) method for remote sensing image scene classification, which consists of two alignment strategies, unsupervised alignment (UA) and supervised alignment (SA), both of which help decrease domain shift.
Proceedings ArticleDOI
Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning
TL;DR: The authors divide the target data into source-like and target-specific samples, with each group treated under tailored goals within an adaptive contrastive learning framework.
Proceedings ArticleDOI
Multi-level Consistency Learning for Semi-supervised Domain Adaptation
TL;DR: The proposed Multi-level Consistency Learning (MCL) framework for SSDA regularizes the consistency of different views of target-domain samples at three levels; at the inter-domain level, it robustly and accurately aligns the source and target domains using a prototype-based optimal transport method.
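Prototype-based optimal-transport alignment of the kind this TL;DR mentions is commonly solved with entropy-regularized Sinkhorn iterations. The following is a minimal pure-Python sketch under that assumption; the cost matrix, `eps`, and iteration count are illustrative, not taken from the cited work:

```python
import math

def sinkhorn_plan(cost, eps=0.1, iters=200):
    """Entropy-regularized optimal-transport plan between two uniform
    marginals via Sinkhorn iterations. `cost[i][j]` is the distance
    between source prototype i and target prototype j."""
    n, m = len(cost), len(cost[0])
    K = [[math.exp(-c / eps) for c in row] for row in cost]  # Gibbs kernel
    u, v = [1.0] * n, [1.0] * m
    a, b = 1.0 / n, 1.0 / m  # uniform marginal weights
    for _ in range(iters):
        # Alternately rescale rows and columns to match the marginals.
        u = [a / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [b / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]

# Two prototypes per domain: matching pairs have low cost.
plan = sinkhorn_plan([[0.0, 1.0], [1.0, 0.0]])
```

The resulting plan concentrates mass on the low-cost (well-matched) prototype pairs while its rows and columns sum to the prescribed marginals.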
Posted Content
Surprisingly Simple Semi-Supervised Domain Adaptation with Pretraining and Consistency.
TL;DR: In this article, the authors show that, in the presence of a few target labels, simple techniques like self-supervision and consistency regularization can be effective, without any adversarial alignment, in learning a good target classifier.
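Consistency regularization, as referenced in this TL;DR, penalizes disagreement between a model's predictions on two augmented views of the same unlabeled sample. A minimal sketch, using mean squared error between class-probability vectors (the function name and inputs are illustrative):

```python
def consistency_loss(p_weak, p_strong):
    """Mean squared error between probability vectors predicted for a
    weakly and a strongly augmented view of the same unlabeled sample;
    minimizing it pushes the two predictions to agree."""
    return sum((a - b) ** 2 for a, b in zip(p_weak, p_strong)) / len(p_weak)

print(consistency_loss([0.9, 0.1], [0.9, 0.1]))  # 0.0  (views agree)
print(consistency_loss([1.0, 0.0], [0.0, 1.0]))  # 1.0  (views disagree)
```

Other divergences (KL, cross-entropy against a sharpened target) are common drop-in alternatives for the same idea.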
Journal ArticleDOI
Context-guided entropy minimization for semi-supervised domain adaptation
Ning Ma, Jiajun Bu, Lixian Lu, Jun Wen, Shenglu Zhou, Zhen Zhang, Jing-Jun Gu, Haifeng Li, Xifeng Yan +8 more
TL;DR: The authors propose to guide entropy minimization via longitudinal self-distillation, which produces a dynamic "teacher" label distribution during training by constructing a graph on target data and performing pseudo-label propagation, encouraging the teacher distribution to capture context category dependency based on a global data structure.
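Entropy minimization, the objective being guided in this work, encourages confident predictions on unlabeled target data by minimizing the Shannon entropy of each predicted class distribution. A minimal sketch of the quantity being minimized:

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a probability vector; it is zero for
    a one-hot (fully confident) prediction and maximal for a uniform one,
    so minimizing it drives predictions toward a single class."""
    return -sum(p * math.log(p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))  # 0.0  (fully confident)
print(entropy([0.5, 0.5]))  # ~0.693, i.e. log(2): maximally uncertain
```

Used naively, this objective can reinforce wrong confident predictions, which is exactly the failure mode the context-guided teacher distribution above is meant to counteract.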
References
Journal Article
Visualizing Data using t-SNE
TL;DR: A new technique called t-SNE that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
Journal ArticleDOI
A Survey on Transfer Learning
Sinno Jialin Pan, Qiang Yang +1 more
TL;DR: The relationship between transfer learning and other related machine learning techniques such as domain adaptation, multitask learning and sample selection bias, as well as covariate shift are discussed.
Book ChapterDOI
Domain-adversarial training of neural networks
Yaroslav Ganin, Evgeniya Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand, Victor Lempitsky +7 more
TL;DR: In this article, a new representation-learning approach is proposed for domain adaptation, where data at training and test time come from similar but different distributions; by promoting features that cannot discriminate between the training (source) and test (target) domains, the approach encourages the emergence of features that remain discriminative for the main learning task on the source domain.
Proceedings ArticleDOI
Adversarial Discriminative Domain Adaptation
TL;DR: Adversarial Discriminative Domain Adaptation (ADDA) combines discriminative modeling, untied weight sharing, and a generative adversarial network (GAN) loss.
Journal ArticleDOI
A kernel two-sample test
TL;DR: This work proposes a framework for analyzing and comparing distributions, which is used to construct statistical tests to determine whether two samples are drawn from different distributions, and presents two distribution-free tests based on large deviation bounds for the maximum mean discrepancy (MMD).
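The maximum mean discrepancy described in this TL;DR can be estimated directly from samples. Below is a minimal pure-Python sketch of the biased squared-MMD estimate with a Gaussian (RBF) kernel; `gamma` and the toy 1-D samples are illustrative:

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian kernel between two scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def mmd_squared(xs, ys, gamma=1.0):
    """Biased estimate of squared MMD between two 1-D samples:
    E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], with expectations
    replaced by averages over all sample pairs."""
    exx = sum(rbf(a, b, gamma) for a in xs for b in xs) / len(xs) ** 2
    eyy = sum(rbf(a, b, gamma) for a in ys for b in ys) / len(ys) ** 2
    exy = sum(rbf(a, b, gamma) for a in xs for b in ys) / (len(xs) * len(ys))
    return exx + eyy - 2 * exy

src = [0.0, 0.1, 0.2]   # toy "source" sample
tgt = [5.0, 5.1, 5.2]   # toy "target" sample, far from the source
print(mmd_squared(src, src))  # 0.0: identical samples
print(mmd_squared(src, tgt))  # large positive value: distributions differ
```

In domain adaptation, this quantity is often minimized between source and target feature distributions as a non-adversarial alignment criterion.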