Open Access Proceedings Article
Information-Theoretical Learning of Discriminative Clusters for Unsupervised Domain Adaptation
TL;DR
In this paper, the authors propose to jointly learn domain-invariant features and a discriminative feature space by optimizing an information-theoretic metric as a proxy for the expected misclassification error on the target domain.
Abstract
We study the problem of unsupervised domain adaptation, which aims to adapt classifiers trained on a labeled source domain to an unlabeled target domain. Many existing approaches first learn domain-invariant features and then construct classifiers with them. We propose a novel approach that jointly learns both. Specifically, while the method identifies a feature space where data in the source and target domains are similarly distributed, it also learns the feature space discriminatively, optimizing an information-theoretic metric as a proxy for the expected misclassification error on the target domain. We show how this optimization can be carried out effectively with simple gradient-based methods and how hyperparameters can be cross-validated without requiring any labeled data from the target domain. Empirical studies on benchmark tasks of object recognition and sentiment analysis validated our modeling assumptions and demonstrated significant improvements of our method over competing ones in classification accuracy.
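The information-theoretic proxy described in the abstract can be illustrated with a small sketch. In discriminative clustering, the mutual information between data points and soft cluster assignments rewards clusterings that are confident for each point yet balanced overall. The code below is a simplified illustration under that reading, not the authors' implementation; the function names and toy data are invented for the example.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of a distribution (or of each row of distributions)."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def mutual_info_proxy(soft_assignments):
    """Mutual information I(x; y) between points and cluster labels:
    H(marginal over clusters) minus the mean per-point entropy.
    High values mean clusters are both balanced and confident."""
    marginal = soft_assignments.mean(axis=0)  # empirical p(y)
    return entropy(marginal) - entropy(soft_assignments).mean()

# Toy check: confident, balanced assignments score higher than fuzzy ones.
confident = np.array([[0.95, 0.05], [0.05, 0.95], [0.9, 0.1], [0.1, 0.9]])
fuzzy = np.array([[0.55, 0.45], [0.45, 0.55], [0.6, 0.4], [0.4, 0.6]])
assert mutual_info_proxy(confident) > mutual_info_proxy(fuzzy)
```

Maximizing such a quantity over target-domain assignments pushes the learned feature space toward well-separated clusters, which is the intuition behind using it as a stand-in for the (unobservable) target misclassification error.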
Citations
Journal Article
A survey of transfer learning
TL;DR: This survey paper formally defines transfer learning, presents information on current solutions, and reviews applications of transfer learning, including its use in big-data environments.
Journal Article
Visual Domain Adaptation: A survey of recent advances
TL;DR: A survey of domain adaptation methods for visual recognition discusses the merits and drawbacks of existing domain adaptation approaches and identifies promising avenues for research in this rapidly evolving field.
Journal Article
A Review of Domain Adaptation without Target Labels
Wouter M. Kouw, Marco Loog, +1 more
TL;DR: In this paper, the authors present a categorization of domain adaptation methods into three types (sample-based, feature-based, and inference-based), covering the setting in which a classifier learns from a source domain and must generalize to a target domain.
Proceedings Article
Transfer Learning in Natural Language Processing.
TL;DR: Transfer learning as discussed by the authors is a set of methods that extend the classical supervised machine learning paradigm by leveraging data from additional domains or tasks to train a model with better generalization properties, which can be used for NLP tasks.
Proceedings Article
Semi-supervised Domain Adaptation with Subspace Learning for visual recognition
TL;DR: A novel domain adaptation framework, named Semi-supervised Domain Adaptation with Subspace Learning (SDASL), which jointly explores invariant low-dimensional structures across domains to correct data distribution mismatch and leverages available unlabeled target examples to exploit the underlying intrinsic information in the target domain.
References
Journal Article
A Survey on Transfer Learning
Sinno Jialin Pan, Qiang Yang, +1 more
TL;DR: The relationship between transfer learning and other related machine learning techniques such as domain adaptation, multitask learning and sample selection bias, as well as covariate shift are discussed.
Book Chapter
SURF: speeded up robust features
TL;DR: A novel scale- and rotation-invariant interest point detector and descriptor, coined SURF (Speeded Up Robust Features), which approximates or even outperforms previously proposed schemes with respect to repeatability, distinctiveness, and robustness, yet can be computed and compared much faster.
Proceedings Article
Distance Metric Learning for Large Margin Nearest Neighbor Classification
TL;DR: In this article, a Mahalanobis distance metric for k-NN classification is trained with the goal that the k nearest neighbors always belong to the same class while examples from different classes are separated by a large margin.
Journal Article
Distance Metric Learning for Large Margin Nearest Neighbor Classification
TL;DR: This paper shows how to learn a Mahalanobis distance metric for kNN classification from labeled examples in a globally integrated manner and finds that metrics trained in this way lead to significant improvements in kNN Classification.
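The Mahalanobis metric referred to in the two entries above generalizes Euclidean distance with a learned positive semidefinite matrix; it is equivalent to Euclidean distance after a linear transform of the inputs. A minimal sketch (the helper name `mahalanobis_sq` is invented for illustration, not from the papers):

```python
import numpy as np

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y), where M is
    positive semidefinite. Writing M = L^T L, this equals the squared
    Euclidean distance between the transformed points Lx and Ly."""
    d = x - y
    return float(d @ M @ d)

# With M = I this reduces to the ordinary squared Euclidean distance.
x, y = np.array([1.0, 2.0]), np.array([3.0, 1.0])
assert np.isclose(mahalanobis_sq(x, y, np.eye(2)), 5.0)
```

Metric-learning methods such as large-margin nearest neighbor optimize M (or L) from labeled examples so that same-class neighbors end up close and different-class points end up far apart under this distance.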
Journal Article
Domain Adaptation via Transfer Component Analysis
TL;DR: This work proposes a novel dimensionality reduction framework for reducing the distance between domains in a latent space for domain adaptation, with both unsupervised and semi-supervised feature extraction approaches that can dramatically reduce the distance between domain distributions by projecting data onto the learned transfer components.
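The "distance between domain distributions" minimized by Transfer Component Analysis is the maximum mean discrepancy (MMD). With a linear kernel, the empirical MMD reduces to the squared distance between the two domains' feature means, which the sketch below computes; this is a simplified illustration only, since TCA itself works with kernel embeddings in an RKHS.

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Empirical maximum mean discrepancy with a linear kernel:
    the squared Euclidean distance between the mean feature vectors
    of the source (Xs) and target (Xt) samples."""
    delta = Xs.mean(axis=0) - Xt.mean(axis=0)
    return float(delta @ delta)

# Identical domains give zero discrepancy; a shifted domain does not.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(50, 3))
assert mmd_linear(Xs, Xs) == 0.0
assert mmd_linear(Xs, Xs + 1.0) > 0.0
```

A projection that minimizes such a discrepancy while preserving data variance yields features on which a classifier trained on the source domain transfers more gracefully to the target domain.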