Open Access · Journal Article (DOI)

A Survey of Unsupervised Deep Domain Adaptation

TLDR
This survey compares single-source and typically homogeneous unsupervised deep domain adaptation approaches, which combine the powerful, hierarchical representations from deep learning with domain adaptation to reduce reliance on potentially costly target data labels.
Abstract
Deep learning has produced state-of-the-art results for a variety of tasks. While such approaches for supervised learning have performed well, they assume that training and testing data are drawn from the same distribution, which may not always be the case. To address this challenge, single-source unsupervised domain adaptation handles situations where a network is trained on labeled data from a source domain and unlabeled data from a related but different target domain, with the goal of performing well at test time on the target domain. Many single-source and typically homogeneous unsupervised deep domain adaptation approaches have thus been developed, combining the powerful, hierarchical representations from deep learning with domain adaptation to reduce reliance on potentially costly target data labels. This survey compares these approaches by examining alternative methods, their unique and common elements, results, and theoretical insights. We follow this with a look at application areas and open research directions.
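Many of the surveyed methods align source and target feature distributions by minimizing a divergence between them. As a minimal sketch (not any specific method from the survey), the snippet below computes a linear maximum mean discrepancy, the squared distance between source and target feature means, and shows it shrinking once the target features are re-centered on the source; all variable names here are illustrative.

```python
import numpy as np

def linear_mmd(source_feats, target_feats):
    """Squared distance between the source and target feature means --
    a simple divergence that distribution-alignment methods minimize."""
    diff = source_feats.mean(axis=0) - target_feats.mean(axis=0)
    return float(diff @ diff)

rng = np.random.default_rng(0)
source = rng.normal(loc=0.0, size=(100, 4))   # labeled source-domain features
target = rng.normal(loc=1.0, size=(100, 4))   # unlabeled, shifted target features

# Crude "adaptation": re-center target features on the source mean.
aligned = target - target.mean(axis=0) + source.mean(axis=0)

print(linear_mmd(source, target) > linear_mmd(source, aligned))  # True
```

In a deep network, this divergence would be computed on learned hidden-layer features and minimized jointly with the source classification loss, rather than by re-centering after the fact.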


Citations
Proceedings ArticleDOI

Neural Unsupervised Domain Adaptation in NLP—A Survey

TL;DR: This survey reviews neural unsupervised domain adaptation techniques that do not require labeled target-domain data, revisits the notion of domain, and uncovers a bias in the types of Natural Language Processing tasks that have received the most attention.
Journal ArticleDOI

Domain Adaptation for Medical Image Analysis: A Survey

TL;DR: This survey of domain adaptation methods for medical image analysis categorizes existing methods into shallow and deep models, each further divided into supervised, semi-supervised, and unsupervised approaches.
Journal ArticleDOI

Secure and Robust Machine Learning for Healthcare: A Survey

TL;DR: The authors present an overview of healthcare application areas that leverage ML/DL from a security and privacy point of view, along with the associated challenges and potential methods for ensuring secure and privacy-preserving ML in healthcare applications.
Proceedings ArticleDOI

A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios

TL;DR: A structured overview of methods that enable learning when training data is sparse, including mechanisms to create additional labeled data, such as data augmentation and distant supervision, as well as transfer learning settings that reduce the need for target supervision.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
Proceedings ArticleDOI

ImageNet: A large-scale hierarchical image database

TL;DR: A new database called “ImageNet” is introduced, a large-scale ontology of images built upon the backbone of the WordNet structure, much larger in scale and diversity and much more accurate than the current image datasets.
Journal ArticleDOI

ImageNet Large Scale Visual Recognition Challenge

TL;DR: The ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is a benchmark for object category classification and detection spanning hundreds of object categories and millions of images, run annually from 2010 to the present and attracting participation from more than fifty institutions.
Journal ArticleDOI

A Survey on Transfer Learning

TL;DR: The relationship between transfer learning and other related machine learning techniques, such as domain adaptation, multitask learning, sample selection bias, and covariate shift, is discussed.
Posted Content

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Batch Normalization normalizes layer inputs for each training mini-batch to reduce internal covariate shift in deep neural networks, achieving state-of-the-art performance on ImageNet.
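The normalization step described above can be sketched in a few lines: each feature is standardized over the mini-batch, then scaled and shifted by learnable parameters. This is a simplified training-mode sketch (scalar gamma/beta, no running statistics), not the full method from the paper.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Standardize each feature over the mini-batch, then scale and shift.

    x: array of shape (batch_size, num_features).
    gamma, beta: learnable scale and shift (scalars here for simplicity).
    eps: small constant for numerical stability.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
batch = rng.normal(loc=5.0, scale=3.0, size=(64, 8))  # mini-batch of activations
out = batch_norm(batch)

print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # True: per-feature mean is ~0
```

At inference time the paper instead uses population statistics accumulated during training, since single examples have no meaningful batch mean or variance.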