Open Access · Proceedings Article
Cross-Language Text Classification Using Structural Correspondence Learning
Peter Prettenhofer, Benno Stein
pp. 1118–1127
Abstract:
We present a new approach to cross-language text classification that builds on structural correspondence learning, a recently proposed theory for domain adaptation. The approach uses unlabeled documents, along with a simple word translation oracle, in order to induce task-specific, cross-lingual word correspondences. We report on analyses that reveal quantitative insights about the use of unlabeled data and the complexity of inter-language correspondence modeling.
We conduct experiments in the field of cross-language sentiment classification, employing English as source language, and German, French, and Japanese as target languages. The results are convincing; they demonstrate both the robustness and the competitiveness of the presented ideas.
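The approach described in the abstract can be illustrated with a small sketch. The toy corpora, pivot pairs, and the use of least-squares pivot predictors are assumptions made for brevity; the paper itself uses linear classifiers trained with a modified Huber loss. The core idea is the same: train one predictor per pivot pair (a word and its translation from the oracle), stack the predictor weights, and take an SVD to obtain a projection into a shared cross-lingual feature space.

```python
import numpy as np

# Toy corpora: "source" (English) and "target" (pseudo-German) documents.
# A shared vocabulary index over both languages is assumed.
src_docs = ["good movie great plot", "bad movie boring plot",
            "great acting good fun", "boring story bad acting"]
tgt_docs = ["gut film spannend", "schlecht film langweilig",
            "gut spannend geschichte", "langweilig schlecht geschichte"]

vocab = sorted({w for d in src_docs + tgt_docs for w in d.split()})
idx = {w: i for i, w in enumerate(vocab)}

def bow(docs):
    """Bag-of-words counts over the shared vocabulary."""
    X = np.zeros((len(docs), len(vocab)))
    for r, d in enumerate(docs):
        for w in d.split():
            X[r, idx[w]] += 1.0
    return X

# Unlabeled documents from both languages, stacked.
X = np.vstack([bow(src_docs), bow(tgt_docs)])

# Word-translation oracle: pivot pairs (source word, target word).
pivots = [("good", "gut"), ("bad", "schlecht")]

# Train one pivot predictor per pair: predict pivot occurrence from the
# remaining (non-pivot) features. Least squares stands in here for the
# modified-Huber classifiers used in the paper.
pivot_cols = {idx[a] for a, b in pivots} | {idx[b] for a, b in pivots}
rest = [j for j in range(len(vocab)) if j not in pivot_cols]

W = []
for a, b in pivots:
    y = (X[:, idx[a]] + X[:, idx[b]] > 0).astype(float)  # pivot present?
    w, *_ = np.linalg.lstsq(X[:, rest], y, rcond=None)
    W.append(w)
W = np.array(W).T  # (n_non_pivot_features, n_pivots)

# SVD of the stacked predictor weights yields theta, a projection of
# non-pivot features onto a low-dimensional cross-lingual space.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
k = 2
theta = U[:, :k]  # (n_non_pivot_features, k)

def project(docs):
    return bow(docs)[:, rest] @ theta

# Words that correlate with the same pivots across languages should land
# near each other in the shared space:
print(project(["great plot"]))
print(project(["spannend film"]))
```

A classifier trained on projected source-language documents can then be applied directly to projected target-language documents, which is what makes the correspondences "task-specific": the pivots are chosen for the classification task at hand.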
Citations
Journal Article · DOI
A survey of transfer learning
TL;DR: This survey paper formally defines transfer learning, presents current solutions, and reviews applications of transfer learning, which can also be applied to big-data environments.
Posted Content
Domain Adaptation for Visual Applications: A Comprehensive Survey
TL;DR: An overview of domain adaptation and transfer learning with a specific focus on visual applications, also covering methods that go beyond image categorization, such as object detection, image segmentation, video analysis, and learning visual attributes.
Journal Article · DOI
Learning With Augmented Features for Supervised and Semi-Supervised Heterogeneous Domain Adaptation
TL;DR: This paper proposes a novel SVM-based method called Heterogeneous Feature Augmentation (HFA), which can simultaneously learn the target classifier and infer the labels of unlabeled target samples; experiments show that both HFA and its semi-supervised extension SHFA outperform existing HDA methods.
Posted Content
Learning with Augmented Features for Heterogeneous Domain Adaptation
Lixin Duan, Dong Xu, Ivor W. Tsang, et al.
TL;DR: A new learning method for heterogeneous domain adaptation (HDA), in which data from the source and target domains are represented by heterogeneous features with different dimensions; experiments demonstrate that HFA outperforms existing HDA methods.
Proceedings Article
Multi-View Clustering and Feature Learning via Structured Sparsity
Hua Wang, Feiping Nie, Heng Huang, et al.
TL;DR: A novel multi-view learning model is proposed that integrates all features and learns a weight for every feature with respect to each cluster individually, via new joint structured sparsity-inducing norms.
References
Proceedings Article · DOI
Cross Language Text Categorization by Acquiring Multilingual Domain Models from Comparable Corpora
Alfio Gliozzo, Carlo Strapparava
TL;DR: This paper proposes a novel approach to the cross-language text categorization problem based on acquiring Multilingual Domain Models from comparable corpora, in a fully unsupervised way and without using any external knowledge source.
Proceedings Article · DOI
Domain Adaptation in Sentiment Classification
TL;DR: One of the most challenging problems in natural language processing, domain adaptation in sentiment classification, is analysed by searching for generic features derived from linguistic patterns, as an alternative to the commonly used feature vectors based on n-grams.