Open Access Proceedings Article

Cross-Language Text Classification Using Structural Correspondence Learning

TLDR
A new approach to cross-language text classification is presented that builds on structural correspondence learning, a recently proposed theory for domain adaptation; it uses unlabeled documents, along with a simple word translation oracle, to induce task-specific, cross-lingual word correspondences.
Abstract
We present a new approach to cross-language text classification that builds on structural correspondence learning, a recently proposed theory for domain adaptation. The approach uses unlabeled documents, along with a simple word translation oracle, in order to induce task-specific, cross-lingual word correspondences. We report on analyses that reveal quantitative insights about the use of unlabeled data and the complexity of inter-language correspondence modeling. We conduct experiments in the field of cross-language sentiment classification, employing English as source language, and German, French, and Japanese as target languages. The results are convincing; they demonstrate both the robustness and the competitiveness of the presented ideas.
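To make the abstract's idea concrete, here is a minimal sketch of how cross-lingual word correspondences could be induced in the spirit of cross-lingual structural correspondence learning, assuming NumPy and scikit-learn. The toy documents, the tiny translation oracle, and all identifiers (e.g. `oracle`, `pivots`, `theta`) are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of cross-lingual SCL: pivot predictors + SVD projection.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import SGDClassifier
from sklearn.svm import LinearSVC

# Toy unlabeled documents in the source (English) and target (German) language.
unlabeled_docs = [
    "great book excellent story",                  # EN
    "terrible book boring story",                  # EN
    "tolles buch ausgezeichnete geschichte",       # DE
    "schreckliches buch langweilige geschichte",   # DE
]
# Labeled source-language documents for the final sentiment task.
labeled_src = ["great excellent story", "boring terrible story"]
labels_src = np.array([1, 0])

# Word translation oracle: maps selected source pivot words to target words.
oracle = {"great": "tolles", "terrible": "schreckliches"}

vec = CountVectorizer().fit(unlabeled_docs)
X_u = vec.transform(unlabeled_docs).toarray()
vocab = vec.vocabulary_

# Each pivot is a (source word, translated word) pair of feature indices.
pivots = [(vocab[s], vocab[t]) for s, t in oracle.items()]

# Train one linear pivot predictor per pivot: predict whether a document
# contains the pivot (in either language) from the remaining features.
W = []
for s_idx, t_idx in pivots:
    y = ((X_u[:, s_idx] > 0) | (X_u[:, t_idx] > 0)).astype(int)
    X_masked = X_u.copy()
    X_masked[:, [s_idx, t_idx]] = 0          # hide the pivot itself
    clf = SGDClassifier(loss="modified_huber", alpha=0.01, random_state=0)
    clf.fit(X_masked, y)
    W.append(clf.coef_.ravel())
W = np.array(W).T                            # shape: (n_features, n_pivots)

# SVD of the stacked pivot-predictor weights yields a low-dimensional,
# cross-lingual projection theta.
U, _, _ = np.linalg.svd(W, full_matrices=False)
theta = U[:, :2]                             # leading left singular vectors

# Train the task classifier on projected labeled source documents ...
X_src = vec.transform(labeled_src).toarray() @ theta
task_clf = LinearSVC().fit(X_src, labels_src)

# ... and apply it directly to projected target-language documents.
X_tgt = vec.transform(["tolles buch", "schreckliches buch"]).toarray() @ theta
print(task_clf.predict(X_tgt))
```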


Citations
Journal ArticleDOI

A language-model-based approach for subjectivity detection

TL;DR: A subjectivity detection method is proposed that uses a language-model-based structure to define a subjectivity score for each document, such that a document's topic relevance does not affect its subjectivity score.
Book ChapterDOI

Transfer Learning Techniques

TL;DR: This survey paper formally defines transfer learning, presents information on current solutions, and reviews applications of transfer learning, which can be applied to big data environments.
Journal ArticleDOI

Cross-lingual sentiment classification

TL;DR: The proposed aligned-translation topic model provides a cross-language representation space in which a semi-supervised training-data adjustment procedure selects effective training samples in order to eliminate the negative influence of semantic distribution differences between the original and target languages.
Proceedings ArticleDOI

Modeling Language Discrepancy for Cross-Lingual Sentiment Analysis

TL;DR: This paper aims to model the language discrepancy in sentiment expressions as intrinsic bilingual polarity correlations (IBPCs) for better cross-lingual sentiment analysis and demonstrates the superiority of the proposed models over several state-of-the-art alternatives.
Journal ArticleDOI

A Survey of Cross-lingual Sentiment Analysis: Methodologies, Models and Evaluations

TL;DR: Cross-lingual sentiment analysis (CLSA), as discussed by the authors, leverages one or more source languages to help low-resource languages perform sentiment analysis, thereby alleviating the lack of annotated corpora in many non-English languages.
References
Journal ArticleDOI

Regularization and variable selection via the elastic net

TL;DR: It is shown that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation, and an algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much like the LARS algorithm does for the lasso.
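As a rough illustration of the elastic net (which penalizes the squared error with a mix of L1 and L2 terms), the sketch below uses scikit-learn's coordinate-descent `ElasticNet` rather than the LARS-EN path algorithm mentioned in the TL;DR; the synthetic data and parameter values are assumptions for demonstration only.

```python
# Minimal elastic-net sketch with scikit-learn (coordinate descent, not LARS-EN).
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
true_coef = np.zeros(20)
true_coef[:3] = [2.0, -1.5, 1.0]            # only three informative features
y = X @ true_coef + 0.1 * rng.normal(size=100)

# alpha scales the total penalty; l1_ratio balances lasso (L1) vs ridge (L2).
model = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print(np.round(model.coef_, 2))             # sparse coefficient estimates
```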

Thumbs up? Sentiment Classification using Machine Learning Techniques

TL;DR: In this paper, the problem of classifying documents not by topic, but by overall sentiment, e.g., determining whether a review is positive or negative, was considered, and three machine learning methods (Naive Bayes, maximum entropy classification, and support vector machines) were employed.
Proceedings ArticleDOI

Thumbs up? Sentiment Classification using Machine Learning Techniques

TL;DR: This work considers the problem of classifying documents not by topic, but by overall sentiment, e.g., determining whether a review is positive or negative, and concludes by examining factors that make the sentiment classification problem more challenging.
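Both listings above summarize the same study of three classifier families for sentiment. As a rough, hedged illustration (not the paper's actual setup), the toy sketch below runs all three on a tiny bag-of-words dataset with scikit-learn, using LogisticRegression as a stand-in for maximum entropy classification.

```python
# Toy comparison of Naive Bayes, a maximum-entropy stand-in, and a linear SVM.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

reviews = ["a wonderful, moving film", "dull plot and awful acting",
           "brilliant performances throughout", "a tedious, predictable mess"]
labels = [1, 0, 1, 0]                        # 1 = positive, 0 = negative

X = CountVectorizer().fit_transform(reviews)  # unigram bag-of-words features

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000), LinearSVC()):
    clf.fit(X, labels)
    print(type(clf).__name__, clf.predict(X))
```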
Proceedings Article

Biographies, Bollywood, Boom-boxes and Blenders: Domain Adaptation for Sentiment Classification

TL;DR: This work extends to sentiment classification the recently-proposed structural correspondence learning (SCL) algorithm, reducing the relative error due to adaptation between domains by an average of 30% over the original SCL algorithm and 46% over a supervised baseline.
Journal ArticleDOI

Pegasos: primal estimated sub-gradient solver for SVM

TL;DR: A simple and effective stochastic sub-gradient descent algorithm is presented for solving the optimization problem cast by Support Vector Machines; it is particularly well suited for large text classification problems and demonstrates an order-of-magnitude speedup over previous SVM learning methods.
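The core of Pegasos is a stochastic sub-gradient step on the regularized hinge loss with a 1/(λt) step size. The sketch below is written from the algorithm's standard description, with toy data and an optional projection step; it is illustrative rather than the authors' reference implementation.

```python
# Pegasos-style stochastic sub-gradient update for a linear SVM (hinge loss).
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for t in range(1, n_iters + 1):
        i = rng.integers(len(y))
        eta = 1.0 / (lam * t)                # decreasing step size 1/(lam*t)
        margin = y[i] * (w @ X[i])
        w *= (1.0 - eta * lam)               # shrinkage from the L2 penalty
        if margin < 1:                       # hinge-loss sub-gradient step
            w += eta * y[i] * X[i]
        norm = np.linalg.norm(w)             # optional projection onto the
        if norm > 1.0 / np.sqrt(lam):        # ball of radius 1/sqrt(lam)
            w *= (1.0 / np.sqrt(lam)) / norm
    return w

# Linearly separable toy data with labels in {-1, +1}.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = pegasos(X, y)
print(np.sign(X @ w))                        # recovers the training labels
```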