scispace - formally typeset
Topic

Probabilistic latent semantic analysis

About: Probabilistic latent semantic analysis is a research topic. Over its lifetime, 2,884 publications have been published within this topic, receiving 198,341 citations. The topic is also known as: PLSA.


Papers
Book Chapter DOI
18 Jun 2011
TL;DR: This paper proposes a new method, based on LSA and a support vector machine (SVM), to improve sentiment classification performance, and shows that the experiment introducing SVM outperforms the other in the precision of polarity analysis.
Abstract: Sentiment classification has attracted increasing interest in natural language processing. The goal of sentiment classification is to automatically identify whether a given piece of text expresses a positive or negative opinion on a topic of interest. Latent semantic analysis (LSA) has been shown to be extremely useful in information retrieval. In this paper, we propose a new method, based on LSA and a support vector machine (SVM), to improve sentiment classification performance. This method takes advantage of both LSA and SVM. During the training process, SVM makes use of its excellent classification ability to conduct the sentiment classification first. To show that our method is feasible and effective, we designed two experiments; the results show that the experiment introducing SVM outperforms the other in the precision of the polarity analysis.

27 citations
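The paper above combines LSA for dimensionality reduction with an SVM classifier. Below is a minimal sketch of that kind of pipeline, assuming scikit-learn is available and using a hypothetical toy corpus; it illustrates the general LSA + SVM approach rather than the authors' exact setup.

```python
# Minimal LSA + SVM sentiment-classification sketch (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Toy documents and labels (1 = positive, 0 = negative); purely illustrative.
docs = [
    "great movie, really enjoyed it",
    "terrible plot and poor acting",
    "a wonderful, uplifting story",
    "boring and far too long",
]
labels = [1, 0, 1, 0]

# TF-IDF -> truncated SVD (LSA latent space) -> linear SVM classifier.
model = make_pipeline(
    TfidfVectorizer(),
    TruncatedSVD(n_components=2, random_state=0),  # LSA: low-rank latent space
    LinearSVC(),
)
model.fit(docs, labels)
print(model.predict(["enjoyed the wonderful acting"]))  # likely [1] (positive)
```

The design choice mirrors the abstract: LSA compresses the term space into a small latent representation, and the SVM then does the actual polarity decision in that space.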

Proceedings Article
01 Apr 2006
TL;DR: This paper presents a method for using LSA to initialize a PLSA model, and investigates the performance of the method for the tasks of text segmentation and retrieval on personal-size corpora.
Abstract: Probabilistic Latent Semantic Analysis (PLSA) models have been shown to provide a better model for capturing polysemy and synonymy than Latent Semantic Analysis (LSA). However, the parameters of a PLSA model are trained using the Expectation Maximization (EM) algorithm, and as a result the trained model is dependent on the initialization values, so performance can be highly variable. In this paper we present a method for using LSA to initialize a PLSA model. We also investigate the performance of our method for the tasks of text segmentation and retrieval on personal-size corpora, and present results demonstrating the efficacy of our proposed approach.

27 citations
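The idea in the paper above is to seed PLSA's EM iterations from an LSA (SVD) decomposition instead of random values. The sketch below shows one simple variant of that idea with NumPy and a hypothetical toy count matrix; it is not the authors' exact initialization scheme.

```python
# PLSA trained with EM, initialized from an LSA (SVD) of the term-document
# matrix instead of at random. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(8, 12)).astype(float)  # toy doc-term counts
n_docs, n_words = X.shape
K = 3  # number of latent topics

# LSA: truncated SVD of the count matrix.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Turn the leading singular vectors into valid probability tables by taking
# magnitudes and normalizing (one simple way to seed EM).
p_z_d = np.abs(U[:, :K]) + 1e-12          # P(z|d), shape (n_docs, K)
p_z_d /= p_z_d.sum(axis=1, keepdims=True)
p_w_z = np.abs(Vt[:K, :]) + 1e-12         # P(w|z), shape (K, n_words)
p_w_z /= p_w_z.sum(axis=1, keepdims=True)

for _ in range(50):  # EM refinement
    # E-step: responsibilities P(z|d,w) proportional to P(z|d) * P(w|z).
    post = p_z_d[:, :, None] * p_w_z[None, :, :]      # (n_docs, K, n_words)
    post /= post.sum(axis=1, keepdims=True) + 1e-12
    # M-step: re-estimate P(w|z) and P(z|d) from expected counts.
    nz = X[:, None, :] * post                          # n(d,w) * P(z|d,w)
    p_w_z = nz.sum(axis=0)
    p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
    p_z_d = nz.sum(axis=2)
    p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12

print(p_w_z.round(3))  # topic-word distributions after EM refinement
```

Because EM only finds a local optimum, starting from the LSA subspace rather than random noise is what makes the trained model less sensitive to initialization, which is the point the abstract makes.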

Journal Article DOI
TL;DR: This work proposes a new semi-supervised GP-LVM framework based on pairwise constraints, which allows a priori constraint information on the latent variables to be incorporated under the maximum a posteriori (MAP) algorithm.

27 citations

Book
01 Jan 1970
TL;DR: The methods presented include not only those that are the most useful in practice, but some chosen because of the interesting ideas they present and the help they give to a general understanding of the latent root and vector problem.
Abstract: This book is concerned with the latent roots and latent vectors of matrices whose elements are real. Its main purpose is to discuss some of the methods available for finding latent roots and vectors. The methods presented include not only those that are the most useful in practice, but some chosen because of the interesting ideas they present and the help they give to a general understanding of the latent root and vector problem. I have attempted throughout to introduce the material in as uncomplicated a manner as possible.

27 citations
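The book above surveys methods for finding latent roots (eigenvalues) and latent vectors (eigenvectors) of real matrices. As a small illustration of one classical method of that kind, here is a power-iteration sketch in NumPy; it is a generic example, not taken from the book.

```python
# Power method for the dominant latent root (eigenvalue) and latent vector
# (eigenvector) of a real matrix. Minimal illustrative sketch.
import numpy as np

def power_method(A, iters=200, tol=1e-10):
    """Estimate the dominant eigenvalue and eigenvector of a real matrix A."""
    x = np.ones(A.shape[0])
    lam = 0.0
    for _ in range(iters):
        y = A @ x
        lam_new = np.linalg.norm(y)
        x = y / lam_new                     # keep the iterate normalized
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    # Rayleigh quotient gives the signed eigenvalue estimate.
    return x @ A @ x / (x @ x), x

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, v = power_method(A)
print(lam)  # ~5.0, the dominant latent root of A (the other root is 2.0)
print(v)    # the corresponding latent vector, up to scaling
```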


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations, 84% related
Feature (computer vision): 128.2K papers, 1.7M citations, 84% related
Support vector machine: 73.6K papers, 1.7M citations, 84% related
Deep learning: 79.8K papers, 2.1M citations, 83% related
Object detection: 46.1K papers, 1.3M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    19
2022    77
2021    14
2020    36
2019    27
2018    58