Journal ArticleDOI

Latent Dirichlet Allocation

TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract: We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
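
To make the generative story concrete, here is a minimal sketch of the process the abstract describes, in Python with numpy; all sizes and hyperparameter values are illustrative toys, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy sizes (illustrative only): vocabulary, topics, documents
    V, K, D = 1000, 20, 5
    alpha = np.full(K, 0.1)                          # Dirichlet prior on topic weights
    beta = rng.dirichlet(np.full(V, 0.01), size=K)   # one word distribution per topic

    docs = []
    for d in range(D):
        theta = rng.dirichlet(alpha)                   # this document's topic mixture
        n_words = rng.poisson(80)                      # document length
        z = rng.choice(K, size=n_words, p=theta)       # a topic for each word slot
        words = [rng.choice(V, p=beta[k]) for k in z]  # a word drawn from that topic
        docs.append(words)

Inference runs this story in reverse: given only the observed words, the paper's variational EM estimates the topic distributions and the posterior over each document's topic mixture.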


Citations
Journal ArticleDOI
TL;DR: Three studies use location and communication sensors to model individual behaviors and symptoms, long-term health outcomes, and the diffusion of opinions in a community, taking advantage of sensing technologies that are now commonplace and readily available.
Abstract: Mobile phones are a pervasive platform for opportunistic sensing of behaviors and opinions. Three studies use location and communication sensors to model individual behaviors and symptoms, long-term health outcomes, and the diffusion of opinions in a community. These three analyses illustrate how mobile phones can unobtrusively monitor rich social interactions, because the underlying sensing technologies are now commonplace and readily available.

237 citations


Cites background or methods from "Latent dirichlet allocation"

  • ...It is important to understand what influences opinion change– is there an underlying mechanism resulting in the opinion change for some people? Can we measure this mechanism, and if so, can we predict future opinion changes from observed behavior? We propose a method for activity modeling based on the Latent Dirichlet Allocation (LDA) [1] topic model to contrast the activities of participants that change opinions with those that do not....

  • ...Topics, discovered by Latent Dirichlet Allocation [1], are essentially clusters of dominating ‘opinion exposures’ present over all individuals and days in the real-life data collection and described in terms of MME features....

Proceedings ArticleDOI
17 Jun 2007
TL;DR: Combining spatial and aspect models significantly improves the region-level classification accuracy, and models trained with image-level labels outperform PLSA trained with pixel-level ones.
Abstract: Considerable advances have been made in learning to recognize and localize visual object classes. Simple bag-of-feature approaches label each pixel or patch independently. More advanced models attempt to improve the coherence of the labellings by introducing some form of inter-patch coupling: traditional spatial models such as MRF's provide crisper local labellings by exploiting neighbourhood-level couplings, while aspect models such as PLSA and LDA use global relevance estimates (global mixing proportions for the classes appearing in the image) to shape the local choices. We point out that the two approaches are complementary, combining them to produce aspect-based spatial field models that outperform both approaches. We study two spatial models: one based on averaging over forests of minimal spanning trees linking neighboring image regions, the other on an efficient chain-based Expectation Propagation method for regular 8-neighbor Markov random fields. The models can be trained using either patch-level labels or image-level keywords. As input features they use factored observation models combining texture, color and position cues. Experimental results on the MSR Cambridge data sets show that combining spatial and aspect models significantly improves the region-level classification accuracy. In fact our models trained with image-level labels outperform PLSA trained with pixel-level ones.
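
To illustrate the inter-patch coupling the abstract contrasts with independent labelling, here is a hedged sketch of one simple spatial model: iterated conditional modes (ICM) under a Potts smoothness prior. It is a simpler relative of the tree-averaging and Expectation Propagation models studied in the paper, and the function name and smoothing weight are inventions of this sketch.

    import numpy as np

    def icm_smooth(log_prob, n_iters=5, w=1.0):
        """Iterated conditional modes with a Potts smoothness prior.

        log_prob: (H, W, K) per-patch class log-probabilities from an
        aspect model. Each patch's label trades off its own evidence
        against agreement with its four neighbours, weighted by w.
        """
        H, W, K = log_prob.shape
        labels = log_prob.argmax(axis=2)          # independent initialisation
        for _ in range(n_iters):
            for i in range(H):
                for j in range(W):
                    score = log_prob[i, j].copy()
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W:
                            score[labels[ni, nj]] += w   # reward agreement
                    labels[i, j] = score.argmax()
        return labels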

237 citations


Cites background from "Latent dirichlet allocation"

  • ...Unlike multi-modal LDA [4], in our model the three modalities of each patch share a single common topic....

  • ...LDA provides additional regularization by encouraging the topic mixtures to be sparse and by averaging over their weights, but this only makes a significant difference for small documents and many topics, Nd ≪ T ....

  • ...Notably, LDA adds a sparse (Dirichlet) prior for the topic weights θd and treats these as hidden variables to be integrated out rather than as parameters to be estimated using Maximum Likelihood for each document....

  • ...Aspect models such as PLSA and LDA ignore the spatial structure of the image, modeling its patches as independent draws from the topic mixture θd....

  • ...Aspect models such as PLSA and LDA are probabilistic models that are well suited to this situation....

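The excerpts above all turn on one modeling choice: LDA puts a sparse Dirichlet prior on the per-document topic weights θd and integrates them out, rather than fitting them by maximum likelihood per document as PLSA does. In the LDA paper's notation, the resulting marginal probability of a document's words w1, ..., wN is:

    p(w_1, \ldots, w_N \mid \alpha, \beta)
      = \int p(\theta \mid \alpha)
        \left[ \prod_{n=1}^{N} \sum_{k=1}^{K}
               p(z_n = k \mid \theta)\, p(w_n \mid z_n = k, \beta) \right] d\theta

Because the integral couples all the words in a document, exact inference is intractable, which is what motivates the variational and sampling approximations discussed throughout this page.
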
Posted Content
TL;DR: Stochastic regeneration is shown to achieve linear runtime scaling in cases where many previous approaches scaled quadratically, and stochastic regeneration and the SPI are shown to support general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.
Abstract: We describe Venture, an interactive virtual machine for probabilistic programming that aims to be sufficiently expressive, extensible, and efficient for general-purpose use. Like Church, probabilistic models and inference problems in Venture are specified via a Turing-complete, higher-order probabilistic language descended from Lisp. Unlike Church, Venture also provides a compositional language for custom inference strategies built out of scalable exact and approximate techniques. We also describe four key aspects of Venture's implementation that build on ideas from probabilistic graphical models. First, we describe the stochastic procedure interface (SPI) that specifies and encapsulates primitive random variables. The SPI supports custom control flow, higher-order probabilistic procedures, partially exchangeable sequences and "likelihood-free" stochastic simulators. It also supports external models that do inference over latent variables hidden from Venture. Second, we describe probabilistic execution traces (PETs), which represent execution histories of Venture programs. PETs capture conditional dependencies, existential dependencies and exchangeable coupling. Third, we describe partitions of execution histories called scaffolds that factor global inference problems into coherent sub-problems. Finally, we describe a family of stochastic regeneration algorithms for efficiently modifying PET fragments contained within scaffolds. Stochastic regeneration achieves linear runtime scaling in cases where many previous approaches scaled quadratically. We show how to use stochastic regeneration and the SPI to implement general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.
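
As a reference point for the inference strategies listed at the end of the abstract, here is a minimal generic random-walk Metropolis-Hastings sampler in Python. It shows only the textbook algorithm; it is not Venture's actual API, and every name in it is invented for this sketch.

    import numpy as np

    def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=0):
        """Random-walk Metropolis-Hastings over a scalar state."""
        rng = np.random.default_rng(seed)
        x, lp = x0, log_target(x0)
        samples = []
        for _ in range(n_samples):
            proposal = x + step * rng.standard_normal()
            lp_prop = log_target(proposal)
            if np.log(rng.uniform()) < lp_prop - lp:   # accept w.p. min(1, ratio)
                x, lp = proposal, lp_prop
            samples.append(x)
        return np.array(samples)

    # e.g. samples from a standard normal:
    # metropolis_hastings(lambda x: -0.5 * x * x, 0.0, n_samples=1000)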

237 citations


Cites methods from "Latent dirichlet allocation"

  • ...applications of several advanced modeling and inference techniques; examples include generative probabilistic graphics programming (Mansinghka*, Kulkarni*, Perov, and Tenenbaum, 2013) and topic modeling (Blei et al., 2003). A description of these and other applications is beyond the scope of this paper. 2.9.1 HIDDEN MARKOV MODELS To represent a Hidden Markov model in Venture, one can use a stochastic recursion to capture...

  • ...Venture has also been used to implement applications of several advanced modeling and inference techniques; examples include generative probabilistic graphics programming (Mansinghka*, Kulkarni*, Perov, and Tenenbaum, 2013) and topic modeling (Blei et al., 2003)....

Proceedings Article
01 Dec 2009
TL;DR: Two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – are introduced, and their runtime and performance are compared to those of existing algorithms.
Abstract: Inference algorithms for topic models are typically designed to be run over an entire collection of documents after they have been observed. However, in many applications of these models, the collection grows over time, making it infeasible to run batch algorithms repeatedly. This problem can be addressed by using online algorithms, which update estimates of the topics as each document is observed. We introduce two related Rao-Blackwellized online inference algorithms for the latent Dirichlet allocation (LDA) model – incremental Gibbs samplers and particle filters – and compare their runtime and performance to those of existing algorithms.
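
For orientation, the batch procedure that the paper's incremental samplers extend is the collapsed Gibbs sweep of Griffiths and Steyvers (2004). The sketch below shows one such sweep over a single document; the count-array names and calling convention are this sketch's own, not the paper's.

    import numpy as np

    def gibbs_sweep(doc_words, z, ndk, nkw, nk, alpha, beta, rng):
        """One collapsed Gibbs sweep over a single document's words.

        doc_words: word ids; z: current topic of each word;
        ndk: topic counts for this document (K,); nkw: topic-word counts (K, V);
        nk: total words per topic (K,). All counts include the current z.
        """
        V = nkw.shape[1]
        for i, w in enumerate(doc_words):
            k = z[i]
            ndk[k] -= 1; nkw[k, w] -= 1; nk[k] -= 1   # remove word's assignment
            p = (ndk + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            k = rng.choice(len(p), p=p / p.sum())     # resample its topic
            z[i] = k
            ndk[k] += 1; nkw[k, w] += 1; nk[k] += 1   # add it back
        return z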

237 citations


Cites methods from "Latent dirichlet allocation"

  • ...Various algorithms have been proposed for solving this problem, including a variational Expectation-Maximization algorithm (Blei et al., 2003) and Expectation-Propagation (Minka and Lafferty, 2002)....

  • ...We discuss algorithms for a particular topic model: latent Dirichlet allocation (LDA) (Blei et al., 2003)....

  • ...Latent Dirichlet allocation (Blei et al., 2003) is widely used for identifying the topics in a set of documents, building on previous work by Hofmann (1999)....

  • ...A number of algorithms exist for solving this problem (e.g., Hofmann, 1999; Blei et al., 2003; Minka and Lafferty, 2002; Griffiths and Steyvers, 2004), most of which are intended to be run in “batch” mode, being applied to all the documents once they are collected....

Journal ArticleDOI
TL;DR: Public opinion in the early stages of COVID-19 in China is explored by analyzing Sina-Weibo texts in terms of space, time, and content, to better understand public opinion and sentiment towards COVID-19, to accelerate emergency responses, and to support post-disaster management.
Abstract: The outbreak of Corona Virus Disease 2019 (COVID-19) is a grave global public health emergency. Nowadays, social media has become the main channel through which the public can obtain information and express their opinions and feelings. This study explored public opinion in the early stages of COVID-19 in China by analyzing Sina-Weibo (a Twitter-like microblogging system in China) texts in terms of space, time, and content. Temporal changes within one-hour intervals and the spatial distribution of COVID-19-related Weibo texts were analyzed. Based on the latent Dirichlet allocation model and the random forest algorithm, a topic extraction and classification model was developed to hierarchically identify seven COVID-19-relevant topics and 13 sub-topics from Weibo texts. The results indicate that the number of Weibo texts varied over time for different topics and sub-topics corresponding with the different developmental stages of the event. The spatial distribution of COVID-19-relevant Weibo was mainly concentrated in Wuhan, Beijing-Tianjin-Hebei, the Yangtze River Delta, the Pearl River Delta, and the Chengdu-Chongqing urban agglomeration. There is a synchronization between frequent daily discussions on Weibo and the trend of the COVID-19 outbreak in the real world. Public response is very sensitive to the epidemic and significant social events, especially in urban agglomerations with convenient transportation and a large population. The timely dissemination and updating of epidemic-related information and the popularization of such information by the government can contribute to stabilizing public sentiments. However, the surge of public demand and the hysteresis of social support demonstrated that the allocation of medical resources was under enormous pressure in the early stage of the epidemic. It is suggested that the government should strengthen the response in terms of public opinion and epidemic prevention and exert control in key epidemic areas, urban agglomerations, and transboundary areas at the province level. In controlling the crisis, accurate response countermeasures should be formulated following public help demands. The findings can help government and emergency agencies to better understand the public opinion and sentiments towards COVID-19, to accelerate emergency responses, and to support post-disaster management.
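
A minimal sketch of the two-stage pipeline the abstract describes, pairing LDA topic mixtures with a random forest classifier. It uses scikit-learn as a stand-in; the feature cap, tree count, and function name are placeholder choices, not the authors' settings.

    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_extraction.text import CountVectorizer

    def build_topic_classifier(texts, labels, n_topics=7):
        """Fit LDA on word counts, then a random forest on the topic mixtures.

        texts: pre-segmented posts (whitespace-tokenisable strings);
        labels: annotated topic classes. Both are caller-supplied placeholders.
        """
        vectorizer = CountVectorizer(max_features=5000)
        counts = vectorizer.fit_transform(texts)
        lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
        doc_topics = lda.fit_transform(counts)             # docs as topic mixtures
        forest = RandomForestClassifier(n_estimators=200)  # tree count: placeholder
        forest.fit(doc_topics, labels)
        return vectorizer, lda, forest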

237 citations


Cites background from "Latent dirichlet allocation"

  • ...In LDA, documents are represented as random mixtures of latent topics, each of which is characterized by a distribution of words [23]....

  • ...The number of classification trees (n_estimators) was an important parameter for classification accuracy [23]....

References
Book
01 Jan 1995
TL;DR: A comprehensive textbook on Bayesian data analysis, covering the fundamentals of Bayesian inference, model checking, Bayesian computation and Markov chain simulation, regression models, and nonlinear and nonparametric models.
Abstract: FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear Models; Generalized Linear Models; Models for Robust Inference; Models for Missing Data. NONLINEAR AND NONPARAMETRIC MODELS: Parametric Nonlinear Models; Basis Function Models; Gaussian Process Models; Finite Mixture Models; Dirichlet Process Models. APPENDICES: A: Standard Probability Distributions; B: Outline of Proofs of Asymptotic Theorems; C: Computation in R and Stan. Bibliographic notes and exercises appear at the end of each chapter.

16,079 citations


"Latent dirichlet allocation" refers background in this paper

  • ...Finally, Griffiths and Steyvers (2002) have presented a Markov chain Monte Carlo algorithm for LDA....

  • ...Structures similar to that shown in Figure 1 are often studied in Bayesian statistical modeling, where they are referred to as hierarchical models (Gelman et al., 1995), or more precisely as conditionally independent hierarchical models (Kass and Steffey, 1989)....

Journal ArticleDOI
TL;DR: A new method for automatic indexing and retrieval is proposed that takes advantage of implicit higher-order structure in the association of terms with documents ("semantic structure") in order to improve the detection of relevant documents on the basis of terms found in queries.
Abstract: A new method for automatic indexing and retrieval is described. The approach is to take advantage of implicit higher-order structure in the association of terms with documents (“semantic structure”) in order to improve the detection of relevant documents on the basis of terms found in queries. The particular technique used is singular-value decomposition, in which a large term by document matrix is decomposed into a set of ca. 100 orthogonal factors from which the original matrix can be approximated by linear combination. Documents are represented by ca. 100 item vectors of factor weights. Queries are represented as pseudo-document vectors formed from weighted combinations of terms, and documents with supra-threshold cosine values are returned. Initial tests find this completely automatic method for retrieval to be promising.
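
A brief numpy sketch of the mechanics described here: a truncated SVD of the term-by-document matrix, plus fold-in of a query as a pseudo-document. The function names and the choice of k are illustrative assumptions.

    import numpy as np

    def lsi_factors(term_doc, k=100):
        """Truncated SVD of a term-by-document matrix (terms on rows)."""
        U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
        return U[:, :k], s[:k], Vt[:k]    # term factors, weights, doc coordinates

    def fold_in_query(query_counts, U_k, s_k):
        """Map a query's term-count vector to a pseudo-document in factor space."""
        return query_counts @ U_k / s_k   # q_hat = q^T U_k S_k^{-1}

    # Documents are then ranked by cosine similarity between the folded-in
    # query and each column of Vt (each document's k-dimensional vector).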

12,443 citations


"Latent dirichlet allocation" refers methods in this paper

  • ...To address these shortcomings, IR researchers have proposed several other dimensionality reduction techniques, most notably latent semantic indexing (LSI) (Deerwester et al., 1990)....

Book
01 Jan 1983
Introduction to Modern Information Retrieval (Salton and McGill)

12,059 citations


"Latent dirichlet allocation" refers background or methods in this paper

  • ...In the popular tf-idf scheme (Salton and McGill, 1983), a basic vocabulary of “words” or “terms” is chosen, and, for each document in the corpus, a count is formed of the number of occurrences of each word....

  • ...We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model....

Book
01 Jan 1939
TL;DR: This book develops probability theory from fundamental notions, covering direct probabilities, estimation problems, approximate methods and simplifications, significance tests for one new parameter and for various complications, and frequency definitions and direct methods.
Abstract: 1. Fundamental notions 2. Direct probabilities 3. Estimation problems 4. Approximate methods and simplifications 5. Significance tests: one new parameter 6. Significance tests: various complications 7. Frequency definitions and direct methods 8. General questions

7,086 citations