Report
A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation
Yee Whye Teh, David Newman, Max Welling
Vol. 19, pp. 1353–1360
TL;DR: This paper proposes the collapsed variational Bayesian inference algorithm for LDA, and shows that it is computationally efficient, easy to implement, and significantly more accurate than standard variational Bayesian inference for LDA.
Abstract: Latent Dirichlet allocation (LDA) is a Bayesian network that has recently gained much popularity in applications ranging from document modeling to computer vision. Due to the large-scale nature of these applications, current inference procedures like variational Bayes and Gibbs sampling have been found lacking. In this paper we propose the collapsed variational Bayesian inference algorithm for LDA, and show that it is computationally efficient, easy to implement, and significantly more accurate than standard variational Bayesian inference for LDA.
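The abstract describes marginalizing out the model parameters and maintaining a variational distribution over each token's topic assignment. As a rough illustration only, here is a sketch of the simpler zeroth-order (CVB0) variant of this idea, which keeps the collapsed expected counts but omits the second-order correction terms the paper derives; all function and variable names below are hypothetical, not from the paper:

```python
import numpy as np

def cvb0(docs, K, V, alpha=0.1, beta=0.01, iters=50, seed=0):
    """Zeroth-order collapsed variational Bayes (CVB0) sketch for LDA.

    docs: list of documents, each a list of word ids in [0, V).
    Maintains gamma[d][n], a distribution over the topic of token n in
    document d, plus the expected counts those distributions imply.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    # gamma[d] is an (N_d, K) matrix of per-token topic probabilities
    gamma = [rng.dirichlet(np.ones(K), size=len(doc)) for doc in docs]
    n_dk = np.zeros((D, K))   # expected doc-topic counts
    n_kw = np.zeros((K, V))   # expected topic-word counts
    n_k = np.zeros(K)         # expected topic totals
    for d, doc in enumerate(docs):
        for n, w in enumerate(doc):
            n_dk[d] += gamma[d][n]
            n_kw[:, w] += gamma[d][n]
            n_k += gamma[d][n]
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for n, w in enumerate(doc):
                g = gamma[d][n]
                # Remove this token's own contribution (the "collapsed",
                # leave-one-out counts), as in collapsed Gibbs sampling.
                n_dk[d] -= g; n_kw[:, w] -= g; n_k -= g
                # CVB0 update: proportional to the expected conditional
                # probability under the collapsed counts.
                g = (n_kw[:, w] + beta) / (n_k + V * beta) * (n_dk[d] + alpha)
                g /= g.sum()
                gamma[d][n] = g
                n_dk[d] += g; n_kw[:, w] += g; n_k += g
    return gamma, n_dk, n_kw
```

Each update divides out the token's own expected counts before recomputing its topic distribution, mirroring the leave-one-out structure of collapsed Gibbs sampling while remaining deterministic.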
Citations
Proceedings Article
Dynamic topic models
David M. Blei, John Lafferty
TL;DR: A family of probabilistic time series models is developed to analyze the time evolution of topics in large document collections, and dynamic topic models provide a qualitative window into the contents of a large document collection.
Journal Article
Stochastic variational inference
TL;DR: Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.
Journal Article
Mixed Membership Stochastic Blockmodels
TL;DR: In this article, the authors introduce a class of latent variable models for pairwise measurements, called mixed membership stochastic blockmodels, which combine global parameters that instantiate dense patches of connectivity (blockmodel) with local parameters (mixed membership), and develop a general variational inference algorithm for fast approximate posterior inference.
Posted Content
Mixed membership stochastic blockmodels
TL;DR: The mixed membership stochastic block model as discussed by the authors extends block models for relational data to ones which capture mixed membership latent relational structure, thus providing an object-specific low-dimensional representation.
Book
Bayesian Reasoning and Machine Learning
TL;DR: Comprehensive and coherent, this hands-on text develops everything from basic reasoning to advanced techniques within the framework of graphical models, and builds the analytical and problem-solving skills that equip readers for the real world.
References
Journal Article
Latent Dirichlet allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Proceedings Article
Latent Dirichlet Allocation
TL;DR: This paper proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models, including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model, also known as probabilistic latent semantic indexing (pLSI).
Journal Article
Finding scientific topics
TL;DR: A generative model for documents is described, introduced by Blei, Ng, and Jordan, and a Markov chain Monte Carlo algorithm is presented for inference in this model, which is used to analyze abstracts from PNAS by using Bayesian model selection to establish the number of topics.
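The TL;DR above summarizes a Markov chain Monte Carlo approach to LDA, in which each token's topic is resampled from its conditional distribution given all other assignments. A minimal collapsed Gibbs sampling sketch of that idea (hypothetical function name, symmetric `alpha`/`beta` priors assumed) might be:

```python
import numpy as np

def collapsed_gibbs_lda(docs, K, V, alpha=0.1, beta=0.01, iters=100, seed=0):
    """Collapsed Gibbs sampler sketch for LDA.

    docs: list of documents, each a list of word ids in [0, V).
    Resamples each token's hard topic assignment z from its conditional
    given the current counts of all other tokens.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    z = [rng.integers(K, size=len(doc)) for doc in docs]  # topic assignments
    n_dk = np.zeros((D, K))   # doc-topic counts
    n_kw = np.zeros((K, V))   # topic-word counts
    n_k = np.zeros(K)         # topic totals
    for d, doc in enumerate(docs):
        for n, w in enumerate(doc):
            k = z[d][n]
            n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for n, w in enumerate(doc):
                k = z[d][n]
                # Remove this token's count, then sample from the
                # leave-one-out conditional distribution over topics.
                n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                p = (n_kw[:, w] + beta) / (n_k + V * beta) * (n_dk[d] + alpha)
                k = rng.choice(K, p=p / p.sum())
                z[d][n] = k
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
    return z, n_dk, n_kw
```

The conditional has the same leave-one-out form as the variational update in the main paper's collapsed scheme, but draws a single hard assignment per token instead of maintaining a distribution.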
Proceedings Article
A Bayesian hierarchical model for learning natural scene categories
Li Fei-Fei, Pietro Perona
TL;DR: This work proposes a novel approach to learn and recognize natural scene categories by representing the image of a scene by a collection of local regions, denoted as codewords obtained by unsupervised learning.
Dissertation
Variational Algorithms for Approximate Bayesian Inference
TL;DR: A unified variational Bayesian (VB) framework which approximates computations in models with latent variables using a lower bound on the marginal likelihood and is compared to other methods including sampling, Cheeseman-Stutz, and asymptotic approximations such as BIC.