Open Access Journal ArticleDOI

Latent Dirichlet Allocation

TLDR
This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract
We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
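For readers who want to experiment with the model, the following is a minimal sketch (not the authors' original implementation) that fits an LDA topic model using scikit-learn's LatentDirichletAllocation, whose batch mode runs a variational EM procedure in the spirit of the inference described above. The toy corpus and parameter choices are illustrative assumptions.

# Minimal LDA sketch with scikit-learn (assumed setup, not the paper's code).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock markets fell sharply today",
    "investors worry about market volatility",
]

# Term-count matrix: each document becomes a bag-of-words vector.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)

# Two latent topics; learning_method="batch" runs variational EM over the corpus.
lda = LatentDirichletAllocation(n_components=2, learning_method="batch", random_state=0)
doc_topics = lda.fit_transform(X)  # per-document topic proportions

# Top words per topic, read off the fitted topic-word distributions.
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-3:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])

The per-document topic proportions returned by fit_transform correspond to the document-level topic mixture the abstract describes, and components_ holds the topic-word weights.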



Citations
Proceedings Article

Privacy for Free: Posterior Sampling and Stochastic Gradient Monte Carlo

TL;DR: It is shown that, under standard assumptions, drawing one sample from a posterior distribution is differentially private "for free"; this sample, used as a statistical estimator, is often consistent, near optimal, and computationally tractable; and these observations lead to an "anytime" algorithm for Bayesian learning under a privacy constraint.
Proceedings Article

Non-linear Metric Learning

TL;DR: These methods not only match the current state of the art in kNN classification error but, in the case of χ²-LMNN, obtain the best results in 19 out of 20 learning settings.
Journal ArticleDOI

Studying User Income through Language, Behaviour and Affect in Social Media

TL;DR: This paper presents the first extensive study in which user behaviour on Twitter is used to build a predictive model of income, applying non-linear regression methods (Gaussian Processes) and achieving strong correlation between predicted and actual user income.
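As an illustration of the kind of non-linear regression this entry refers to, the sketch below fits a Gaussian Process regressor with scikit-learn on synthetic placeholder features; it is not the paper's pipeline or data.

# Illustrative GP regression sketch (assumed setup, synthetic data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # stand-in for aggregated user-level features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)

# RBF kernel captures smooth non-linear structure; WhiteKernel models observation noise.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X, y)

mean, std = gpr.predict(X[:5], return_std=True)   # predictions with uncertainty estimates
print(np.corrcoef(gpr.predict(X), y)[0, 1])       # correlation between predicted and actual values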
Proceedings ArticleDOI

Unfolding physiological state: mortality modelling in intensive care units

TL;DR: This work examined the use of latent variable models to decompose free-text hospital notes into meaningful features, and found that latent topic-derived features were effective in predicting patient mortality over three timelines: in-hospital, 30-day post-discharge, and 1-year post-discharge mortality.
Proceedings Article

Visualizing Topic Models

TL;DR: This paper creates a navigator of the documents, allowing users to explore the hidden structure that a topic model discovers, and reveals meaningful patterns in a collection, helping end-users explore and understand its contents in new ways.
References
Book

Bayesian Data Analysis

TL;DR: Detailed notes are provided on Bayesian computation, the basics of Markov chain simulation, regression models, and asymptotic theorems.
Journal ArticleDOI

Indexing by Latent Semantic Analysis

TL;DR: A new method for automatic indexing and retrieval is described that takes advantage of implicit higher-order structure in the association of terms with documents ("semantic structure") to improve the detection of relevant documents on the basis of the terms found in queries.
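A minimal sketch of the latent semantic indexing idea (an assumed setup, not the paper's original SVD code): a truncated SVD of a term-document matrix projects documents and queries into a low-dimensional "semantic" space, where they can be matched by cosine similarity.

# Illustrative LSA/LSI sketch with scikit-learn (assumed setup).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "human computer interaction",
    "user interface design for computers",
    "graph theory and trees",
    "paths in random graphs",
]

# Term-document representation, then a rank-2 SVD projection.
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)
svd = TruncatedSVD(n_components=2, random_state=0)
doc_vecs = svd.fit_transform(X)

# Queries are folded into the same latent space and matched by cosine similarity.
query_vec = svd.transform(tfidf.transform(["computer user interfaces"]))
print(cosine_similarity(query_vec, doc_vecs))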
Book

Introduction to Modern Information Retrieval

Book

Theory of probability

TL;DR: In this book, the author introduces direct probabilities, approximate methods and simplifications, significance tests for one new parameter and for various complications, and frequency definitions and direct methods.