Open Access · Journal Article · DOI

Latent Dirichlet Allocation

TLDR
This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract
We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
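The generative process summarized in the abstract can be made concrete with a short simulation. The following is a minimal sketch, not code from the paper: the corpus size, vocabulary size, number of topics, document length, and the symmetric Dirichlet hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the LDA generative process described in the abstract.
# All sizes and the symmetric Dirichlet hyperparameters (alpha, eta) are
# illustrative assumptions, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(seed=0)

n_docs, n_topics, vocab_size, doc_len = 100, 5, 1000, 50
alpha, eta = 0.1, 0.01  # document-topic and topic-word Dirichlet parameters (assumed)

# Each topic is a distribution over the vocabulary.
topic_word = rng.dirichlet(eta * np.ones(vocab_size), size=n_topics)

corpus = []
for _ in range(n_docs):
    # Each document is a finite mixture over the underlying set of topics.
    theta = rng.dirichlet(alpha * np.ones(n_topics))
    doc = []
    for _ in range(doc_len):
        z = rng.choice(n_topics, p=theta)            # draw a topic for this word
        w = rng.choice(vocab_size, p=topic_word[z])  # draw a word from that topic
        doc.append(w)
    corpus.append(doc)
```

The inference problem described in the abstract runs in the opposite direction: given only the observed words, variational EM estimates the topic-word distributions and the per-document topic proportions. Off-the-shelf implementations of this kind of approximate inference are available in, for example, scikit-learn (LatentDirichletAllocation) and gensim (LdaModel).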


Citations
Proceedings Article

Distributed Inference for Latent Dirichlet Allocation

TL;DR: Using five real-world text corpora, this work shows that distributed learning works very well for LDA models: perplexity and precision-recall scores for distributed learning are indistinguishable from those obtained with single-processor learning.
Journal Article · DOI

An overview of topic modeling and its current applications in bioinformatics

TL;DR: Topic modeling is a useful method (in contrast to traditional means of data reduction in bioinformatics) that enhances researchers’ ability to interpret biological information, but studies of topic modeling on biological data still have a long and challenging road ahead.
Journal Article · DOI

The Evolution of 10-K Textual Disclosure: Evidence from Latent Dirichlet Allocation

TL;DR: This work uses Latent Dirichlet Allocation (LDA) to examine specific topics in 10-K filings and finds that new FASB and SEC requirements explain most of the increase in length, and that 3 of the 150 topics (fair value, internal controls, and risk factor disclosures) account for virtually all of the increase.
Monograph · DOI

Foundations of Data Science

TL;DR: Computer science as an academic discipline began in the 1960s with an emphasis on programming languages, compilers, operating systems, and the mathematical theory that supported these areas, but today a fundamental change is taking place and the focus is more on applications.
Journal Article · DOI

What are mobile developers asking about? A large scale study using Stack Overflow

TL;DR: This paper uses data from the popular online Q&A site Stack Overflow and analyzes 13,232,821 posts to examine what mobile developers ask about, establishing a novel approach for analyzing questions asked on Q&A forums.
References
Book

Bayesian Data Analysis

TL;DR: Detailed notes on Bayesian Computation, Basics of Markov Chain Simulation, Regression Models, and Asymptotic Theorems are provided.
Journal Article · DOI

Indexing by Latent Semantic Analysis

TL;DR: A new method for automatic indexing and retrieval that takes advantage of implicit higher-order structure in the association of terms with documents (“semantic structure”) to improve the detection of relevant documents on the basis of terms found in queries.
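The mechanism summarized above, exploiting higher-order term-document structure rather than literal term matching, is commonly illustrated with a truncated SVD of the term-document matrix. The sketch below uses scikit-learn; the toy corpus, the query, and the choice of two latent dimensions are assumptions for illustration, not material from the cited paper.

```python
# Minimal sketch of latent semantic indexing: documents and queries are
# projected into a low-rank "semantic" space via a truncated SVD of the
# term-document matrix. The toy documents, the query, and the choice of
# two latent dimensions are assumptions made for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "human machine interface for computer applications",
    "a survey of user opinion of computer system response time",
    "graph minors and the intersection of trees",
]
query = ["user interface for computer systems"]

vectorizer = TfidfVectorizer()
doc_term = vectorizer.fit_transform(docs)        # documents x terms matrix

svd = TruncatedSVD(n_components=2, random_state=0)
doc_vecs = svd.fit_transform(doc_term)           # documents in the latent space
query_vec = svd.transform(vectorizer.transform(query))

# Rank documents by cosine similarity to the query in the latent space;
# relevant documents can be found even when they share few literal query terms.
print(cosine_similarity(query_vec, doc_vecs))
```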
Book

Introduction to Modern Information Retrieval

Book

Theory of Probability

TL;DR: In this work, the authors introduce direct probabilities, approximate methods and simplifications, significance tests for one new parameter and for various complications, and frequency definitions and direct methods.