Open Access Journal Article (DOI)

Latent Dirichlet Allocation

TL;DR
This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract
We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
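As a rough, hedged illustration of the generative process the abstract describes, here is a minimal NumPy sketch; the topic count, vocabulary size, document length, and hyperparameter values are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (assumed, not from the paper).
n_topics, vocab_size, doc_len = 3, 1000, 50
alpha = np.full(n_topics, 0.1)    # Dirichlet prior over per-document topic mixtures
eta = np.full(vocab_size, 0.01)   # Dirichlet prior over per-topic word distributions

# One distribution over the vocabulary per topic (the paper's beta matrix).
beta = rng.dirichlet(eta, size=n_topics)    # shape: (n_topics, vocab_size)

def generate_document():
    """Sample one document: theta ~ Dir(alpha); then for each word,
    draw a topic z ~ Multinomial(theta) and a word w ~ Multinomial(beta[z])."""
    theta = rng.dirichlet(alpha)                       # topic mixture for this document
    z = rng.choice(n_topics, size=doc_len, p=theta)    # per-word topic assignments
    return [int(rng.choice(vocab_size, p=beta[k])) for k in z]

print(generate_document()[:10])    # first 10 word ids of a sampled document
```

This sketch only shows the forward sampling direction; in the paper, inference over the per-document mixture proceeds by the variational approximation mentioned in the abstract.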



Citations
Journal Article

Distributed Algorithms for Topic Models

TL;DR: This work describes distributed algorithms for two widely used topic models, the Latent Dirichlet Allocation (LDA) model and the Hierarchical Dirichlet Process (HDP) model, and proposes a hierarchical Bayesian extension of LDA that directly accounts for distributed data.
Proceedings Article (DOI)

Learning Gaussian processes from multiple tasks

TL;DR: This work considers the problem of multi-task learning, that is, learning multiple related functions, and presents a hierarchical Bayesian framework that exploits the equivalence between parametric linear models and nonparametric Gaussian processes.
Proceedings Article (DOI)

Incorporating domain knowledge into topic modeling via Dirichlet Forest priors

TL;DR: This work incorporates domain knowledge about which words should have high or low probability in various topics, using a novel Dirichlet Forest prior within a Latent Dirichlet Allocation framework.
Proceedings Article (DOI)

Representation Learning Using Multi-Task Deep Neural Networks for Semantic Classification and Information Retrieval

TL;DR: This work develops a multi-task DNN for learning representations across multiple tasks, not only leveraging large amounts of cross-task data, but also benefiting from a regularization effect that leads to more general representations to help tasks in new domains.
Proceedings Article (DOI)

TriRank: Review-aware Explainable Recommendation by Modeling Aspects

TL;DR: TriRank endows the recommender system with a higher degree of explainability and transparency by modeling aspects in reviews, and lets users interact with the system through their aspect preferences, helping them make informed decisions.
References
Book

Bayesian Data Analysis

TL;DR: Detailed notes are provided on Bayesian computation, the basics of Markov chain simulation, regression models, and asymptotic theory.
Journal Article (DOI)

Indexing by Latent Semantic Analysis

TL;DR: A new method for automatic indexing and retrieval is described that takes advantage of implicit higher-order structure in the association of terms with documents ("semantic structure") to improve the detection of relevant documents on the basis of terms found in queries.
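As a brief sketch of the idea behind latent semantic indexing, the following uses scikit-learn's TruncatedSVD to project a term-document matrix onto a low-rank "semantic" subspace; the toy corpus and the component count are assumptions made for illustration, not data from the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical toy corpus for illustration.
docs = [
    "human machine interface for computer applications",
    "a survey of user opinion of computer system response time",
    "relation of user perceived response time to error measurement",
    "the generation of random binary unordered trees",
    "the intersection graph of paths in trees",
]

# Build a term-document matrix, then take a rank-2 truncated SVD;
# documents (and queries, folded in the same way) are compared in
# the resulting low-dimensional latent space.
X = TfidfVectorizer().fit_transform(docs)          # shape: (n_docs, n_terms)
lsi = TruncatedSVD(n_components=2, random_state=0)
doc_vecs = lsi.fit_transform(X)                    # shape: (n_docs, 2)

print(cosine_similarity(doc_vecs[:1], doc_vecs))   # doc 0 vs. all documents
```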
Book

Introduction to Modern Information Retrieval

TL;DR: A standard textbook on automatic information retrieval, covering text analysis and indexing, retrieval models, and the evaluation of retrieval systems.
Book

Theory of probability

TL;DR: This book introduces fundamental notions of probability and covers direct probabilities, approximate methods and simplifications, significance tests for one new parameter and for various complications, and frequency definitions and direct methods.