Open Access Journal Article

Latent Dirichlet Allocation

TLDR
This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Abstract
We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
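The abstract's three levels correspond to a simple generative story, spelled out in the paper: for each document, a topic mixture θ is drawn from a Dirichlet prior, and each word is generated by first drawing a topic assignment from θ and then drawing the word from that topic's distribution over the vocabulary:

```latex
% LDA generative process for one document of N words
\theta \sim \mathrm{Dir}(\alpha), \qquad
z_n \mid \theta \sim \mathrm{Mult}(\theta), \qquad
w_n \mid z_n, \beta \sim p(w_n \mid z_n, \beta), \quad n = 1, \dots, N
```

As a minimal runnable sketch of this in practice (not the paper's own code), scikit-learn's LatentDirichletAllocation, which like the paper fits LDA by variational inference, recovers per-document topic mixtures and per-topic word weights; the toy corpus, number of topics, and parameter settings below are illustrative assumptions, not taken from the paper.

```python
# Minimal LDA sketch with scikit-learn; the corpus and settings
# below are illustrative assumptions, not from the paper.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

corpus = [
    "the cat sat on the mat",
    "dogs and cats are friendly pets",
    "stock markets fell sharply today",
    "investors traded shares on the market",
]

# Bag-of-words counts: LDA treats each document as an exchangeable
# collection of word counts.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)

# Two topics; fit_transform returns the per-document topic mixtures
# (the theta of the generative process), one row per document.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)
print(doc_topics)  # rows are topic proportions and sum to ~1

# components_ holds the (unnormalized) topic-word weights beta;
# print the top words of each topic.
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = topic.argsort()[-5:][::-1]
    print(f"topic {k}:", [terms[i] for i in top])
```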



Citations
Proceedings Article

Max-Margin DeepWalk: discriminative learning of network representation

TL;DR: MMDW is a unified network representation learning (NRL) framework that jointly optimizes a max-margin classifier and the targeted representation learning model; the learned representations are more discriminative than those of unsupervised methods, and experiments demonstrate a significant improvement over other state-of-the-art methods.
Proceedings Article

What's going on? Discovering spatio-temporal dependencies in dynamic scenes

TL;DR: This paper presents two novel methods that automatically learn the spatio-temporal dependencies of moving agents in complex dynamic scenes by employing Dependent Dirichlet Processes to learn an arbitrary number of infinite Hidden Markov Models.
Proceedings Article

Transfer Learning in Natural Language Processing

TL;DR: Transfer learning, as discussed by the authors, is a set of methods that extend the classical supervised machine learning paradigm by leveraging data from additional domains or tasks, training models with better generalization properties for NLP tasks.
Proceedings Article

Social contextual recommendation

TL;DR: This paper investigates social recommendation on the basis of psychology and sociology studies, which point to two important factors, individual preference and interpersonal influence, and proposes a novel probabilistic matrix factorization method to fuse the two in latent spaces.
Posted Content

Investigating Capsule Networks with Dynamic Routing for Text Classification

TL;DR: This work proposes three strategies to stabilize the dynamic routing process, alleviating the disturbance of noisy capsules that may contain “background” information or have not been successfully trained.
References
Book

Bayesian Data Analysis

TL;DR: Detailed notes on Bayesian computation, the basics of Markov chain simulation, regression models, and asymptotic theorems are provided.
Journal Article

Indexing by Latent Semantic Analysis

TL;DR: A new method for automatic indexing and retrieval that takes advantage of implicit higher-order structure in the association of terms with documents (“semantic structure”) to improve the detection of relevant documents on the basis of terms found in queries.
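To make the indexing idea concrete, here is a hedged sketch (a hypothetical toy corpus and query, with scikit-learn's TruncatedSVD standing in for the SVD machinery the paper describes): documents and queries are projected into a low-rank “semantic” space, so relevance can be detected through latent structure rather than exact term overlap.

```python
# Latent semantic indexing sketch via truncated SVD; the corpus,
# query, and rank are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "human machine interface for computer applications",
    "a survey of user opinion of computer system response time",
    "graph minors and tree width",
]

# Term-document matrix (tf-idf weighted here).
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)

# Rank-2 projection into the latent "semantic" space.
svd = TruncatedSVD(n_components=2, random_state=0)
doc_vecs = svd.fit_transform(X)

# Fold the query into the same space and rank documents by
# cosine similarity; matches no longer require exact term overlap.
query_vec = svd.transform(tfidf.transform(["computer response time"]))
print(cosine_similarity(query_vec, doc_vecs))
```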
Book

Introduction to Modern Information Retrieval

Book

Theory of probability

TL;DR: This book introduces the concept of direct probabilities, approximate methods and simplifications, significance tests for one new parameter and for various complications, and frequency definitions and direct methods.