Modeling individual differences using Dirichlet processes
TLDR
A Bayesian framework for modeling individual differences is introduced, in which subjects are assumed to belong to one of a potentially infinite number of groups. This allows flexible parameter distributions to be learned without overfitting the data, and without the complex computations typically required to determine the dimensionality of a model.
About
This article was published in the Journal of Mathematical Psychology on 2006-04-01 and is currently open access. It has received 147 citations to date. The article focuses on the topics: Hierarchical Dirichlet process & Latent Dirichlet allocation.
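The "potentially infinite number of groups" in the TLDR comes from placing a Dirichlet process prior on group assignments. The partition this prior induces is the Chinese restaurant process: each new subject joins an existing group with probability proportional to its size, or starts a new group with probability proportional to a concentration parameter. The following is a minimal illustrative sketch, not code from the article; the function name and the choice of `alpha` are this example's own.

```python
import random

def crp_assignments(n_subjects, alpha, seed=0):
    """Sample group assignments from a Chinese restaurant process,
    the partition induced by a Dirichlet process prior.  Subject i
    joins existing group k with probability counts[k] / (i + alpha),
    or opens a new group with probability alpha / (i + alpha)."""
    rng = random.Random(seed)
    counts = []       # counts[k] = number of subjects already in group k
    assignments = []
    for i in range(n_subjects):
        # weights: existing groups by size, plus alpha for a new group
        weights = counts + [alpha]
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if k == len(counts):
            counts.append(1)   # open a new group
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments

groups = crp_assignments(10, alpha=1.0)
```

Larger `alpha` yields more groups on average; as subjects accumulate, existing large groups attract new members, which is how the model avoids overfitting while leaving the number of groups unbounded.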
Citations
Journal Article
Bayesian data analysis.
TL;DR: A fatal flaw of null hypothesis significance testing (NHST) is reviewed, some benefits of Bayesian data analysis are introduced, and illustrative examples of multiple comparisons in Bayesian analysis of variance and of Bayesian approaches to statistical power are presented.
Parameter estimation for text analysis
TL;DR: Presents parameter estimation methods for discrete probability distributions, which are of particular interest in text modeling; central concepts such as conjugate distributions and Bayesian networks are reviewed.
Journal Article
Précis of Bayesian Rationality: The Probabilistic Approach to Human Reasoning
Mike Oaksford, Nick Chater +1 more
TL;DR: The case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems, and the wider “probabilistic turn” in cognitive science and artificial intelligence is considered.
Journal Article
A Tutorial on Bayesian Nonparametric Models
Samuel J. Gershman, David M. Blei +1 more
TL;DR: This tutorial is a high-level introduction to Bayesian nonparametric methods and contains several examples of their application.
Journal Article
Monte Carlo Methods in Bayesian Computation
TL;DR: The authors use the setting of singular perturbations, which allows them to study both weak and strong interactions among the states of the chain and to give the asymptotic behavior of many controlled stochastic dynamic systems as the perturbation parameter tends to 0.
References
Journal Article
Latent dirichlet allocation
TL;DR: This work proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model.
Proceedings Article
Latent Dirichlet Allocation
TL;DR: This paper proposes a generative model for text and other collections of discrete data that generalizes or improves on several previous models, including naive Bayes/unigram, mixture of unigrams, and Hofmann's aspect model, also known as probabilistic latent semantic indexing (pLSI).
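The generative model described in the two LDA entries above draws each topic as a distribution over words and each document as a mixture of topics. A rough sketch of that generative process (illustrative only; the function name and the symmetric Dirichlet hyperparameters `alpha` and `beta` are this example's assumptions):

```python
import random

def lda_generate(n_docs, doc_len, n_topics, vocab_size,
                 alpha=0.5, beta=0.5, seed=0):
    """Sketch of latent Dirichlet allocation's generative process:
    draw topic-word distributions, then for each document draw topic
    proportions and emit words by sampling topic, then word."""
    rng = random.Random(seed)

    def dirichlet(conc, k):
        # symmetric Dirichlet via normalized Gamma draws
        xs = [rng.gammavariate(conc, 1.0) for _ in range(k)]
        s = sum(xs)
        return [x / s for x in xs]

    def draw(probs):
        # sample an index from a discrete distribution
        r, acc = rng.random(), 0.0
        for i, p in enumerate(probs):
            acc += p
            if r < acc:
                return i
        return len(probs) - 1

    topics = [dirichlet(beta, vocab_size) for _ in range(n_topics)]
    docs = []
    for _ in range(n_docs):
        theta = dirichlet(alpha, n_topics)  # this document's topic mixture
        docs.append([draw(topics[draw(theta)]) for _ in range(doc_len)])
    return docs

corpus = lda_generate(n_docs=3, doc_len=8, n_topics=2, vocab_size=20)
```

Inference in LDA inverts this process, recovering topics and per-document mixtures from the observed words alone.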
Journal Article
Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
Stuart Geman, Donald Geman +1 more
TL;DR: An analogy between images and statistical mechanics systems is made; the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, producing a highly parallel "relaxation" algorithm for MAP estimation.
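The "stochastic relaxation" of the Geman & Geman entry is Gibbs sampling: repeatedly resample each variable from its conditional distribution given the others. A toy sketch on a target not taken from that paper, a standard bivariate normal with correlation `rho`, where both conditionals are known in closed form:

```python
import math
import random

def gibbs_bivariate_normal(n_iter, rho, seed=0):
    """Gibbs sampler for a standard bivariate normal with
    correlation rho.  Each sweep resamples one coordinate from its
    exact conditional given the other:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)"""
    rng = random.Random(seed)
    x = y = 0.0
    sd = math.sqrt(1.0 - rho ** 2)  # conditional standard deviation
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        samples.append((x, y))
    return samples

chain = gibbs_bivariate_normal(4000, rho=0.8)
```

With enough sweeps the empirical means approach 0 and the empirical correlation approaches `rho`; the same coordinate-wise scheme, applied pixel by pixel under an image posterior, is the relaxation algorithm the entry describes.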