Journal ArticleDOI

Approximate Dirichlet process computing in finite normal mixtures: Smoothing and prior information

TLDR
In this article, a nonparametric analysis of the finite normal mixture model is obtained by working with a precise truncation approximation of the Dirichlet process; model fitting is carried out by a simple Gibbs sampling algorithm that directly samples the nonparametric posterior.
Abstract
A rich nonparametric analysis of the finite normal mixture model is obtained by working with a precise truncation approximation of the Dirichlet process. Model fitting is carried out by a simple Gibbs sampling algorithm that directly samples the nonparametric posterior. The proposed sampler mixes well, requires no tuning parameters, and involves only draws from simple distributions, including the draw for the mass parameter that controls clustering and the draw for the variances with the use of a nonconjugate uniform prior. Working directly with the nonparametric prior is conceptually appealing and, among other things, leads to graphical methods for studying the posterior mixing distribution as well as penalized MLE procedures for deriving point estimates. We discuss methods for automating selection of priors for the mean and variance components to avoid over- or undersmoothing the data. We also look at the effectiveness of incorporating prior information in the form of frequentist point estimates.
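To make the truncation idea concrete, the following is a minimal sketch, assuming NumPy, of the truncated stick-breaking construction on which such approximations rest; the function name and truncation level are illustrative assumptions, not taken from the paper. The infinite Dirichlet process weights are replaced by N sticks, with the last stick forced to absorb the leftover mass so the weights sum to one.

    import numpy as np

    def truncated_stick_breaking(alpha, N, rng):
        # Draw stick variables V_k ~ Beta(1, alpha); setting V_N = 1
        # makes the N truncated weights sum exactly to one.
        v = rng.beta(1.0, alpha, size=N)
        v[-1] = 1.0
        w = np.empty(N)
        remaining = 1.0
        for k in range(N):
            w[k] = v[k] * remaining      # w_k = V_k * prod_{l<k} (1 - V_l)
            remaining *= 1.0 - v[k]
        return w

    # Example: weights of a truncated DP prior with mass parameter alpha = 1
    weights = truncated_stick_breaking(alpha=1.0, N=25, rng=np.random.default_rng(0))

In a normal mixture, each weight would be paired with a mean and variance drawn from the base measure, and the Gibbs sampler would update the weights, the atoms, and the cluster labels in turn.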



Citations
Journal ArticleDOI

Gibbs sampling methods for stick-breaking priors

TL;DR: Two general types of Gibbs samplers that can be used to fit posteriors of Bayesian hierarchical models based on stick-breaking priors are presented, including the blocked Gibbs sampler, which takes an entirely different approach and works by directly sampling values from the posterior of the random measure.
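As a pointer to what makes the blocked sampler simple, the stick-breaking weights have conjugate Beta full conditionals; in standard notation for this construction (our notation, not quoted from the summary), with n_k observations allocated to component k at truncation level N,

    V_k \mid \cdots \sim \mathrm{Beta}\Bigl(1 + n_k,\ \alpha + \sum_{l=k+1}^{N} n_l\Bigr), \qquad k = 1, \ldots, N-1,

with V_N = 1, so the whole weight vector can be redrawn in one block.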
Journal ArticleDOI

Nonparametric Bayesian Data Analysis

TL;DR: For each inference problem, relevant nonparametric Bayesian models and approaches are reviewed, including Dirichlet process models and variations, Polya trees, wavelet-based models, neural network models, spline regression, CART, dependent DP models, and model validation with DP and Polya tree extensions of parametric models.
Journal Article

Nonparametric Bayesian data analysis

TL;DR: In this paper, the current state of nonparametric Bayesian inference is reviewed, and a list of important statistical inference problems is discussed, including density estimation, regression, survival analysis, hierarchical models, and model validation.
Journal ArticleDOI

Exact and approximate sum representations for the Dirichlet process

TL;DR: The Dirichlet process can be regarded as a random probability measure, for which various sum representations are examined and an approximation is proposed in the context of Bayesian nonparametric hierarchical models.
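For reference, the exact sum representation in question is the stick-breaking series (standard notation, not quoted from the summary):

    P = \sum_{k=1}^{\infty} w_k \, \delta_{\theta_k}, \qquad w_k = V_k \prod_{l<k} (1 - V_l), \quad V_k \stackrel{iid}{\sim} \mathrm{Beta}(1, \alpha), \quad \theta_k \stackrel{iid}{\sim} H,

and approximations of the kind studied here keep only the first N terms.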
Journal ArticleDOI

The Nested Dirichlet Process

TL;DR: In this article, the problem of nonparametric modeling of center-specific distributions is addressed, borrowing information across centers while also allowing centers to be clustered, and an efficient Markov chain Monte Carlo algorithm is developed for computation.
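In the notation commonly used for this model (an assumption on our part, not quoted from the paper), the center-specific distributions G_j are drawn from a Dirichlet process whose atoms are themselves Dirichlet processes:

    G_j \mid Q \stackrel{iid}{\sim} Q, \qquad Q \sim \mathrm{DP}\bigl(\alpha \, \mathrm{DP}(\beta H)\bigr),

so two centers that draw the same atom of Q share an entire distribution and are thereby clustered.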
References
Journal ArticleDOI

Estimating the Dimension of a Model

TL;DR: In this paper, the problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution, and evaluating the leading terms of its asymptotic expansion.
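The leading terms of that expansion yield what is now called the Bayesian information criterion; in standard notation (ours, not the summary's), for a model with k parameters, maximized likelihood \hat{L}, and sample size n,

    \mathrm{BIC} = -2 \ln \hat{L} + k \ln n,

and the model minimizing this quantity is selected.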
Book

An introduction to the bootstrap

TL;DR: This book presents bootstrap methods for estimation using simple arguments, with Minitab macros for implementing the methods and examples of their use.
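A minimal sketch of the basic idea, in illustrative Python rather than the Minitab macros mentioned above: resample the data with replacement many times and recompute the statistic to estimate its standard error.

    import numpy as np

    def bootstrap_se(data, statistic, n_boot=2000, rng=None):
        # Estimate the standard error of `statistic` by resampling
        # the data with replacement n_boot times.
        rng = rng or np.random.default_rng()
        data = np.asarray(data)
        reps = np.array([statistic(rng.choice(data, size=data.size, replace=True))
                         for _ in range(n_boot)])
        return reps.std(ddof=1)

    # Example: bootstrap standard error of the sample median
    se = bootstrap_se(np.random.default_rng(0).normal(size=100), np.median)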

Proceedings Article

Information Theory and an Extension of the Maximum Likelihood Principle

H. Akaike
TL;DR: The classical maximum likelihood principle can be considered a method of asymptotic realization of an optimum estimate with respect to a very general information-theoretic criterion, which provides answers to many practical problems of statistical model fitting.
Book ChapterDOI

Information Theory and an Extension of the Maximum Likelihood Principle

TL;DR: It is shown that the classical maximum likelihood principle can be considered a method of asymptotic realization of an optimum estimate with respect to a very general information-theoretic criterion.
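The criterion both Akaike entries refer to is what is now known as AIC; in standard notation (ours), for a model with k parameters and maximized likelihood \hat{L},

    \mathrm{AIC} = -2 \ln \hat{L} + 2k,

and the model minimizing it is preferred.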