Journal ArticleDOI

Markov Chain Monte Carlo Convergence Diagnostics: A Comparative Review

TLDR
All of the methods in this work can fail to detect the sorts of convergence failure that they were designed to identify, so a combination of strategies aimed at evaluating and accelerating MCMC sampler convergence is recommended.
Abstract
A critical issue for users of Markov chain Monte Carlo (MCMC) methods in applications is how to determine when it is safe to stop sampling and use the samples to estimate characteristics of the distribution of interest. Research into methods of computing theoretical convergence bounds holds promise for the future but to date has yielded relatively little of practical use in applied work. Consequently, most MCMC users address the convergence problem by applying diagnostic tools to the output produced by running their samplers. After giving a brief overview of the area, we provide an expository review of 13 convergence diagnostics, describing the theoretical basis and practical implementation of each. We then compare their performance in two simple models and conclude that all of the methods can fail to detect the sorts of convergence failure that they were designed to identify. We thus recommend a combination of strategies aimed at evaluating and accelerating MCMC sampler convergence, including ap...
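
By way of illustration (not taken from the review itself), one of the simplest output-based checks alluded to above is monitoring the autocorrelation of a chain. A minimal Python sketch, with a made-up AR(1)-style chain standing in for real sampler output, might look like this:

```python
import numpy as np

def autocorrelation(chain, max_lag=50):
    """Sample autocorrelation of a 1-D MCMC chain at lags 0..max_lag."""
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.dot(x, x) / n
    acf = np.empty(max_lag + 1)
    for k in range(max_lag + 1):
        acf[k] = np.dot(x[: n - k], x[k:]) / (n * var)
    return acf

# Hypothetical usage: a strongly autocorrelated AR(1)-like chain
rng = np.random.default_rng(0)
chain = np.zeros(5000)
for t in range(1, len(chain)):
    chain[t] = 0.95 * chain[t - 1] + rng.normal()
print(autocorrelation(chain, max_lag=5))   # slowly decaying values signal slow mixing
```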


Citations
Journal ArticleDOI

Bayesian dynamic models for survival data with a cure fraction.

TL;DR: Dynamic models for piecewise hazard functions over a finite partition of the time axis provide great flexibility in controlling the degree of parametricity in the right tail of the survival distribution and the amount of correlation among the log-baseline hazard levels.
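
As a rough sketch of the modelling idea (not the authors' implementation), a piecewise-constant hazard over a finite partition of the time axis can be combined with a cure fraction pi so that the population survival function is S(t) = pi + (1 - pi) * exp(-H(t)). The partition, hazard levels, and cure probability below are invented for illustration:

```python
import numpy as np

def cumulative_hazard(t, cuts, levels):
    """Cumulative hazard H(t) for a piecewise-constant hazard.

    cuts   : interval endpoints 0 = a_0 < a_1 < ... < a_J (last may be inf)
    levels : hazard level lambda_j on (a_{j-1}, a_j], one per interval
    """
    H = 0.0
    for j in range(len(levels)):
        lo, hi = cuts[j], cuts[j + 1]
        H += levels[j] * max(0.0, min(t, hi) - lo)
        if t <= hi:
            break
    return H

def population_survival(t, cuts, levels, cure_prob):
    """Survival with a cure fraction: S(t) = pi + (1 - pi) * exp(-H(t))."""
    return cure_prob + (1.0 - cure_prob) * np.exp(-cumulative_hazard(t, cuts, levels))

# Hypothetical partition, hazard levels, and cure probability
cuts = [0.0, 1.0, 3.0, np.inf]
levels = [0.2, 0.5, 0.1]
print(population_survival(2.0, cuts, levels, cure_prob=0.3))
```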
Journal ArticleDOI

Laplace Importance Sampling for Generalized Linear Mixed Models

TL;DR: In this article, an importance sampling method is proposed in which the importance function is chosen with the aid of a Laplace expansion; it is shown that the standard Laplace approximation may be inaccurate when the dimension of the integral increases with the sample size, and that the resulting parameter estimates, especially those of the variance components, are biased towards zero.
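
A minimal one-dimensional sketch of the general idea, assuming a Gaussian importance function centred at the mode of the log-integrand with variance taken from its curvature; the integrand and function names here are hypothetical, and the paper itself targets the high-dimensional integrals arising in generalized linear mixed models:

```python
import numpy as np

def laplace_importance_sample(log_f, x0, n=10000, eps=1e-4, rng=None):
    """Estimate the integral of exp(log_f) via importance sampling with a
    Gaussian ('Laplace') proposal centred at the mode of log_f (1-D sketch)."""
    rng = rng or np.random.default_rng()
    # Crude finite-difference Newton iterations to locate the mode
    x = x0
    for _ in range(100):
        g = (log_f(x + eps) - log_f(x - eps)) / (2 * eps)
        h = (log_f(x + eps) - 2 * log_f(x) + log_f(x - eps)) / eps**2
        x = x - g / h
    sigma2 = -1.0 / h                        # proposal variance from curvature
    xs = x + np.sqrt(sigma2) * rng.normal(size=n)
    log_q = -0.5 * np.log(2 * np.pi * sigma2) - (xs - x) ** 2 / (2 * sigma2)
    log_w = np.array([log_f(xi) for xi in xs]) - log_q
    return np.exp(log_w).mean()              # estimate of the integral

# Hypothetical light-tailed unnormalised integrand
log_f = lambda x: -x**4 / 4 - x**2 / 2
print(laplace_importance_sample(log_f, x0=0.5))
```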
Journal ArticleDOI

What are the advantages of MCMC based inference in latent variable models

TL;DR: This paper illustrates the computational advantages of Bayesian estimation using MCMC in several popular latent variable models and finds that, in these settings, Bayesian parameter estimation can be faster than classical maximum likelihood estimation.
Journal ArticleDOI

Acoustic emission Bayesian source location: Onset time challenge

TL;DR: An inverse source-location problem in a concrete block is considered to address the onset-time challenge, and an innovative approach is proposed for selecting the most probable onset time from those obtained with two automatic picker methods.
Posted Content

Optimal Thinning of MCMC Output

TL;DR: A novel method based on greedy minimisation of a kernel Stein discrepancy is proposed that is suitable for problems where heavy compression is required; its effectiveness is demonstrated in the challenging context of parameter inference for ordinary differential equations.
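
A simplified sketch of greedy kernel-Stein-discrepancy thinning, assuming an inverse multiquadric base kernel and a toy Gaussian target whose score (gradient of the log density) is known in closed form. This illustrates the idea rather than reproducing the authors' reference implementation; in particular, the sketch forbids re-selecting an index, and all names below are hypothetical:

```python
import numpy as np

def imq_stein_kernel(x, y, sx, sy):
    """Langevin Stein kernel k_p(x, y) built from the IMQ base kernel
    k(x, y) = (1 + ||x - y||^2)^(-1/2); sx, sy are score vectors at x and y."""
    r = x - y
    u = 1.0 + r @ r
    d = x.size
    return (d / u**1.5
            - 3.0 * (r @ r) / u**2.5
            + (r @ (sx - sy)) / u**1.5
            + (sx @ sy) / np.sqrt(u))

def stein_thinning(samples, scores, m):
    """Greedily select m indices that (approximately) minimise the kernel
    Stein discrepancy of the selected subset."""
    n = len(samples)
    diag = np.array([imq_stein_kernel(samples[i], samples[i], scores[i], scores[i])
                     for i in range(n)])
    running = np.zeros(n)          # sum of k_p(x_sel, x_j) over selected points
    selected = []
    for _ in range(m):
        obj = diag + 2.0 * running
        obj[selected] = np.inf     # this sketch avoids duplicate selections
        j = int(np.argmin(obj))
        selected.append(j)
        running += np.array([imq_stein_kernel(samples[j], samples[k],
                                              scores[j], scores[k])
                             for k in range(n)])
    return selected

# Hypothetical target: standard bivariate normal, score s(x) = -x
rng = np.random.default_rng(1)
samples = rng.normal(size=(500, 2))        # pretend these are MCMC draws
scores = -samples
print(stein_thinning(samples, scores, m=10))
```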
References
Journal ArticleDOI

Equation of state calculations by fast computing machines

TL;DR: In this article, a modified Monte Carlo integration over configuration space is used to investigate the properties of a two-dimensional rigid-sphere system of interacting molecules, and the results are compared with the free-volume equation of state and a four-term virial coefficient expansion.
Journal ArticleDOI

Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

TL;DR: An analogy is drawn between images and statistical mechanics systems; the analogous operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, leading to a highly parallel "relaxation" algorithm for MAP estimation.
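
Stochastic relaxation is better known today as Gibbs sampling: each variable is updated in turn from its full conditional distribution. A toy sketch for a correlated bivariate normal target (a stand-in for the image models in the paper, with made-up parameter values) is:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, rng=None):
    """Stochastic relaxation (Gibbs sampling) for a toy target: a bivariate
    standard normal with correlation rho, updating one coordinate at a time
    from its full conditional."""
    rng = rng or np.random.default_rng()
    x = y = 0.0
    draws = np.empty((n_iter, 2))
    cond_sd = np.sqrt(1.0 - rho**2)
    for t in range(n_iter):
        x = rng.normal(rho * y, cond_sd)   # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, cond_sd)   # y | x ~ N(rho*x, 1 - rho^2)
        draws[t] = (x, y)
    return draws

draws = gibbs_bivariate_normal(rho=0.9)
print(draws.mean(axis=0), np.corrcoef(draws.T)[0, 1])
```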
Journal ArticleDOI

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

TL;DR: A generalization of the sampling method introduced by Metropolis et al. is presented, along with an exposition of the relevant theory, techniques of application, and methods and difficulties of assessing the error in Monte Carlo estimates.
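
A minimal sketch of the symmetric-proposal special case (random-walk Metropolis), with a made-up one-dimensional Gaussian target; the Hastings generalization additionally corrects the acceptance ratio for asymmetric proposals:

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_iter=10000, step=1.0, rng=None):
    """Random-walk Metropolis sampler: propose x' = x + step * N(0, 1) and
    accept with probability min(1, p(x') / p(x))."""
    rng = rng or np.random.default_rng()
    x = x0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        chain[t] = x
    return chain

# Hypothetical target: unnormalised N(3, 2^2) density on the log scale
chain = random_walk_metropolis(lambda x: -0.5 * ((x - 3.0) / 2.0) ** 2, x0=0.0)
print(chain.mean(), chain.std())
```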
Journal ArticleDOI

Inference from Iterative Simulation Using Multiple Sequences

TL;DR: The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
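
The best-known product of this line of work is the potential scale reduction factor (R-hat) computed from several independent chains. A minimal sketch, omitting the degrees-of-freedom correction in the original paper and using simulated stand-in chains, is:

```python
import numpy as np

def potential_scale_reduction(chains):
    """Gelman-Rubin potential scale reduction factor (R-hat) for parallel
    chains; `chains` has shape (m, n): m chains, n draws each."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()          # within-chain variance
    B = n * chain_means.var(ddof=1)                # between-chain variance
    var_plus = (n - 1) / n * W + B / n             # pooled variance estimate
    return np.sqrt(var_plus / W)

# Hypothetical check: 4 well-mixed chains targeting the same distribution
rng = np.random.default_rng(2)
chains = rng.normal(size=(4, 2000))
print(potential_scale_reduction(chains))           # close to 1 when converged
```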
Journal ArticleDOI

Robust Locally Weighted Regression and Smoothing Scatterplots

TL;DR: Robust locally weighted regression is a method for smoothing a scatterplot in which the fitted value at x_k is the value of a polynomial fit to the data using weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not.
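
A single non-robust pass of this idea, i.e. a locally weighted linear fit at one point x0 with tricube weights (the full method iterates, down-weighting points with large residuals), might be sketched as follows; the data here are invented:

```python
import numpy as np

def local_linear_fit(x, y, x0, span=0.3):
    """Fitted value at x0 from a locally weighted linear least-squares fit,
    with tricube weights on the nearest span * n points."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    q = max(2, int(np.ceil(span * n)))
    dist = np.abs(x - x0)
    h = np.sort(dist)[q - 1]                              # q-th nearest distance
    w = np.clip(1.0 - (dist / h) ** 3, 0.0, None) ** 3    # tricube weights
    X = np.column_stack([np.ones(n), x - x0])
    WX = X * w[:, None]
    beta = np.linalg.lstsq(WX.T @ X, WX.T @ y, rcond=None)[0]
    return beta[0]                                         # intercept = fit at x0

# Hypothetical noisy scatterplot
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(scale=0.3, size=200)
print(local_linear_fit(x, y, x0=5.0))                      # should be near sin(5)
```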