Open Access · Posted Content

Hamiltonian Monte Carlo with Energy Conserving Subsampling

TLDR
This article shows that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration.
Abstract
Hamiltonian Monte Carlo (HMC) samples efficiently from high-dimensional posterior distributions, with proposed parameter draws obtained by iterating on a discretized version of the Hamiltonian dynamics. The iterations make HMC computationally costly, especially in problems with large datasets, since it is necessary to compute posterior densities and their derivatives with respect to the parameters. Naively computing the Hamiltonian dynamics on a subset of the data causes HMC to lose its key ability to generate distant parameter proposals with high acceptance probability. The key insight in our article is that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration. We show that this can be done in a principled way in an HMC-within-Gibbs framework where the subsample is updated using a pseudo-marginal Metropolis-Hastings (MH) step and the parameters are then updated using an HMC step, based on the current subsample. We show that our subsampling methods are fast and compare favorably to two popular sampling algorithms that utilize gradient estimates from data subsampling. We also explore the current limitations of subsampling HMC algorithms by varying the quality of the variance-reducing control variates used in the estimators of the posterior density and its gradients.
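The HMC-within-Gibbs scheme in the abstract can be sketched on a toy Gaussian-mean posterior. This is not the authors' code: all names, tuning values, and the plain scaled-sum log-posterior estimator are illustrative, and the variance-reducing control variates the abstract emphasizes are omitted for brevity. The two Gibbs steps are (1) a pseudo-marginal MH refresh of the subsample indices with the parameter held fixed, and (2) an HMC update of the parameter in which both the leapfrog dynamics and the acceptance probability use that same subsample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: posterior of a Gaussian mean with known unit variance.
data = rng.normal(1.0, 1.0, size=1000)
n, m = len(data), 100              # full data size, subsample size
prior_var = 10.0

def log_post_est(theta, idx):
    # Plain scaled-sum estimate of the log posterior from subsample idx.
    # (The paper uses variance-reducing control variates; omitted here.)
    return (n / m) * np.sum(-0.5 * (data[idx] - theta) ** 2) \
        - 0.5 * theta ** 2 / prior_var

def grad_est(theta, idx):
    return (n / m) * np.sum(data[idx] - theta) - theta / prior_var

def leapfrog(theta, p, idx, eps=0.01, steps=20):
    # Standard leapfrog integrator for the estimated Hamiltonian dynamics.
    p += 0.5 * eps * grad_est(theta, idx)
    for _ in range(steps - 1):
        theta += eps * p
        p += eps * grad_est(theta, idx)
    theta += eps * p
    p += 0.5 * eps * grad_est(theta, idx)
    return theta, p

theta = 0.0
idx = rng.choice(n, m, replace=False)
draws = []
for _ in range(5000):
    # Gibbs step 1: pseudo-marginal MH refresh of the subsample, theta fixed.
    idx_new = rng.choice(n, m, replace=False)
    if np.log(rng.uniform()) < log_post_est(theta, idx_new) - log_post_est(theta, idx):
        idx = idx_new
    # Gibbs step 2: HMC update of theta; the dynamics AND the acceptance
    # probability use the same subsample, so the estimated energy is conserved.
    p0 = rng.normal()
    theta_prop, p_prop = leapfrog(theta, p0, idx,
                                  steps=int(rng.integers(10, 31)))
    log_acc = (log_post_est(theta_prop, idx) - 0.5 * p_prop ** 2) \
        - (log_post_est(theta, idx) - 0.5 * p0 ** 2)
    if np.log(rng.uniform()) < log_acc:
        theta = theta_prop
    draws.append(theta)
```

Without control variates the log-posterior estimate is very noisy, so the subsample chain mixes slowly and the draws only loosely track the true posterior around the data mean; this is exactly the limitation the article probes by varying control-variate quality.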


Citations
Journal Article

Riemann manifold Langevin and Hamiltonian Monte Carlo methods

TL;DR: The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
Journal ArticleDOI

Speeding Up MCMC by Efficient Data Subsampling

TL;DR: Subsampling Markov chain Monte Carlo is substantially more efficient than standard MCMC in terms of sampling efficiency for a given computational budget, and that it outperforms other subsampling methods for MCMC proposed in the literature.
Journal ArticleDOI

Directed Technical Change as a Response to Natural Resource Scarcity

TL;DR: The authors developed a quantitative macroeconomic theory of input-saving technical change to analyze how markets economize on scarce natural resources, with an application to fossil fuel, and found that aggreg...
Journal ArticleDOI

Subsampling MCMC - an Introduction for the Survey Statistician

TL;DR: In this paper, the authors introduce subsampling MCMC to survey statisticians without previous knowledge of MCMC methods and motivate survey sampling experts to contribute to the growing subsampling MCMC literature.
Journal ArticleDOI

Trading volume and liquidity provision in cryptocurrency markets

TL;DR: In this article, the authors provide empirical evidence that the returns from liquidity provision, proxied by the returns of a short-term reversal strategy, are primarily concentrated in trading pairs with lower levels of market activity.
References
Book

Elements of information theory

TL;DR: The author examines the role of entropy, inequalities, and randomness in the design and construction of codes.
Journal ArticleDOI

Equation of state calculations by fast computing machines

TL;DR: In this article, a modified Monte Carlo integration over configuration space is used to investigate the properties of a two-dimensional rigid-sphere system with a set of interacting individual molecules, and the results are compared to free volume equations of state and a four-term virial coefficient expansion.
Journal ArticleDOI

Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

TL;DR: An analogy between images and statistical mechanics systems is drawn; the analogous annealing operation under the posterior distribution yields the maximum a posteriori (MAP) estimate of the image given the degraded observations, producing a highly parallel "relaxation" algorithm for MAP estimation.
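The stochastic relaxation idea behind this reference is Gibbs sampling: each variable is resampled from its full conditional given the rest. A minimal illustrative sketch (a toy bivariate normal, not the paper's image-restoration setting) where each full conditional is x | y ~ N(rho*y, 1 - rho^2):

```python
import numpy as np

rng = np.random.default_rng(3)

# Gibbs sampler for a bivariate standard normal with correlation rho.
rho = 0.8
x = y = 0.0
xs, ys = [], []
for _ in range(20000):
    # Resample each coordinate from its full conditional given the other.
    x = rng.normal(rho * y, np.sqrt(1 - rho ** 2))
    y = rng.normal(rho * x, np.sqrt(1 - rho ** 2))
    xs.append(x)
    ys.append(y)
```

The empirical correlation of the draws recovers rho; the same conditional-resampling pattern, applied pixel-by-pixel under a Gibbs distribution, is the relaxation algorithm of the paper.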
Journal ArticleDOI

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

TL;DR: A generalization of the sampling method introduced by Metropolis et al. is presented, along with an exposition of the relevant theory, techniques of application, and the methods and difficulties of assessing the error in Monte Carlo estimates.
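The generalization summarized above (the Metropolis-Hastings algorithm) allows asymmetric proposals by weighting the acceptance ratio with q(x|y)/q(y|x). An illustrative sketch (toy target and tuning values are my own choices): sampling an Exp(1) target with a multiplicative log-normal proposal, where the Hastings correction reduces to y/x because the Gaussian kernel in log-space is symmetric and only the log-normal Jacobian terms survive.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalized Exp(1) target on x > 0.
def log_target(x):
    return -x

sigma = 0.5                        # proposal scale in log-space
x = 1.0
draws = []
for _ in range(30000):
    # Asymmetric proposal: y = x * exp(N(0, sigma^2)).
    y = x * np.exp(sigma * rng.normal())
    # Hastings ratio: log pi(y) - log pi(x) + log q(x|y) - log q(y|x),
    # which here simplifies to (x - y) + log(y / x).
    log_acc = log_target(y) - log_target(x) + np.log(y / x)
    if np.log(rng.uniform()) < log_acc:
        x = y
    draws.append(x)
```

With a symmetric proposal the correction term vanishes and the scheme reduces to the original Metropolis rule; the pseudo-marginal MH step in the main article is an instance of this same acceptance mechanism applied to an estimated posterior.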
Journal ArticleDOI

A Stochastic Approximation Method

TL;DR: In this article, a method for making successive experiments at levels x1, x2, ··· in such a way that xn will tend to θ in probability is presented.
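The Robbins-Monro procedure summarized above finds a root theta* of M(theta) = E[Y(theta)] from noisy measurements alone, using the recursion x_{n+1} = x_n - a_n * Y(x_n) with step sizes a_n = 1/n. A minimal sketch with an illustrative linear M(theta) = theta - 1 (so theta* = 1):

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy measurement of M(theta) = theta - 1; only these noisy values are
# available to the procedure, never M itself.
def noisy_measurement(theta):
    return (theta - 1.0) + rng.normal()

theta = 5.0
for n in range(1, 10001):
    theta -= (1.0 / n) * noisy_measurement(theta)   # step sizes a_n = 1/n
```

The step sizes satisfy the classic conditions (sum a_n diverges, sum a_n^2 converges), so theta tends to 1 in probability; recursions of this type underlie the stochastic-gradient estimators that subsampling MCMC methods build on.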