Open Access · Proceedings Article
Continuous Relaxations for Discrete Hamiltonian Monte Carlo
Yichuan Zhang, Zoubin Ghahramani, Amos Storkey, Charles Sutton
Advances in Neural Information Processing Systems, Vol. 25, pp. 3194–3202
TLDR
It is shown that a general form of the Gaussian Integral Trick makes it possible to transform a wide class of discrete-variable undirected models into fully continuous systems, which opens up a number of new avenues for inference in difficult discrete systems.

Abstract
Continuous relaxations play an important role in discrete optimization, but have not seen much use in approximate probabilistic inference. Here we show that a general form of the Gaussian Integral Trick makes it possible to transform a wide class of discrete-variable undirected models into fully continuous systems. The continuous representation allows the use of gradient-based Hamiltonian Monte Carlo for inference, results in new ways of estimating normalization constants (partition functions), and in general opens up a number of new avenues for inference in difficult discrete systems. We demonstrate some of these continuous relaxation inference algorithms on a number of illustrative problems.
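A minimal NumPy sketch of the general idea, in our own notation rather than the paper's exact formulation: for a binary model p(s) ∝ exp(aᵀs + ½ sᵀW s) with s ∈ {0,1}ⁿ, augmenting with a Gaussian x using A = W + D (D diagonal, chosen so A is positive definite) and summing out s gives a continuous marginal p(x) ∝ N(x; 0, A) ∏ᵢ(1 + exp(aᵢ − dᵢ/2 + xᵢ)), which a standard leapfrog HMC sampler can target. The diagonal choice, step sizes, and estimator details here are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def relaxed_logpdf_and_grad(x, A_inv, a, d):
    """Unnormalized log-density and gradient of the continuous relaxation
    p(x) ∝ N(x; 0, A) * prod_i (1 + exp(a_i - d_i/2 + x_i)),
    obtained by Gaussian augmentation of p(s) ∝ exp(a^T s + 0.5 s^T W s)
    with A = W + D positive definite (illustrative notation)."""
    t = a - d / 2.0 + x
    logp = -0.5 * x @ A_inv @ x + np.sum(np.logaddexp(0.0, t))  # log-sum-exp softplus
    grad = -A_inv @ x + 1.0 / (1.0 + np.exp(-t))                # sigmoid(t)
    return logp, grad

def hmc_step(x, A_inv, a, d, eps=0.1, n_leap=10, rng=np.random):
    """One HMC transition: momentum resample, leapfrog, Metropolis correction."""
    p = rng.standard_normal(x.size)
    logp0, grad = relaxed_logpdf_and_grad(x, A_inv, a, d)
    h0 = -logp0 + 0.5 * p @ p               # initial Hamiltonian
    x_new, p_new = x.copy(), p + 0.5 * eps * grad  # first half-step for momentum
    for i in range(n_leap):
        x_new = x_new + eps * p_new
        logp, grad = relaxed_logpdf_and_grad(x_new, A_inv, a, d)
        # full momentum steps in the interior, half-step at the end
        p_new = p_new + (eps if i < n_leap - 1 else 0.5 * eps) * grad
    h1 = -logp + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < h0 - h1:     # Metropolis accept/reject
        return x_new, True
    return x, False
```

Given a sample x, the conditional over the original binary variables factorizes, so sᵢ can be drawn independently as Bernoulli(sigmoid(aᵢ − dᵢ/2 + xᵢ)).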
Citations
Journal Article
Neural Network Renormalization Group
Shuo-Hui Li, Lei Wang
TL;DR: In this paper, a variational renormalization group (RG) approach based on a reversible generative model with hierarchical architecture is proposed, which performs hierarchical change-of-variables transformations from the physical space to a latent space with reduced mutual information.
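The reversible, exact change-of-variables machinery behind such generative models can be illustrated with an affine coupling layer (real NVP style); this is a generic sketch, not the cited paper's architecture, and `shift_scale` stands in for the neural networks that would be learned in practice:

```python
import numpy as np

def coupling_forward(x, shift_scale):
    """Affine coupling layer: transform the second half of x conditioned on
    the first half. The Jacobian is triangular, so log|det J| is just the
    sum of the log-scales -- change of variables stays tractable."""
    x1, x2 = x[: x.size // 2], x[x.size // 2 :]
    s, t = shift_scale(x1)                 # neural networks in practice
    y2 = x2 * np.exp(s) + t
    return np.concatenate([x1, y2]), np.sum(s)   # (y, log|det J|)

def coupling_inverse(y, shift_scale):
    """Exact inverse: recompute s, t from the untouched half and undo."""
    y1, y2 = y[: y.size // 2], y[y.size // 2 :]
    s, t = shift_scale(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])
```

Stacking such layers (alternating which half is transformed) yields an invertible map between physical and latent space with a cheap exact log-density.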
Journal Article
On Russian Roulette Estimates for Bayesian Inference with Doubly-Intractable Likelihoods
TL;DR: A family of Markov chain Monte Carlo (MCMC) schemes based on Russian Roulette estimates is proposed for doubly-intractable posterior distributions, applicable to all classes of models with doubly-intractable posteriors.
Proceedings Article
Generalizing Hamiltonian Monte Carlo with Neural Networks
TL;DR: Presents a general-purpose method to train Markov chain Monte Carlo kernels, parameterized by deep neural networks, that converge to and mix quickly in their target distribution.
Journal Article
Informed proposals for local MCMC in discrete spaces
TL;DR: Addresses the lack of methodological results for designing efficient Markov chain Monte Carlo (MCMC) algorithms for statistical models with discrete-valued high-dimensional parameters by introducing informed proposals for local moves in discrete spaces.
References
Journal Article
Reducing the Dimensionality of Data with Neural Networks
TL;DR: Describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool to reduce the dimensionality of data.
Book
Introduction To The Theory Of Neural Computation
TL;DR: This book is a detailed, logically-developed treatment that covers the theory and uses of collective computational networks, including associative memory, feed forward networks, and unsupervised learning.
Book
Bayesian learning for neural networks
TL;DR: Bayesian Learning for Neural Networks shows that Bayesian methods allow complex neural network models to be used without fear of the "overfitting" that can occur with traditional neural network learning methods.
Proceedings Article
Incorporating Non-local Information into Information Extraction Systems by Gibbs Sampling
TL;DR: By using simulated annealing in place of Viterbi decoding in sequence models such as HMMs, CMMs, and CRFs, it is possible to incorporate non-local structure while preserving tractable inference.
Journal Article
Time-Dependent Statistics of the Ising Model
TL;DR: In this paper, the effect of a uniform, time-varying magnetic field upon the Ising model is discussed, and the frequency-dependent magnetic susceptibility is found in the weak-field limit.