Open Access Journal Article (DOI)

Fixed-form variational posterior approximation through stochastic linear regression

Tim Salimans, +1 more
01 Dec 2013 · Vol. 8, Iss. 4, pp. 837-882
TLDR
A general algorithm is proposed for approximating nonstandard Bayesian posterior distributions by minimizing the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution.
Abstract
We propose a general algorithm for approximating nonstandard Bayesian posterior distributions. The algorithm minimizes the Kullback-Leibler divergence of an approximating distribution to the intractable posterior distribution. Our method can be used to approximate any posterior distribution, provided that it is given in closed form up to the proportionality constant. The approximation can be any distribution in the exponential family or any mixture of such distributions, which means that it can be made arbitrarily precise. Several examples illustrate the speed and accuracy of our approximation method in practice.
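The core idea in the abstract can be illustrated with a minimal sketch. At the method's fixed point, the natural parameters of the exponential-family approximation solve a linear regression of the unnormalized log posterior onto the approximation's sufficient statistics. The sketch below assumes a one-dimensional Gaussian approximation and a deliberately simple (exactly Gaussian) target so the result can be checked by hand; it is an illustration of the regression idea, not the paper's full stochastic scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Unnormalized log posterior. Chosen here as an exact Gaussian N(2, 4)
    # purely so the fixed point is easy to verify; any closed-form
    # unnormalized log density could be used instead.
    return -0.5 * (x - 2.0) ** 2 / 4.0

# Start from a poor Gaussian approximation q = N(mu, var).
mu, var = 0.0, 1.0
for _ in range(5):
    # Sample from the current approximation q.
    x = rng.normal(mu, np.sqrt(var), size=2000)
    # Regress log p(x) onto q's sufficient statistics (1, x, x^2);
    # the slope coefficients estimate q's natural parameters.
    design = np.column_stack([np.ones_like(x), x, x * x])
    c, eta1, eta2 = np.linalg.lstsq(design, log_p(x), rcond=None)[0]
    # Map natural parameters back to (mu, var):
    # eta2 = -1 / (2 var), eta1 = mu / var.
    var = -0.5 / eta2
    mu = eta1 * var

print(round(mu, 3), round(var, 3))  # prints 2.0 4.0 (exact: target is quadratic)
```

Because this toy target is exactly quadratic in x, the regression recovers the target's mean and variance in one step; for genuinely non-Gaussian targets the iteration and stochastic averaging do the work.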


Citations
Posted Content

Structured Black Box Variational Inference for Latent Time Series Models

TL;DR: A BBVI algorithm analogous to the forward-backward algorithm, which instead scales linearly in time, is described; it allows efficient sampling from the variational distribution and estimation of the gradients of the ELBO.
Proceedings ArticleDOI

BeamSeg: A Joint Model for Multi-Document Segmentation and Topic Identification

TL;DR: The model implements lexical cohesion in an unsupervised Bayesian setting by drawing segments with the same topic from the same language model, using a dynamic Dirichlet prior that takes into account data contributions from other topics.
Journal ArticleDOI

Variational Bayes Estimation of Discrete-Margined Copula Models With Application to Time Series

TL;DR: A new variational Bayes (VB) estimator for high-dimensional copulas with discrete, or a combination of discrete and continuous, margins is proposed, based on a variational approximation to a tractable augmented posterior and is faster than previous likelihood-based approaches.
DissertationDOI

Approximate Inference: New Visions

Yingzhen Li
TL;DR: Two new approximate inference algorithms are developed, which provide a unifying view of existing variational methods from different algorithmic perspectives and lead to better calibrated inference results for complex models such as neural network classifiers and deep generative models, and scale to large datasets containing hundreds of thousands of data-points.
Posted Content

Diversifying Sparsity Using Variational Determinantal Point Processes.

TL;DR: A novel diverse feature selection method based on determinantal point processes (DPPs) that enables one to flexibly define diversity based on the covariance of features (similar to orthogonal matching pursuit) or alternatively based on side information.
References
Book

Pattern Recognition and Machine Learning

TL;DR: Probability distributions, linear models for regression, linear models for classification, neural networks, graphical models, mixture models and EM, sampling methods, continuous latent variables, and sequential data are studied.
Journal ArticleDOI

A Stochastic Approximation Method

TL;DR: In this article, a method is presented for making successive experiments at levels x_1, x_2, ... in such a way that x_n will tend to θ in probability.
Book ChapterDOI

Large-Scale Machine Learning with Stochastic Gradient Descent

Léon Bottou
TL;DR: A more precise analysis uncovers qualitatively different tradeoffs for the case of small-scale and large-scale learning problems.
Book

Graphical Models, Exponential Families, and Variational Inference

TL;DR: The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.