Open Access Proceedings Article

Expectation propagation for approximate Bayesian inference

TLDR
Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network, which makes it applicable to hybrid networks with discrete and continuous nodes.
Abstract
This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation," unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
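To make the iteration concrete, here is a minimal sketch of EP on the clutter problem from the paper (estimating a Gaussian mean when a fraction w of observations come from a broad outlier distribution). The constants, moment formulas, and safeguards below are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

def gauss_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def ep_clutter(y, w=0.2, clutter_var=10.0, prior_var=100.0, iters=50):
    """EP for the clutter problem: y_i ~ (1-w) N(theta, 1) + w N(0, clutter_var),
    prior theta ~ N(0, prior_var). Each likelihood term is approximated by an
    unnormalized Gaussian stored in natural parameters (precision, precision*mean)."""
    n = len(y)
    t_prec = np.zeros(n)       # term precisions (EP allows these to go negative)
    t_pm = np.zeros(n)         # term precision-times-mean
    q_prec = 1.0 / prior_var   # posterior approximation, natural parameters
    q_pm = 0.0
    for _ in range(iters):
        for i in range(n):
            # 1. Remove term i from q -> cavity distribution.
            c_prec = q_prec - t_prec[i]
            c_pm = q_pm - t_pm[i]
            if c_prec <= 0:    # skip the update if the cavity is improper
                continue
            c_var, c_mean = 1.0 / c_prec, c_pm / c_prec
            # 2. Moments of cavity * exact term (a two-component mixture in theta).
            z1 = (1 - w) * gauss_pdf(y[i], c_mean, c_var + 1.0)   # inlier part
            z2 = w * gauss_pdf(y[i], 0.0, clutter_var)            # outlier part
            v1 = 1.0 / (c_prec + 1.0)
            m1 = v1 * (c_pm + y[i])
            z = z1 + z2
            mean = (z1 * m1 + z2 * c_mean) / z
            second = (z1 * (v1 + m1 ** 2) + z2 * (c_var + c_mean ** 2)) / z
            var = second - mean ** 2
            if var <= 0:
                continue
            # 3. Refit term i so that cavity * term matches the new moments.
            q_prec, q_pm = 1.0 / var, mean / var
            t_prec[i] = q_prec - c_prec
            t_pm[i] = q_pm - c_pm
    return q_pm / q_prec, 1.0 / q_prec   # approximate posterior mean and variance

rng = np.random.default_rng(0)
theta = 2.0
y = np.where(rng.random(50) < 0.2,
             rng.normal(0.0, np.sqrt(10.0), 50),   # clutter
             rng.normal(theta, 1.0, 50))           # signal
print(ep_clutter(y))
```

Each pass removes one approximate term to form a cavity distribution, matches the moments of cavity times exact term, and refits the term; storing terms in natural parameters keeps the division and multiplication steps to simple subtractions and additions.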


Citations
Journal Article (DOI)

The future of employment: How susceptible are jobs to computerisation?

TL;DR: In this paper, a Gaussian process classifier was used to estimate the probability of computerisation for 702 detailed occupations and to assess the expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation's probability of computerisation, its wages, and its educational attainment.
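The occupation-level data is not reproduced here, but the modelling step is easy to sketch; the following uses scikit-learn's Gaussian process classifier on made-up features as a stand-in (the data, kernel choice, and labels are all assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in for the paper's data: each row is an occupation's
# feature vector; each label marks it as automatable (1) or not (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(70, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=70) > 0).astype(int)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gpc.fit(X, y)

# Probabilistic predictions, analogous to a "probability of computerisation".
print(gpc.predict_proba(X[:5])[:, 1])
```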
Book

Graphical Models, Exponential Families, and Variational Inference

TL;DR: The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
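In symbols, the variational approach turns inference into optimization: for a model p(x, z), the log evidence decomposes into a tractable lower bound (the ELBO) plus a KL term, so maximizing the ELBO over a tractable family q(z) drives q toward the posterior:

```latex
\log p(x) \;=\; \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x,z)}{q(z)}\right]}_{\mathrm{ELBO}(q)} \;+\; \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big)
```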
Journal Article (DOI)

Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations

TL;DR: This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, in which the latent field is Gaussian and controlled by a few hyperparameters while the response variables are non-Gaussian; the proposed approach can directly compute very accurate approximations to the posterior marginals.
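The full nested construction is beyond a short sketch, but its core ingredient, a Gaussian (Laplace) approximation centred at the posterior mode, is easy to illustrate; the toy Poisson-count model and the use of SciPy below are assumptions for illustration, not the paper's method:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy model: y_i ~ Poisson(exp(theta)), prior theta ~ N(0, 1).
rng = np.random.default_rng(1)
y = rng.poisson(lam=np.exp(0.7), size=40)

def neg_log_post(theta):
    # Unnormalized negative log posterior: Poisson likelihood plus Gaussian prior.
    return -(np.sum(y * theta - np.exp(theta)) - 0.5 * theta ** 2)

# Laplace approximation: a Gaussian centred at the mode, with variance equal to
# the inverse curvature of the negative log posterior there.
mode = minimize_scalar(neg_log_post).x
h = 1e-5
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode) + neg_log_post(mode - h)) / h ** 2
print("Laplace approx:", mode, 1.0 / curv)   # approximate posterior mean, variance
```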
Journal Article (DOI)

Variational Inference: A Review for Statisticians

TL;DR: Variational inference, most commonly in its mean-field form, approximates probability densities through optimization; it is used in many applications and tends to be faster than classical methods such as Markov chain Monte Carlo sampling.
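A minimal mean-field example, with toy numbers assumed for illustration: approximating a correlated bivariate Gaussian by a factorized q(z1)q(z2) via coordinate-ascent updates (the closed-form updates follow the standard mean-field derivation for this model):

```python
import numpy as np

# Target: bivariate Gaussian with mean mu and precision matrix L (toy numbers).
mu = np.array([1.0, -1.0])
L = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))  # precision matrix

# Mean-field family: q(z) = N(z1; m[0], 1/L[0,0]) * N(z2; m[1], 1/L[1,1]).
# Coordinate ascent: each factor's mean conditions on the other's current mean.
m = np.zeros(2)
for _ in range(50):
    m[0] = mu[0] - (L[0, 1] / L[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (L[1, 0] / L[1, 1]) * (m[0] - mu[0])

print(m)                          # converges to the true mean mu
print(1 / L[0, 0], 1 / L[1, 1])   # mean-field variances (underestimate the true ones)
```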
Journal Article (DOI)

Collective Classification in Network Data

TL;DR: This article introduces four of the most widely used inference algorithms for classifying networked data and empirically compares them on both synthetic and real-world data.
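As a toy illustration in the spirit of the relational classifiers the article surveys (the graph, seed labels, and majority-vote rule below are all made up for the sketch):

```python
# Toy collective classification: propagate labels over a graph by repeatedly
# assigning each unlabeled node the majority label of its neighbours.
from collections import Counter

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (1, 4)]
neighbours = {i: set() for i in range(6)}
for a, b in edges:
    neighbours[a].add(b)
    neighbours[b].add(a)

labels = {0: "A", 3: "B"}           # seed labels; the rest are unknown
for _ in range(10):                 # iterate until labels stabilise
    for node in range(6):
        if node in (0, 3):          # keep the seeds fixed
            continue
        votes = Counter(labels[n] for n in neighbours[node] if n in labels)
        if votes:
            labels[node] = votes.most_common(1)[0][0]

print(labels)
```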
References
Journal Article (DOI)

Factor graphs and the sum-product algorithm

TL;DR: A generic message-passing algorithm, the sum-product algorithm, operates in a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.
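A minimal sum-product example with assumed toy factors: computing the marginal of the middle variable on a three-variable chain x1 - f12 - x2 - f23 - x3, checked against brute-force enumeration:

```python
import numpy as np

# Chain factor graph over binary variables: p(x) is proportional to f12(x1,x2) * f23(x2,x3).
f12 = np.array([[1.0, 0.5], [0.5, 2.0]])
f23 = np.array([[1.5, 0.2], [0.7, 1.0]])

# Sum-product messages along the chain (leaf variables send uniform messages).
m1_to_f12 = np.ones(2)
mf12_to_2 = f12.T @ m1_to_f12   # sum over x1 of f12(x1,x2) * msg(x1)
m3_to_f23 = np.ones(2)
mf23_to_2 = f23 @ m3_to_f23     # sum over x3 of f23(x2,x3) * msg(x3)

marg2 = mf12_to_2 * mf23_to_2   # belief at x2 = product of incoming messages
marg2 /= marg2.sum()

# Brute-force check by enumerating all 8 joint configurations.
joint = np.einsum("ij,jk->ijk", f12, f23)
print(marg2, joint.sum(axis=(0, 2)) / joint.sum())
```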
Proceedings Article

Loopy belief propagation for approximate inference: an empirical study

TL;DR: This paper compares the marginals computed using loopy propagation to the exact ones in four Bayesian network architectures, including two real-world networks: ALARM and QMR, and finds that the loopy beliefs often converge and when they do, they give a good approximation to the correct marginals.
Proceedings Article

Generalized Belief Propagation

TL;DR: It is shown that BP can only converge to a stationary point of an approximate free energy, known as the Bethe free energy in statistical physics; more accurate Kikuchi free-energy approximations generalize the Bethe free energy, and generalized belief propagation (GBP) algorithms corresponding to these Kikuchi approximations are derived.
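For reference, the Bethe free energy for a factor graph with factor beliefs b_a, variable beliefs b_i, and variable degrees d_i takes the standard form:

```latex
F_{\text{Bethe}} \;=\; \sum_a \sum_{\mathbf{x}_a} b_a(\mathbf{x}_a)\,\ln \frac{b_a(\mathbf{x}_a)}{f_a(\mathbf{x}_a)} \;-\; \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)
```

BP fixed points are stationary points of this objective under the marginalization constraints; Kikuchi approximations add beliefs over larger clusters of variables, and GBP is the corresponding message-passing scheme.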