Open Access · Posted Content
Expectation Propagation for approximate Bayesian inference
TLDR
Expectation Propagation (EP) is a deterministic approximation technique for Bayesian networks that unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation.
Abstract
This paper presents a new deterministic approximation technique in Bayesian networks. This method, "Expectation Propagation", unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. All three algorithms try to recover an approximate distribution which is close in KL divergence to the true distribution. Loopy belief propagation, because it propagates exact belief states, is useful for a limited class of belief networks, such as those which are purely discrete. Expectation Propagation approximates the belief states by only retaining certain expectations, such as mean and variance, and iterates until these expectations are consistent throughout the network. This makes it applicable to hybrid networks with discrete and continuous nodes. Expectation Propagation also extends belief propagation in the opposite direction: it can propagate richer belief states that incorporate correlations between nodes. Experiments with Gaussian mixture models show Expectation Propagation to be convincingly better than methods with similar computational cost: Laplace's method, variational Bayes, and Monte Carlo. Expectation Propagation also provides an efficient algorithm for training Bayes point machine classifiers.
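To make the abstract's moment-matching loop concrete, below is a minimal sketch of EP on the clutter problem studied in the paper: a Gaussian mean theta observed through a mixture of N(theta, 1) and a broad clutter component. The Gaussian-site parameterisation follows the general recipe of deleting one approximate factor, moment-matching the tilted distribution, and writing the site back; the variable names, constants (w = 0.5, a = 10, a N(0, 100) prior), fixed sweep count, and the guard against improper cavities are illustrative assumptions, not the paper's reference implementation.

# A minimal EP sketch for the clutter problem:
#   x_i ~ (1 - w) * N(theta, 1) + w * N(0, a),   prior: theta ~ N(0, 100).
# Each mixture-likelihood factor is approximated by a Gaussian "site"
# with natural parameters (tau_i, nu_i). Constants and names below are
# illustrative assumptions, not the paper's reference implementation.
import numpy as np

def gauss_pdf(x, mean, var):
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def ep_clutter(x, w=0.5, a=10.0, prior_var=100.0, n_sweeps=20):
    n = len(x)
    tau = np.zeros(n)  # site precisions (EP allows these to go negative)
    nu = np.zeros(n)   # site precision-times-mean terms
    for _ in range(n_sweeps):
        for i in range(n):
            # Current global approximation q(theta) = N(M, V).
            P = 1.0 / prior_var + tau.sum()
            V = 1.0 / P
            M = V * nu.sum()
            # Cavity distribution: delete site i from q.
            p_cav = P - tau[i]
            if p_cav <= 0:         # guard against an improper cavity
                continue
            v_cav = 1.0 / p_cav
            m_cav = v_cav * (M / V - nu[i])
            # Moments of the tilted distribution: cavity times exact factor.
            z1 = (1.0 - w) * gauss_pdf(x[i], m_cav, v_cav + 1.0)
            z0 = w * gauss_pdf(x[i], 0.0, a)
            r = z1 / (z1 + z0)     # responsibility: point is not clutter
            m1 = m_cav + v_cav / (v_cav + 1.0) * (x[i] - m_cav)
            v1 = v_cav / (v_cav + 1.0)
            m_new = r * m1 + (1.0 - r) * m_cav
            v_new = (r * (v1 + m1 ** 2)
                     + (1.0 - r) * (v_cav + m_cav ** 2) - m_new ** 2)
            # Write site i back so that cavity * site matches those moments.
            tau[i] = 1.0 / v_new - p_cav
            nu[i] = m_new / v_new - m_cav * p_cav
    P = 1.0 / prior_var + tau.sum()
    return nu.sum() / P, 1.0 / P   # posterior mean and variance of theta

rng = np.random.default_rng(0)
theta_true, n_obs = 2.0, 100
is_clutter = rng.random(n_obs) < 0.5
data = np.where(is_clutter,
                rng.normal(0.0, np.sqrt(10.0), n_obs),
                rng.normal(theta_true, 1.0, n_obs))
print(ep_clutter(data))  # posterior mean should land near theta_true

Iterating these local KL projections until the site parameters stop changing is the consistency condition the abstract describes: each site is a Gaussian stand-in for one non-Gaussian factor, chosen so the overall approximation matches the tilted distribution's mean and variance.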
Citations
Book
Pattern Recognition and Machine Learning
TL;DR: This book covers probability distributions, linear models for regression and classification, and methods for combining models in the context of machine learning.
Book
Machine Learning : A Probabilistic Perspective
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Journal Article
The future of employment: How susceptible are jobs to computerisation?
TL;DR: In this paper, a Gaussian process classifier was used to estimate the probability of computerisation for 702 detailed occupations and to examine the expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation's probability of computerisation, wages, and educational attainment.
Book
Graphical Models, Exponential Families, and Variational Inference
TL;DR: The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
Journal Article
Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations
TL;DR: This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters, and the response variables are non‐Gaussian; it can directly compute very accurate approximations to the posterior marginals.
References
Journal Article
Factor graphs and the sum-product algorithm
TL;DR: A generic message-passing algorithm, the sum-product algorithm, operates on a factor graph and computes, either exactly or approximately, various marginal functions derived from the global function.
Proceedings Article
Loopy belief propagation for approximate inference: an empirical study
TL;DR: This paper compares the marginals computed using loopy propagation to the exact ones in four Bayesian network architectures, including two real-world networks: ALARM and QMR, and finds that the loopy beliefs often converge and when they do, they give a good approximation to the correct marginals.
Proceedings Article
Generalized Belief Propagation
TL;DR: It is shown that BP can only converge to a stationary point of an approximate free energy, known as the Bethe free energy in statistical physics, and generalized belief propagation (GBP) algorithms are derived from the more accurate Kikuchi free-energy approximations.