Open Access · Journal Article · DOI

Join-graph propagation algorithms

TLDR
Algorithm IJGP belongs to the class of Generalized Belief Propagation algorithms, a framework that allows connections with approximate algorithms from statistical physics, and is shown empirically to surpass the performance of mini-clustering and belief propagation, as well as a number of other state-of-the-art algorithms, on several classes of networks.
Abstract
The paper investigates parameterized approximate message-passing schemes that are based on bounded inference and are inspired by Pearl's belief propagation algorithm (BP). We start with the bounded inference mini-clustering algorithm and then move to the iterative scheme called Iterative Join-Graph Propagation (IJGP), which combines both iteration and bounded inference. Algorithm IJGP belongs to the class of Generalized Belief Propagation algorithms, a framework that allows connections with approximate algorithms from statistical physics, and is shown empirically to surpass the performance of mini-clustering and belief propagation, as well as a number of other state-of-the-art algorithms, on several classes of networks. We also provide insight into the accuracy of iterative BP and IJGP by relating these algorithms to well-known classes of constraint propagation schemes.
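To make the message-passing idea concrete, here is a minimal sketch of plain sum-product belief propagation — the BP baseline that IJGP generalizes, not IJGP itself. The model (a three-variable chain with made-up potentials) and all numbers are illustrative assumptions; on a chain, BP's fixed point gives the exact marginals, which is what makes it a convenient sanity check.

```python
import numpy as np

# Illustrative chain model x0 - x1 - x2 over binary variables.
# Unary potentials and the shared pairwise coupling are example numbers.
unary = [np.array([0.7, 0.3]),
         np.array([0.4, 0.6]),
         np.array([0.5, 0.5])]
pair = np.array([[1.2, 0.5],
                 [0.5, 1.2]])
edges = [(0, 1), (1, 2)]

# msgs[(i, j)] is the message from variable i to its neighbor j.
msgs = {(i, j): np.ones(2)
        for i, j in edges + [(j, i) for i, j in edges]}

for _ in range(20):                       # iterate updates to a fixed point
    new = {}
    for (i, j) in msgs:
        # Collect the unary potential at i and all incoming
        # messages to i, except the one coming back from j.
        incoming = unary[i].copy()
        for (k, l) in msgs:
            if l == i and k != j:
                incoming *= msgs[(k, l)]
        m = pair @ incoming               # sum out x_i through the edge potential
        new[(i, j)] = m / m.sum()         # normalize for numerical stability
    msgs = new

def belief(i):
    """Approximate marginal of x_i: unary potential times all incoming messages."""
    b = unary[i].copy()
    for (k, l) in msgs:
        if l == i:
            b *= msgs[(k, l)]
    return b / b.sum()
```

On loopy graphs the same updates become the iterative BP the paper analyzes: they may not converge and the beliefs are only approximations, which is the gap that bounded-inference schemes like mini-clustering and IJGP address.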



Citations
Journal ArticleDOI

SampleSearch: Importance sampling in presence of determinism

TL;DR: The SampleSearch scheme is proposed that augments sampling with systematic constraint-based backtracking search, and a weighting scheme is derived which yields an unbiased estimate of the desired statistics (e.g., probability of evidence).
Proceedings Article

Learning Stochastic Inverses

TL;DR: The Inverse MCMC algorithm is described, which uses stochastic inverses to make block proposals for a Metropolis-Hastings sampler, and the efficiency of this sampler for a variety of parameter regimes and Bayes nets is explored.
Book

Reasoning with Probabilistic and Deterministic Graphical Models: Exact Algorithms

TL;DR: This book provides comprehensive coverage of the primary exact algorithms for reasoning with graphical models and argues that the principles outlined serve well in moving forward to approximation and anytime-based schemes.
Proceedings Article

Variational algorithms for marginal MAP

TL;DR: In this paper, a general variational framework for solving marginal MAP problems is proposed, applying analogues of the Bethe, tree-reweighted, and mean-field approximations.
Proceedings Article

Join-graph based cost-shifting schemes

TL;DR: In this paper, the authors develop several algorithms taking advantage of two common approaches for bounding MPE queries in graphical models: mini-bucket elimination and message-passing updates for linear programming relaxations.
References
Book

Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference

TL;DR: Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty, and provides a coherent explication of probability as a language for reasoning with partial belief.
Book

Low-Density Parity-Check Codes

TL;DR: A simple but nonoptimum decoding scheme operating directly from the channel a posteriori probabilities is described and the probability of error using this decoder on a binary symmetric channel is shown to decrease at least exponentially with a root of the block length.
Proceedings ArticleDOI

Near Shannon limit error-correcting coding and decoding: Turbo-codes. 1

TL;DR: In this article, a new class of convolutional codes called turbo-codes, whose performances in terms of bit error rate (BER) are close to the Shannon limit, is discussed.