Open Access Book Chapter (DOI)

A re-definition of mixtures of polynomials for inference in hybrid Bayesian networks

TLDR
This relaxation means that MOPs are closed under the transformations required for multi-dimensional linear deterministic conditionals, such as Z = X + Y, and it allows us to construct MOP approximations of the probability density functions (PDFs) of multi-dimensional conditional linear Gaussian distributions using a MOP approximation of the PDF of the univariate standard normal distribution.
Abstract
We discuss some issues in using mixtures of polynomials (MOPs) for inference in hybrid Bayesian networks. MOPs were proposed by Shenoy and West for mitigating the problem of integration in inference in hybrid Bayesian networks. In defining MOPs for multi-dimensional functions, one requirement is that the pieces where the polynomials are defined are hypercubes. In this paper, we discuss relaxing this condition so that each piece is defined on regions called hyper-rhombuses. This relaxation means that MOPs are closed under transformations required for multi-dimensional linear deterministic conditionals, such as Z = X + Y. Also, this relaxation allows us to construct MOP approximations of the probability density functions (PDFs) of the multi-dimensional conditional linear Gaussian distributions using a MOP approximation of the PDF of the univariate standard normal distribution. We illustrate our method using conditional linear Gaussian PDFs in two and three dimensions.
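As a rough sketch of the relaxation described in the abstract (the notation and the two-dimensional example below are ours, not taken from the paper): in the original definition, each polynomial piece of a two-dimensional MOP is defined on a hypercube, i.e. an axis-aligned rectangle with constant bounds, whereas a hyper-rhombus lets the bounds on one variable be linear functions of the other variables:

\[
A_{\text{hypercube}} = \{(x, y) : a_1 \le x \le b_1,\ a_2 \le y \le b_2\},
\]
\[
A_{\text{hyper-rhombus}} = \{(x, y) : a_1 \le x \le b_1,\ c_1 + d_1 x \le y \le c_2 + d_2 x\}.
\]

Under the relaxed definition, the marginalization needed for the deterministic conditional Z = X + Y,

\[
f_Z(z) = \int f_{X,Y}(x, z - x)\, dx,
\]

stays within the MOP family: substituting y = z - x turns constant limits on y into limits that are linear in x and z, which is exactly the hyper-rhombus form.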



Citations
Journal Article (DOI)

Two issues in using mixtures of polynomials for inference in hybrid Bayesian networks

TL;DR: A new method for finding MOP approximations based on Lagrange interpolating polynomials (LIP) with Chebyshev points is described, and it is shown how the LIP method can be used to find efficient MOP approximations of PDFs.
Proceedings Article

Inference in hybrid Bayesian networks with Mixtures of Truncated Basis Functions

TL;DR: A structure for handling probability potentials, called Sum-Product factorized potentials, is introduced, and it is shown how these potentials facilitate efficient inference based on properties of the MoTBFs and ideas similar to the ones underlying Lazy propagation (postponing operations and keeping factorized representations of the potentials).

ProbModelXML. A format for encoding probabilistic graphical models

TL;DR: ProbModelXML can represent several kinds of models, such as Bayesian networks, Markov networks, influence diagrams, LIMIDs, and decision analysis networks, as well as temporal models, and it offers the possibility of encoding new types of networks and user-specific properties without the need to modify the format definition.
Dissertation

Regularized model learning in EDAs for continuous and multi-objective optimization

TL;DR: The optimization results obtained for a number of continuous problems with an increasing number of variables show that the proposed EDA based on regularized model estimation performs a more robust optimization, and is able to achieve significantly better results for larger dimensions than other Gaussian-based EDAs.
References
Posted Content

On Information and Sufficiency

TL;DR: The information deviation between any two finite measures cannot be increased by any statistical operations (Markov morphisms), and it is invariant if and only if the morphism is sufficient for these two measures.
Book (DOI)

Symbolic and quantitative approaches to reasoning with uncertainty

TL;DR: Symbolic and quantitative approaches to reasoning with uncertainty have been proposed in contexts ranging from computer vision applications to decision support systems and artificial neural networks.
Book Chapter (DOI)

Mixtures of Truncated Exponentials in Hybrid Bayesian Networks

TL;DR: The properties of the MTE distribution are studied and it is shown how exact probability propagation can be carried out by means of a local computation algorithm.
Journal Article (DOI)

Inference in hybrid Bayesian networks using mixtures of polynomials

TL;DR: The main goal of this paper is to describe inference in hybrid Bayesian networks (BNs) using mixture of polynomials (MOP) approximations of probability density functions (PDFs), which are similar in spirit to mixtures of truncated exponentials (MTE) approximations.
Frequently Asked Questions (1)
Q1. What have the authors contributed in "A re-definition of mixtures of polynomials for inference in hybrid Bayesian networks"?

The authors discuss some issues in using mixtures of polynomials (MOPs) for inference in hybrid Bayesian networks. In the original definition of MOPs for multi-dimensional functions, the pieces where the polynomials are defined must be hypercubes; in this paper, the authors discuss relaxing this condition so that each piece is defined on regions called hyper-rhombuses.