Open Access Proceedings Article

Inference in hybrid Bayesian networks with Mixtures of Truncated Basis Functions

TL;DR
A structure for handling probability potentials called Sum-Product factorized potentials is proposed, and it is shown how these potentials facilitate efficient inference based on properties of the MoTBFs and ideas similar to those underlying Lazy propagation (postponing operations and keeping factorized representations of the potentials).
Abstract
In this paper we study the problem of exact inference in hybrid Bayesian networks using mixtures of truncated basis functions (MoTBFs). We propose a structure for handling probability potentials called Sum-Product factorized potentials, and show how these potentials facilitate efficient inference based on i) properties of the MoTBFs and ii) ideas similar to the ones underlying Lazy propagation (postponing operations and keeping factorized representations of the potentials). We report on preliminary experiments demonstrating the efficiency of the proposed method in comparison with existing algorithms.
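The abstract's key idea, postponing combination and keeping potentials factorized, can be illustrated with a small sketch. The class below is hypothetical (names, structure, and the discrete-table representation are ours, not the paper's); it only shows how lazy combination lets marginalization touch just the factors that mention the variable being summed out.

```python
import itertools

class FactorizedPotential:
    """A potential kept as a list of factors; combination is postponed."""

    def __init__(self, factors):
        # Each factor is (vars, table): a tuple of variable names and a dict
        # mapping value tuples (in the order of `vars`) to numbers.
        self.factors = list(factors)

    def combine(self, other):
        # Lazy combination: concatenate factor lists, multiply nothing yet.
        return FactorizedPotential(self.factors + other.factors)

    def marginalize_out(self, var, domain):
        # Only the factors mentioning `var` must be multiplied before summing.
        touching = [f for f in self.factors if var in f[0]]
        rest = [f for f in self.factors if var not in f[0]]
        if not touching:
            return FactorizedPotential(rest)
        keep = tuple(sorted({v for vs, _ in touching for v in vs} - {var}))
        table = {}
        for assign in itertools.product(*(domain[v] for v in keep)):
            ctx = dict(zip(keep, assign))
            total = 0.0
            for x in domain[var]:
                ctx[var] = x
                prod = 1.0
                for vs, t in touching:
                    prod *= t[tuple(ctx[v] for v in vs)]
                total += prod
            table[assign] = total
        return FactorizedPotential(rest + [(keep, table)])
```

For example, combining potentials over {A, B} and {B, C} and then marginalizing out B multiplies only those two factors; a factor over {D} rides along untouched, which is exactly the saving that Lazy propagation exploits.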

Citations
Journal Article

A Review of Inference Algorithms for Hybrid Bayesian Networks

TL;DR: An overview of the main trends and principled approaches for performing inference in hybrid Bayesian networks is provided, along with a survey of established software systems supporting inference in these types of models.
Journal Article

Learning mixtures of polynomials of multidimensional probability densities from data using B-spline interpolation

TL;DR: Results on real datasets show that the non-parametric Bayesian classifiers using MoPs are comparable to the kernel density-based Bayesian classifiers.
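As a loose illustration of the B-spline idea (not the paper's learning algorithm), the snippet below fits a smoothing B-spline, itself a piecewise polynomial, to a histogram density estimate; splrep/splev are standard SciPy routines, and everything else is assumed for the example.

```python
import numpy as np
from scipy.interpolate import splrep, splev

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=5000)        # data whose density we approximate
hist, edges = np.histogram(samples, bins=40, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])

tck = splrep(mids, hist, s=0.05)                 # cubic smoothing spline = piecewise polynomial
grid = np.linspace(edges[0], edges[-1], 400)
approx = np.clip(splev(grid, tck), 0.0, None)    # splines can dip negative; clip
approx /= approx.sum() * (grid[1] - grid[0])     # renormalize to integrate to ~1
```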
Journal Article

An improved method for solving Hybrid Influence Diagrams

TL;DR: This paper solves a hybrid influence diagram (HID) by transforming it to a hybrid Bayesian network (HBN) and carrying out inference on this HBN using Dynamic Discretization (DD).
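The sketch below illustrates only the refinement idea behind dynamic discretization, repeatedly splitting the bin where a piecewise-constant approximation is worst; the actual DD algorithm for HBNs uses a principled entropy-error measure and is considerably more involved.

```python
import math

def refine(pdf, lo, hi, n_iters=20):
    """Greedily split the bin with the largest local approximation error."""
    bins = [(lo, hi)]
    for _ in range(n_iters):
        def err(b):
            a, c = b
            # Gap between midpoint density and edge average, scaled by width.
            return abs(pdf(0.5 * (a + c)) - 0.5 * (pdf(a) + pdf(c))) * (c - a)
        worst = max(bins, key=err)
        bins.remove(worst)
        a, c = worst
        m = 0.5 * (a + c)
        bins.extend([(a, m), (m, c)])
    return sorted(bins)

# Example: refine a standard normal on [-4, 4]; bins concentrate where curvature is high.
normal = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
print(refine(normal, -4.0, 4.0)[:5])
```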
Journal Article

Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks

TL;DR: A scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks using a stochastic gradient ascent procedure taking as input a stream of importance sampling weights, so that a mixture of Gaussians is dynamically updated with no need to store the full sample.
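A minimal sketch of the flavor of such an update, assuming a univariate mixture and a softmax parameterization of the component weights; this is generic stochastic gradient ascent on the importance-weighted log-density, not the paper's estimator.

```python
import numpy as np

def sga_step(x, w, logits, mus, sigmas, lr=1e-3):
    """One update from a single (sample x, importance weight w) pair."""
    pis = np.exp(logits - logits.max()); pis /= pis.sum()
    dens = pis * np.exp(-0.5 * ((x - mus) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    r = dens / dens.sum()                            # component responsibilities
    # Gradients of w * log p(x) w.r.t. logits, means, and standard deviations.
    g_logit = w * (r - pis)                          # softmax: d log p / d a_k = r_k - pi_k
    g_mu = w * r * (x - mus) / sigmas**2
    g_sigma = w * r * ((x - mus) ** 2 - sigmas**2) / sigmas**3
    return (logits + lr * g_logit,
            mus + lr * g_mu,
            np.maximum(sigmas + lr * g_sigma, 1e-3)) # keep scales positive
```

Streaming (x, w) pairs through such a step keeps memory constant, which matches the property highlighted in the TL;DR: the mixture is updated without storing the full sample.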

Proceedings of the Fifth European Workshop on Probabilistic Graphical Models

TL;DR: The detailed session logs collected during use of Query-Based Diagnostics reveal how the model evolved in use, and pose challenging questions on how the models can be adapted to manufacturing.
References
Journal Article

Propagation of Probabilities, Means, and Variances in Mixed Graphical Association Models

TL;DR: The purpose of this article is to extend the local structure in the specification of a discrete probability model for fast and efficient computation, thereby paving the way for exploiting probability-based models as parts of realistic systems for planning and decision support.
Journal Article

Stable local computation with conditional Gaussian distributions

TL;DR: A propagation scheme for Bayesian networks with conditional Gaussian distributions that does not have the numerical weaknesses of the scheme derived in Lauritzen (1992) is described.
Journal Article

LAZY propagation: a junction tree inference algorithm based on lazy evaluation

TL;DR: A junction tree based inference architecture exploiting the structure of the original Bayesian network and independence relations induced by evidence to improve the efficiency of inference is presented.
Book Chapter

Mixtures of Truncated Exponentials in Hybrid Bayesian Networks

TL;DR: The properties of the MTE distribution are studied and it is shown how exact probability propagation can be carried out by means of a local computation algorithm.
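What makes exact local computation possible is that every MTE term integrates in closed form, so marginalization never needs numerical quadrature. A tiny self-contained illustration (coefficients are made up):

```python
import math

def integrate_mte(terms, lo, hi):
    """Integrate sum_i a_i * exp(b_i * x) over [lo, hi] exactly."""
    total = 0.0
    for a, b in terms:
        if b == 0.0:
            total += a * (hi - lo)    # constant term
        else:
            total += a / b * (math.exp(b * hi) - math.exp(b * lo))
    return total

# Example: a two-term MTE potential on [0, 1].
print(integrate_mte([(1.5, -1.0), (0.25, 0.0)], 0.0, 1.0))
```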
Journal Article

Inference in hybrid Bayesian networks using mixtures of polynomials

TL;DR: The main goal of this paper is to describe inference in hybrid Bayesian networks (BNs) using mixture of polynomials (MOP) approximations of probability density functions (PDFs), which are similar in spirit to mixtures of truncated exponentials (MTE) approximations.
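MOPs enjoy the same closed-form property as MTEs, with polynomial antiderivatives in place of exponentials. A companion sketch with made-up pieces (a triangular density on [-1, 1]):

```python
import numpy as np

pieces = [((-1.0, 0.0), [1.0, 1.0]),    #  x + 1 on [-1, 0]
          (( 0.0, 1.0), [-1.0, 1.0])]   # -x + 1 on [0, 1]

total = 0.0
for (lo, hi), coeffs in pieces:
    antideriv = np.polyint(coeffs)      # exact antiderivative of the piece
    total += np.polyval(antideriv, hi) - np.polyval(antideriv, lo)
print(total)  # 1.0: the pieces form a valid density
```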