Open Access · Posted Content

Partial Order MCMC for Structure Discovery in Bayesian Networks

TLDR
This paper presents a Markov chain Monte Carlo method for estimating posterior probabilities of structural features in Bayesian networks; the method draws its samples from the posterior distribution of partial orders on the nodes.
Abstract
We present a new Markov chain Monte Carlo method for estimating posterior probabilities of structural features in Bayesian networks. The method draws samples from the posterior distribution of partial orders on the nodes; for each sampled partial order, the conditional probabilities of interest are computed exactly. We give both analytical and empirical results that suggest the superiority of the new method over previous methods, which sample either directed acyclic graphs or linear orders on the nodes.
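The sampler can be pictured as a standard Metropolis-Hastings chain whose states are partial orders rather than individual DAGs. The Python sketch below illustrates only that outer loop for bucket orders (ordered partitions of the nodes into small buckets); the node count, bucket size, proposal move and the log_score placeholder are all assumptions, and log_score merely stands in for the exact per-order computation described in the abstract, not an implementation of it.

```python
# Minimal sketch: Metropolis-Hastings over bucket orders (a restricted class
# of partial orders). Everything below is illustrative; in particular,
# log_score is a toy stand-in for the exact per-order computation.
import math
import random

NODES = list(range(6))   # hypothetical 6-node problem
BUCKET_SIZE = 2          # assumed maximum bucket size

def random_bucket_order(nodes, bucket_size):
    """Start state: a random ordered partition of the nodes into buckets."""
    shuffled = nodes[:]
    random.shuffle(shuffled)
    return [set(shuffled[i:i + bucket_size])
            for i in range(0, len(shuffled), bucket_size)]

def log_score(order):
    """Toy placeholder for the log marginal likelihood of a partial order
    (in the actual method this sum over compatible DAGs is computed exactly)."""
    return -sum(pos * min(bucket) for pos, bucket in enumerate(order))

def propose(order):
    """Symmetric proposal: swap one node between two distinct buckets."""
    new = [set(bucket) for bucket in order]
    i, j = random.sample(range(len(new)), 2)
    a = random.choice(sorted(new[i]))
    b = random.choice(sorted(new[j]))
    new[i].remove(a); new[i].add(b)
    new[j].remove(b); new[j].add(a)
    return new

def partial_order_mcmc(iterations=1000):
    order = random_bucket_order(NODES, BUCKET_SIZE)
    current = log_score(order)
    samples = []
    for _ in range(iterations):
        candidate = propose(order)
        cand_score = log_score(candidate)
        # Metropolis acceptance test (the proposal is symmetric)
        if random.random() < math.exp(min(0.0, cand_score - current)):
            order, current = candidate, cand_score
        samples.append(order)
    return samples

samples = partial_order_mcmc()
```

In the method described above, each sampled partial order would additionally be used to compute the conditional probabilities of the structural features of interest exactly, and those per-sample values would then be averaged over the chain.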


Citations
Posted Content

DAG-GNN: DAG Structure Learning with Graph Neural Networks

TL;DR: In this paper, a deep generative model is proposed to learn a directed acyclic graph from samples of a joint distribution, and a variant of the structural (acyclicity) constraint is applied to ensure that the learned graph is a DAG.
Journal Article (DOI)

Partition MCMC for Inference on Acyclic Digraphs

TL;DR: A novel algorithm is proposed that employs the underlying combinatorial structure of DAGs to define a new grouping; convergence is improved compared to structure MCMC while the property of producing an unbiased sample is retained, and the sampler can be combined with edge-reversal moves to improve it further.
Journal Article

Structure discovery in Bayesian networks by sampling partial orders

TL;DR: The empirical results demonstrate that the presented partial-order-based samplers are superior to previous Markov chain Monte Carlo methods, which sample DAGs either directly or via linear orders on the nodes, and suggest that the convergence rates of the AIS-based estimators are competitive with those of MC3.
Proceedings Article

Annealed importance sampling for structure learning in Bayesian networks

TL;DR: This work presents a new sampling approach to Bayesian learning of the Bayesian network structure that replaces the usual Markov chain Monte Carlo method by annealed importance sampling (AIS), and shows that AIS is not only competitive with MCMC in exploring the posterior but also superior in two ways.
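For readers unfamiliar with AIS, the following Python sketch shows the generic mechanics of the estimator on a toy one-dimensional problem rather than on Bayesian network structures; the prior, target, temperature schedule and Metropolis kernel are all assumptions chosen for illustration, not details of the cited work.

```python
# Generic annealed importance sampling (AIS) on a toy 1-D problem:
# anneal from an easy "prior" density to an unnormalized target, accumulating
# importance weights whose average estimates the ratio of normalizing constants.
import math
import random

def log_f0(x):                      # easy-to-sample start density: N(0, 1)
    return -0.5 * x * x

def log_fK(x):                      # unnormalized target: N(3, 0.5^2)
    return -0.5 * ((x - 3.0) / 0.5) ** 2

def log_f(x, beta):                 # geometric bridge between the two
    return (1.0 - beta) * log_f0(x) + beta * log_fK(x)

def metropolis_step(x, beta, step=0.5):
    """One Metropolis move leaving the bridged density f_beta invariant."""
    y = x + random.gauss(0.0, step)
    if random.random() < math.exp(min(0.0, log_f(y, beta) - log_f(x, beta))):
        return y
    return x

def ais_run(K=100):
    """Return one AIS sample and its log importance weight."""
    betas = [k / K for k in range(K + 1)]
    x = random.gauss(0.0, 1.0)      # exact draw from the start density
    log_w = 0.0
    for k in range(1, K + 1):
        log_w += log_f(x, betas[k]) - log_f(x, betas[k - 1])
        x = metropolis_step(x, betas[k])
    return x, log_w

# Average the weights (in log space) to estimate Z_target / Z_start.
log_ws = [w for _, w in (ais_run() for _ in range(200))]
m = max(log_ws)
log_ratio = m + math.log(sum(math.exp(w - m) for w in log_ws) / len(log_ws))
```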
References
Journal Article (DOI)

Learning Bayesian Networks: The Combination of Knowledge and Statistical Data

TL;DR: In this article, a Bayesian approach for learning Bayesian networks from a combination of prior knowledge and statistical data is presented. The approach is derived from a set of previously made assumptions together with the assumption of likelihood equivalence, which says that data should not help to discriminate between network structures that represent the same assertions of conditional independence.
Journal Article (DOI)

A Bayesian Method for the Induction of Probabilistic Networks from Data

TL;DR: This paper presents a Bayesian method for constructing probabilistic networks from databases, focusing on Bayesian belief networks, and extends the basic method to handle missing data and hidden variables.
Book

Learning Bayesian networks

TL;DR: This chapter discusses Bayesian networks, a framework for Bayesian structure learning, and some of the algorithms used in this framework.
Journal Article (DOI)

Bayesian Graphical Models for Discrete Data

TL;DR: In this paper, the authors introduce Markov chain Monte Carlo model composition (MC3), a Monte Carlo method that allows averaging over the retained models.
Journal Article (DOI)

Being Bayesian About Network Structure. A Bayesian Approach to Structure Discovery in Bayesian Networks

TL;DR: This paper shows how to efficiently compute a sum over the exponential number of networks that are consistent with a fixed order over network variables, and uses this result as the basis for an algorithm that approximates the Bayesian posterior of a feature.
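The per-order summation that this result rests on factorizes over nodes: once each node's parents are restricted to its predecessors in the order (and to a bounded in-degree), the sum over all consistent DAGs becomes a product of independent per-node sums. The Python sketch below illustrates that factorization with a toy local score; the score function, the in-degree bound and the four-node order are assumptions for illustration, not details taken from the cited paper.

```python
# Sum over all DAGs consistent with a fixed linear order, factorized per node.
import math
from itertools import combinations

def log_sum_exp(values):
    m = max(values)
    return m + math.log(sum(math.exp(v - m) for v in values))

def log_marginal_given_order(order, local_log_score, max_parents=2):
    """log of the sum over all DAGs consistent with `order`, where
    local_log_score(node, parents) gives the log score of `node` with the
    given frozenset of parents (e.g. a BDeu-style local score in practice)."""
    total = 0.0
    for pos, node in enumerate(order):
        preds = order[:pos]            # allowed parents: predecessors in the order
        terms = []
        for k in range(min(max_parents, len(preds)) + 1):
            for parents in combinations(preds, k):
                terms.append(local_log_score(node, frozenset(parents)))
        total += log_sum_exp(terms)    # independent choice of parent set per node
    return total

# Toy local score (a stand-in, not derived from data).
def toy_score(node, parents):
    return -len(parents) - 0.1 * node

print(log_marginal_given_order([0, 1, 2, 3], toy_score))
```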