The FEDHC Bayesian Network Learning Algorithm
TLDR
In this paper, a hybrid Bayesian network learning algorithm, termed Forward Early Dropping Hill Climbing (FEDHC), is proposed that works with either continuous or categorical variables.
Abstract
The paper proposes a new hybrid Bayesian network learning algorithm, termed Forward Early Dropping Hill Climbing (FEDHC), devised to work with either continuous or categorical variables. Further, the paper demonstrates that the only existing implementation of MMHC in the statistical software R is prohibitively expensive, and a new implementation is offered. Additionally, for the case of continuous data, a version of FEDHC that is robust to outliers, and that can be adopted by other BN learning algorithms, is proposed. FEDHC is tested via Monte Carlo simulations that clearly show it is computationally efficient and produces Bayesian networks of accuracy similar to, or higher than, MMHC and PCHC. Finally, an application of the FEDHC, PCHC and MMHC algorithms to real data from the field of economics is demonstrated using the statistical software R.
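FEDHC builds on forward selection with "early dropping": candidates that test as non-significant at any iteration are permanently discarded rather than retested, which is what makes the skeleton phase fast. Below is a rough, illustrative Python sketch of that early-dropping idea for continuous data; it is not the authors' implementation, and the function name and the residual-correlation test used here are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

def forward_early_dropping(X, y, alpha=0.05):
    """Sketch of forward selection with early dropping (illustrative,
    not the paper's code).  At each iteration, candidates whose
    association test with the current residual is non-significant are
    dropped from all future consideration."""
    n, p = X.shape
    selected = []
    remaining = list(range(p))
    resid = y - y.mean()
    while remaining:
        pvals = {}
        for j in remaining:
            # t-test on the correlation between candidate j and the residual
            r = np.corrcoef(X[:, j], resid)[0, 1]
            t = r * np.sqrt((n - 2) / max(1e-12, 1 - r ** 2))
            pvals[j] = 2 * stats.t.sf(abs(t), n - 2)
        # early dropping: permanently discard non-significant candidates
        remaining = [j for j in remaining if pvals[j] < alpha]
        if not remaining:
            break
        best = min(remaining, key=lambda j: pvals[j])
        selected.append(best)
        remaining.remove(best)
        # regress y on the selected variables; continue with the residuals
        Z = np.column_stack([np.ones(n), X[:, selected]])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
    return selected
```

Because dropped variables are never revisited, each pass shrinks the candidate pool, trading a small risk of missed edges for a large reduction in the number of tests performed.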
Citations
Journal Article
A Decade for the Mathematics: Bibliometric Analysis of Mathematical Modeling in Economics, Ecology, and Environment
TL;DR: In this article, the authors present a retrospective of the publications related to the use of mathematical tools for the analysis of economic, ecological, and environmental phenomena, and analyze 1257 scientific publications using bibliometric techniques to examine the most productive and influential authors and their contributions.
Journal Article
Special Issue “Statistical Data Modeling and Machine Learning with Applications II”
TL;DR: In this article, the authors describe the rapid progress and synergy between mathematics and computer science, and they propose a framework for combining mathematics and software engineering in the context of computer science.
References
Journal Article
Estimating the Dimension of a Model
TL;DR: In this paper, the problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution and evaluating the leading terms of its asymptotic expansion.
Journal Article
The max-min hill-climbing Bayesian network structure learning algorithm
TL;DR: This paper presents the first empirical results simultaneously comparing most of the major Bayesian network learning algorithms against each other, namely PC, Sparse Candidate, Three Phase Dependency Analysis, Optimal Reinsertion, Greedy Equivalence Search, and Greedy Search.
Journal Article
Learning Bayesian Networks with the bnlearn R Package
TL;DR: The bnlearn package, as discussed by the authors, is an R package (R Development Core Team 2010) that includes several algorithms for learning the structure of Bayesian networks with either discrete or continuous variables, and can use the functionality provided by the snow package (Tierney et al. 2008) to improve performance via parallel computing.
Journal Article
Learning Bayesian Belief Networks: An Approach Based on the MDL Principle
Wai Lam, Fahiem Bacchus, et al.
TL;DR: A new approach for learning Bayesian belief networks from raw data is presented, based on Rissanen's minimal description length (MDL) principle, which can learn unrestricted multiply-connected belief networks and allows one to trade off accuracy against complexity in the learned model.