Book Chapter
An Improved Learning Algorithm for Augmented Naive Bayes
Huajie Zhang, Charles X. Ling, et al.
pp. 581-586
TL;DR: This work extends the naive Bayes classifier to allow certain dependency relations among attributes; the algorithm is more efficient and produces simpler dependency relations for better comprehensibility, while maintaining very similar predictive accuracy.
Abstract: Data mining applications require learning algorithms to have high predictive accuracy, scale up to large datasets, and produce comprehensible outcomes. The naive Bayes classifier has received extensive attention due to its efficiency, reasonable predictive accuracy, and simplicity. However, its assumption of attribute independence given the class is often violated, producing incorrect probabilities that can affect the success of data mining applications. We extend the naive Bayes classifier to allow certain dependency relations among attributes. Compared to previous extensions of naive Bayes, our algorithm is more efficient (more so in problems with a large number of attributes) and produces simpler dependency relations for better comprehensibility, while maintaining very similar predictive accuracy.
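The independence assumption discussed in the abstract can be made concrete with a small sketch (my own illustration, not the authors' code; the function names `train_nb`/`predict_nb` and the binary-attribute Laplace smoothing are assumptions): plain naive Bayes scores each class by multiplying per-attribute conditional probabilities, which is exactly the factorization the paper relaxes by allowing dependency arcs among attributes.

```python
import math
from collections import defaultdict

def train_nb(examples):
    """Count class frequencies and (attribute index, value, class) frequencies."""
    class_counts = defaultdict(int)
    attr_counts = defaultdict(int)
    for attrs, c in examples:
        class_counts[c] += 1
        for i, v in enumerate(attrs):
            attr_counts[(i, v, c)] += 1
    return class_counts, attr_counts

def predict_nb(attrs, class_counts, attr_counts):
    """Return argmax_c P(c) * prod_i P(a_i | c), computed in log space.

    The per-attribute product encodes the conditional-independence
    assumption; Laplace smoothing (+1, assuming binary-valued
    attributes, hence the +2 denominator) avoids zero probabilities.
    """
    n = sum(class_counts.values())
    best_class, best_score = None, float("-inf")
    for c, count in class_counts.items():
        score = math.log(count / n)
        for i, v in enumerate(attrs):
            score += math.log((attr_counts[(i, v, c)] + 1) / (count + 2))
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```

An augmented model in the spirit of this paper would replace P(a_i | c) with P(a_i | parent(a_i), c) for those attributes given an augmenting arc, leaving the rest of the computation unchanged.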
Citations
Journal Article
A Novel Bayes Model: Hidden Naive Bayes
TL;DR: This paper summarizes the existing improved algorithms and proposes a novel Bayes model: hidden naive Bayes (HNB), which significantly outperforms NB, SBC, NBTree, TAN, and AODE in terms of CLL and AUC.
Book Chapter
Survey of Improving Naive Bayes for Classification
TL;DR: Four main approaches to improving naive Bayes (feature selection, structure extension, local learning, and data expansion) are reviewed, and some main directions for future research on Bayesian network classifiers are discussed.
Journal Article
Improving Tree augmented Naive Bayes for class probability estimation
TL;DR: This paper investigates the class probability estimation performance of TAN in terms of conditional log likelihood (CLL) and presents a new algorithm that improves it by averaging over the spanning TAN classifiers, an approach called Averaged Tree Augmented Naive Bayes (ATAN).
Journal Article
On the classification performance of TAN and general Bayesian networks
TL;DR: It is found that the poor performance reported by Friedman et al. is not attributable to the GBN per se, but rather to their use of simple empirical frequencies to estimate GBN parameters, whereas basic parameter smoothing improves GBN performance significantly.
Journal Article
Using differential evolution for fine tuning naive Bayesian classifiers and its application for text classification
Diab M. Diab,Khalil M. El Hindi +1 more
TL;DR: The experimental results show that using DE in general, and the proposed MPDE algorithm in particular, is more convenient for fine-tuning NB than all other methods, including the other two metaheuristic methods (GA and SA).
References
Journal Article
Bayesian Network Classifiers
TL;DR: Tree Augmented Naive Bayes (TAN) is singled out, which outperforms naive Bayes yet at the same time maintains the computational simplicity and robustness that characterize naive Bayes.
Journal Article
Very Simple Classification Rules Perform Well on Most Commonly Used Datasets
TL;DR: On most datasets studied, the best of very simple rules that classify examples on the basis of a single attribute is as accurate as the rules induced by the majority of machine learning systems.
Proceedings Article
Learning augmented Bayesian classifiers: A comparison of distribution-based and classification-based approaches.
Eamonn Keogh, Michael J. Pazzani, et al.
TL;DR: This work examines an approach where naive Bayes is augmented by the addition of correlation arcs between attributes, and explores two methods for finding the set of augmenting arcs: a greedy hill-climbing search, and a novel, more computationally efficient algorithm called SuperParent.
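The greedy hill-climbing search mentioned in that summary can be sketched as follows (a hypothetical illustration, not Keogh and Pazzani's implementation; `greedy_augment`, `evaluate`, and `_reaches` are my own names): arcs between attributes are added one at a time, keeping whichever single arc most improves a caller-supplied score (e.g. hold-out classification accuracy), under the constraint that each attribute gains at most one augmenting parent and no directed cycle is created.

```python
def _reaches(arcs, src, dst):
    """True if a directed path src -> ... -> dst exists among the arcs."""
    frontier, seen = [src], {src}
    while frontier:
        node = frontier.pop()
        if node == dst:
            return True
        for a, b in arcs:
            if a == node and b not in seen:
                seen.add(b)
                frontier.append(b)
    return False

def greedy_augment(n_attrs, evaluate):
    """Greedy hill-climbing over augmenting arcs between attributes.

    evaluate(arcs) -> score, higher is better; a stand-in for whatever
    criterion (e.g. classification accuracy) drives the search.
    """
    arcs, best_score = set(), evaluate(set())
    while True:
        best_arc = None
        for i in range(n_attrs):
            for j in range(n_attrs):
                # Candidate arc i -> j: j must not already have an
                # augmenting parent, and the arc must not close a cycle.
                if (i == j or any(b == j for _, b in arcs)
                        or _reaches(arcs, j, i)):
                    continue
                score = evaluate(arcs | {(i, j)})
                if score > best_score:
                    best_arc, best_score = (i, j), score
        if best_arc is None:      # no single arc improves the score
            return arcs, best_score
        arcs.add(best_arc)
```

Because every candidate arc requires re-evaluating the classifier, this search is quadratic in the number of attributes per step, which is the cost that SuperParent-style heuristics (and the efficiency claims of the main paper) aim to reduce.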