Book Chapter
Analyzing Random Forest Classifier with Different Split Measures
Vrushali Kulkarni, Manisha Petare, Pradeep K. Sinha +2 more
pp. 691–699
TL;DR: A theoretical and empirical comparison of different split measures for the induction of decision trees in Random Forest is presented, and whether the choice of split measure has any effect on the accuracy of Random Forest is tested.
Abstract:
Random forest is an ensemble supervised machine learning technique. The principle of ensemble learning suggests that, to yield better accuracy, the base classifiers in the ensemble should be diverse and accurate. Random forest uses the decision tree as its base classifier. In this paper, we present a theoretical and empirical comparison of different split measures for the induction of decision trees in Random forest and test whether the choice of split measure has any effect on the accuracy of Random forest.
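As a concrete illustration of the kind of comparison the abstract describes, scikit-learn's `RandomForestClassifier` exposes the split measure through its `criterion` parameter. A minimal sketch, assuming scikit-learn and its bundled iris data rather than the paper's own datasets and implementation:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Train one forest per split measure and compare mean 5-fold accuracy.
for criterion in ("gini", "entropy"):
    forest = RandomForestClassifier(n_estimators=100, criterion=criterion,
                                    random_state=0)
    scores = cross_val_score(forest, X, y, cv=5)
    print(f"{criterion}: {scores.mean():.3f}")
```

On most datasets the two criteria give very similar accuracy, which is the kind of effect (or lack of one) the paper's empirical comparison examines.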
Citations
Journal Article
A Random Forest approach using imprecise probabilities
TL;DR: The base classifier of the Random Forest is modified using a new criterion based on imprecise probabilities and general uncertainty measures, also producing a new single decision tree model, called the Credal Random Forest.
Journal Article
Increasing diversity in random forest learning algorithm via imprecise probabilities
TL;DR: This new algorithm, called Random Credal Random Forest (RCRF), offers several improvements over the classic RF: a more successful split criterion that is more robust to noise than the classic ones, and an increase in randomness that facilitates the diversity of the rules obtained.
Effective Learning and Classification using Random Forest Algorithm
TL;DR: An attempt is made to improve the performance of Random Forest classifiers in terms of accuracy and the time required for learning and classification; to achieve this, five new approaches are proposed.
Journal Article
Weighted Hybrid Decision Tree Model for Random Forest Classifier
TL;DR: A new hybrid decision tree model for the random forest classifier is proposed, augmented by weighted voting based on the strength of individual trees; it has shown a notable increase in the accuracy of the random forest.
Journal Article
Analytical Comparison Between the Information Gain and Gini Index using Historical Geographical Data
TL;DR: Information Gain and Gini Index are applied to attributes of the Kashmir province to convert continuous data into discrete values, making the data set ready for the application of machine learning (decision tree) algorithms.
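The two measures contrasted in that paper differ only in the impurity function applied to the class proportions at a node. A minimal sketch in pure Python; the sample labels and the two-way split below are hypothetical, chosen only to illustrate the calculation:

```python
from collections import Counter
from math import log2

def gini(labels):
    """Gini index: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy in bits: -sum of p * log2(p) over classes."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def split_gain(parent, children, impurity):
    """Impurity reduction of a candidate split.

    With impurity=entropy this is the information gain; with
    impurity=gini it is the Gini reduction used by CART-style trees.
    """
    n = len(parent)
    return impurity(parent) - sum(len(ch) / n * impurity(ch) for ch in children)

# Hypothetical split of 8 samples into two branches of 4.
parent = ["yes"] * 4 + ["no"] * 4
children = [["yes", "yes", "yes", "no"], ["yes", "no", "no", "no"]]
print(split_gain(parent, children, entropy))  # information gain
print(split_gain(parent, children, gini))     # Gini reduction
```

Both measures rank candidate splits by how much purer the child nodes are than the parent; they usually agree on the best split, which is why empirical comparisons like the one above often find small accuracy differences.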
References
Journal Article
Random Forests
TL;DR: Internal estimates monitor error, strength, and correlation; these are used to show the response to increasing the number of features used in the forest, and are also applicable to regression.
Book
Data Mining: Concepts and Techniques
TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Journal Article
Bagging predictors
TL;DR: Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy.
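Bagging, as summarized above, trains each base learner on a bootstrap replicate of the training set and aggregates their predictions by majority vote. A minimal pure-Python sketch; the `train_1nn` base learner and the toy `(feature, label)` data are hypothetical stand-ins for the trees and datasets used in the cited work:

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Draw len(data) examples with replacement (a bootstrap replicate)."""
    return [rng.choice(data) for _ in data]

def bagged_predict(data, train, x, n_models=25, seed=0):
    """Train n_models base learners on bootstrap replicates and majority-vote."""
    rng = random.Random(seed)
    votes = [train(bootstrap_sample(data, rng))(x) for _ in range(n_models)]
    return Counter(votes).most_common(1)[0][0]

# Toy base learner: 1-nearest-neighbour over (feature, label) pairs.
def train_1nn(sample):
    return lambda x: min(sample, key=lambda p: abs(p[0] - x))[1]

data = [(0.1, "a"), (0.2, "a"), (0.9, "b"), (1.0, "b")]
print(bagged_predict(data, train_1nn, 0.15))  # majority vote near x = 0.15
```

Random Forest extends this scheme by additionally randomizing the features considered at each split, which is what makes the choice of split measure in the chapter above interact with the ensemble's diversity.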
Journal Article
Popular ensemble methods: an empirical study
David W. Opitz, Richard Maclin +1 more
TL;DR: This work suggests that most of the gain in an ensemble's performance comes in the first few classifiers combined; however, relatively large gains can be seen up to 25 classifiers when Boosting decision trees.
Journal Article
Top-down induction of decision trees classifiers - a survey
Lior Rokach, Oded Maimon +1 more
TL;DR: An updated survey of current methods for constructing decision tree classifiers in a top-down manner is presented and a unified algorithmic framework for presenting these algorithms is suggested.