Topic

Decision tree model

About: Decision tree model is a research topic. Over the lifetime, 2256 publications have been published within this topic receiving 38142 citations.


Papers
Journal Article
TL;DR: Studies the computational complexity of languages with interactive proofs of logarithmic knowledge complexity, showing that all such languages can be recognized in BPP^NP.
Abstract: We study the computational complexity of languages which have interactive proofs of logarithmic knowledge complexity. We show that all such languages can be recognized in ${\cal BPP}^{\cal NP}$. Prior to this work, for languages with greater-than-zero knowledge complexity only trivial computational complexity bounds were known. In the course of our proof, we relate statistical knowledge complexity to perfect knowledge complexity; specifically, we show that, for the honest verifier, these hierarchies coincide up to a logarithmic additive term.

28 citations

Journal ArticleDOI
TL;DR: A simple decision tree model borrowed from operations research is presented to provide a conceptual framework for considering whether or not to commit resources to evaluate the evaluation.
Abstract: The act of evaluation requires an expenditure of resources. In Part I of this paper, we present a simple decision tree model borrowed from operations research to provide a conceptual framework for considering whether or not to commit such resources. In Part II, once the evaluation is carried out, we address the problem of evaluating the evaluation as a vehicle for producing useful information to decisionmakers. Evaluation inputs, processes, and outcomes are defined and discussed within the context of comprehensive evaluation of evaluations.
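The operations-research decision tree the authors borrow evaluates choices by backward induction: chance nodes are probability-weighted averages, decision nodes take the best branch. A minimal sketch of that evaluator follows; the concrete payoffs and probabilities are hypothetical illustrations, not taken from the paper.

```python
def expected_value(node):
    """Recursively evaluate a decision tree.

    Nodes are either numeric leaf payoffs, ("decision", [subtrees]) where
    the decision maker picks the best option, or ("chance", [(p, subtree)])
    averaged over outcome probabilities.
    """
    if isinstance(node, (int, float)):
        return node
    kind, branches = node
    if kind == "decision":
        # Decision node: choose the branch with the highest expected value.
        return max(expected_value(b) for b in branches)
    # Chance node: probability-weighted average of outcomes.
    return sum(p * expected_value(b) for p, b in branches)

# Hypothetical "commit resources to evaluate?" tree: evaluating costs 10 and
# yields insight worth 50 with probability 0.6 (net +40) or nothing (net -10);
# skipping the evaluation has payoff 0.
tree = ("decision", [("chance", [(0.6, 40), (0.4, -10)]), 0])
value = expected_value(tree)  # 0.6*40 + 0.4*(-10) = 20, so evaluating wins
```

Under these made-up numbers the expected value of evaluating (20) exceeds that of skipping (0), so the model recommends committing resources.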

28 citations

Proceedings ArticleDOI
13 Dec 2009
TL;DR: In this article, a rule-based decision tree (RBDT-1) is proposed for learning a decision tree from a set of decision rules that cover the data instances rather than from data instances themselves.
Abstract: Most methods that generate decision trees use examples of data instances in the tree generation process. This paper proposes a method called "RBDT-1" (rule-based decision tree) for learning a decision tree from a set of decision rules that cover the data instances, rather than from the data instances themselves. The RBDT-1 method uses a set of declarative rules as input for generating a decision tree. The method's goal is to create, on demand, a short and accurate decision tree from a stable or dynamically changing set of rules. We conduct a comparative study of RBDT-1 with three existing decision tree methods on different problems. The outcome of the study shows that RBDT-1 performs better than AQDT-1 and AQDT-2, which also create decision trees from rules, and than ID3, which generates decision trees from data examples, in terms of tree complexity (number of nodes and leaves in the decision tree).
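ID3, the data-driven baseline mentioned above, grows a tree from examples by repeatedly splitting on the attribute with the highest information gain (entropy reduction). A minimal sketch of that split criterion, on made-up toy data, might look like:

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the rows on one attribute."""
    n = len(rows)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in by_value.values())
    return entropy(labels) - remainder

# Toy data (hypothetical): attributes are [outlook, windy], class is yes/no.
rows = [["sunny", "no"], ["sunny", "yes"], ["rain", "yes"], ["rain", "yes"]]
labels = ["yes", "yes", "no", "no"]

# Outlook separates the classes perfectly (gain 1.0 bit), so ID3 splits on it.
best = max(range(2), key=lambda i: information_gain(rows, labels, i))
```

RBDT-1 differs in that its input is a set of rules rather than rows like these, but the same "pick the most discriminating attribute" idea drives tree construction.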

28 citations

Proceedings ArticleDOI
27 May 2001
TL;DR: An innovative evolutionary computation approach combining statistical sampling, a genetic algorithm and a decision tree to develop intelligent decision trees that alleviate some of the scalability and efficiency problems in decision tree implementations.
Abstract: Decision tree algorithms have been widely used in dealing with data mining problems. However, scalability and efficiency are significant concerns in the implementation. We propose an innovative evolutionary computation approach combining statistical sampling, a genetic algorithm and a decision tree to develop intelligent decision trees that alleviate some of these problems. Computational results show that our approach can obtain significantly better decision trees at lower sampling levels than the standard decision tree algorithm.
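The "lower sampling levels" in the abstract amount to inducing each candidate tree on a random subset of the training data instead of the full set. A minimal sketch of that sampling step follows; the function name and data are illustrative, and the genetic algorithm itself is omitted.

```python
import random

def sample_training_set(rows, labels, fraction, seed=0):
    """Draw a simple random sample of (row, label) pairs.

    `fraction` is the sampling level: e.g. 0.1 keeps ~10% of the data,
    so tree induction runs on a much smaller training set.
    """
    rng = random.Random(seed)
    k = max(1, int(len(rows) * fraction))
    picked = rng.sample(range(len(rows)), k)
    return [rows[i] for i in picked], [labels[i] for i in picked]

# Hypothetical data: 100 rows whose label is just the parity of the feature.
rows = [[i] for i in range(100)]
labels = [i % 2 for i in range(100)]
sub_rows, sub_labels = sample_training_set(rows, labels, 0.1)  # 10 pairs
```

Keeping rows and labels paired during sampling is the essential detail; the GA then searches over trees (or samples) while fitness is scored on such subsets.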

27 citations

Journal ArticleDOI
TL;DR: This paper aims to identify the best machine learning model among the Naive Bayes, Decision Tree and k-Nearest Neighbors algorithms for classifying the B40 population in Malaysia, and demonstrates that the Decision Tree model outperformed the other models overall.
Abstract: Malaysian citizens are categorised into three income groups: the Top 20 Percent (T20), Middle 40 Percent (M40), and Bottom 40 Percent (B40). One of the focus areas in the Eleventh Malaysia Plan (11MP) is to elevate the B40 household group towards the middle-income society. Based on recent studies by the World Bank, Malaysia is expected to enter high-income economy status no later than 2024. Thus, it is essential to identify the B40 population through predictive classification as a prerequisite to developing a comprehensive government action plan. This paper aims to identify the best machine learning model among the Naive Bayes, Decision Tree and k-Nearest Neighbors algorithms for classifying the B40 population. Several data pre-processing tasks, such as data cleaning, feature engineering, normalisation, feature selection (Correlation Attribute, Information Gain Attribute and Symmetrical Uncertainty Attribute) and sampling using SMOTE, were applied to the raw dataset to ensure the quality of the training data. Each classifier was then optimised with different tuning parameters under 10-Fold Cross-Validation before the performance of the three classifiers was compared. For the experiments, a dataset from the National Poverty Data Bank (eKasih), obtained from the Society Wellbeing Department, Implementation Coordination Unit of the Prime Minister's Department (ICU JPM) and consisting of 99,546 households from three states (Johor, Terengganu and Pahang), was used to train each machine learning model. The experimental results using 10-Fold Cross-Validation demonstrate that the Decision Tree model outperformed the other models overall, and a significance test confirmed that the result is statistically significant.
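The 10-fold cross-validation protocol used above partitions the data into ten folds, trains on nine, scores on the held-out tenth, and averages the ten accuracies. A self-contained sketch of that procedure (with a hypothetical always-predict-the-majority baseline standing in for the actual classifiers) might look like:

```python
import random

def k_fold_indices(n, k=10, seed=0):
    """Shuffle indices 0..n-1 and split them into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(rows, labels, train_fn, k=10):
    """Return mean held-out accuracy over k folds.

    `train_fn(rows, labels)` must return a predictor: row -> label.
    """
    folds = k_fold_indices(len(rows), k)
    accuracies = []
    for test_idx in folds:
        held_out = set(test_idx)
        train_rows = [rows[j] for j in range(len(rows)) if j not in held_out]
        train_labels = [labels[j] for j in range(len(rows)) if j not in held_out]
        predict = train_fn(train_rows, train_labels)
        correct = sum(predict(rows[j]) == labels[j] for j in test_idx)
        accuracies.append(correct / len(test_idx))
    return sum(accuracies) / len(accuracies)

def majority_baseline(train_rows, train_labels):
    """Toy 'classifier': always predict the majority training label."""
    majority = max(set(train_labels), key=train_labels.count)
    return lambda row: majority

# Hypothetical imbalanced data: 35 "a" vs 15 "b"; the baseline scores 35/50.
rows = [[i] for i in range(50)]
labels = ["a"] * 35 + ["b"] * 15
acc = cross_validate(rows, labels, majority_baseline)  # 0.7 for this data
```

Comparing each real classifier against such a baseline under the same folds is what makes the reported accuracy differences meaningful; the paper additionally applies a significance test to the fold scores.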

27 citations


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Artificial neural network: 207K papers, 4.5M citations, 78% related
Fuzzy logic: 151.2K papers, 2.3M citations, 77% related
The Internet: 213.2K papers, 3.8M citations, 77% related
Deep learning: 79.8K papers, 2.1M citations, 77% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    10
2022    24
2021    101
2020    163
2019    158
2018    121