Topic

Decision tree model

About: Decision tree model is a research topic. Over its lifetime, 2256 publications have been published within this topic, receiving 38142 citations.


Papers
Proceedings Article
07 Oct 1992
TL;DR: A previously developed axiomatic model of program complexity is merged with a previously developed decision tree process, improving the ability to identify high-cost modules.
Abstract: Identification of high-cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their fair share of problems. A decision tree model has previously been used to identify such modules. In this paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process to improve the ability to identify such modules. This improvement has been tested using data from the NASA Software Engineering Laboratory.
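The sketch below is not the paper's model: it uses scikit-learn's DecisionTreeClassifier on synthetic, hypothetical complexity metrics (lines of code, cyclomatic complexity, and a composite complexity score standing in for the axiomatic measure) purely to illustrate classifying modules as high-cost from such features.

```python
# Minimal sketch (not the paper's method): flag high-cost modules from
# complexity metrics with an off-the-shelf decision tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-module metrics: [lines of code, cyclomatic complexity,
# composite complexity score]; labels mark high-cost modules.
X = rng.uniform(size=(200, 3)) * [2000, 50, 10]
y = (0.002 * X[:, 0] + 0.05 * X[:, 1] + 0.3 * X[:, 2]
     + rng.normal(scale=0.5, size=200) > 4.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```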

13 citations

Journal Article
Jiawei Li, Yiming Li, Xingchun Xiang, Shu-Tao Xia, Siyi Dong, Yun Cai
24 Oct 2020 - Entropy
TL;DR: A Tree-Network-Tree (TNT) learning framework for explainable decision-making is proposed, in which knowledge is alternately transferred between the tree model and DNNs; extensive experiments demonstrate the effectiveness of the proposed method.
Abstract: Deep Neural Networks (DNNs) usually work in an end-to-end manner. This makes trained DNNs easy to use, but the decision process behind each test case remains opaque. Unfortunately, the interpretability of decisions is crucial in some scenarios, such as medical or financial data mining and decision-making. In this paper, we propose a Tree-Network-Tree (TNT) learning framework for explainable decision-making, in which knowledge is alternately transferred between the tree model and DNNs. Specifically, the proposed TNT learning framework exploits the advantages of different models at different stages: (1) a novel James–Stein Decision Tree (JSDT) is proposed to generate better knowledge representations for DNNs, especially when the input data are low-frequency or low-quality; (2) the DNNs produce high-performing predictions from the knowledge-embedding inputs and act as a teacher model for the following tree model; and (3) a novel distillable Gradient Boosted Decision Tree (dGBDT) is proposed to learn interpretable trees from the soft labels and make predictions comparable to those of the DNNs. Extensive experiments on various machine learning tasks demonstrate the effectiveness of the proposed method.
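A minimal sketch of the network-to-tree stage only, assuming a small MLP teacher and scikit-learn's GradientBoostingRegressor in place of the paper's JSDT/dGBDT: it illustrates distilling a network's soft labels into boosted trees, not the published implementation.

```python
# Sketch of the DNN-to-tree distillation stage: a gradient boosted
# regressor is fit to the teacher network's soft probabilities instead
# of the hard labels, so the trees mimic the teacher's behaviour.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Teacher DNN produces soft labels (class-1 probabilities).
teacher = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                        random_state=0).fit(X, y)
soft_labels = teacher.predict_proba(X)[:, 1]

# Student: boosted trees trained to reproduce the teacher's soft outputs.
student = GradientBoostingRegressor(random_state=0).fit(X, soft_labels)
hard_preds = (student.predict(X) > 0.5).astype(int)
print("student agreement with teacher:",
      np.mean(hard_preds == teacher.predict(X)))
```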

13 citations

Proceedings Article
23 Jul 2010
TL;DR: The proposed method can be used in an E-Learning evaluation system to evaluate students' learning behavior, with an accuracy rate of almost 86.7%.
Abstract: The application of data mining in an E-Learning evaluation system is discussed in this paper. Students' learning behavior data are first collected and analyzed. Then, with the support of data mining technology, the relationship between learning behavior and learning effect is studied and a decision tree model is built using the J48 algorithm (Weka's C4.5 implementation) on the Weka platform. Finally, the model is assessed, and its accuracy rate reaches almost 86.7%. Thus the proposed method can be used in an E-Learning evaluation system to evaluate students' learning behavior.
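A minimal sketch, assuming scikit-learn's DecisionTreeClassifier with the entropy criterion as a stand-in for Weka's J48 and invented learning-behavior features; it only illustrates the train-and-evaluate workflow described above, not the paper's data or result.

```python
# Stand-in for the J48/C4.5 workflow: train a decision tree on
# hypothetical learning-behavior features and report cross-validated
# accuracy, analogous in spirit to the paper's ~86.7% figure.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical features: [logins, forum posts, assignments done, hours online]
X = rng.poisson(lam=[20, 5, 8, 30], size=(300, 4)).astype(float)
y = (X @ np.array([0.05, 0.2, 0.3, 0.04])
     + rng.normal(scale=1.0, size=300) > 6.0).astype(int)  # pass/fail label

clf = DecisionTreeClassifier(criterion="entropy", max_depth=4, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)
print("10-fold CV accuracy: %.1f%%" % (100 * scores.mean()))
```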

13 citations

Proceedings Article
01 Aug 1996
TL;DR: Empirical results show how expected utility increases with the size of the tree and the number of Bayesian net calculations.
Abstract: We report on work towards flexible algorithms for solving decision problems represented as influence diagrams. An algorithm is given to construct a tree structure for each decision node in an influence diagram. Each tree represents a decision function and is constructed incrementally. The improvements to the tree converge to the optimal decision function (neglecting computational costs) and the asymptotic behaviour is only a constant factor worse than dynamic programming techniques, counting the number of Bayesian network queries. Empirical results show how expected utility increases with the size of the tree and the number of Bayesian net calculations.
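A minimal sketch of the expected-utility criterion such a decision function optimizes: in the paper the outcome probabilities would come from Bayesian network queries, whereas here they are a hard-coded, hypothetical table.

```python
# Expected-utility decision rule: pick the action maximizing
# sum over outcomes of P(outcome | action) * U(action, outcome).
def expected_utility(action, probs, utility):
    """Sum of P(outcome | action) * U(action, outcome) over outcomes."""
    return sum(p * utility[(action, outcome)]
               for outcome, p in probs[action].items())

probs = {  # hypothetical P(outcome | action); a Bayes net query in the paper
    "treat":   {"cured": 0.8, "not_cured": 0.2},
    "monitor": {"cured": 0.3, "not_cured": 0.7},
}
utility = {  # hypothetical utilities U(action, outcome)
    ("treat", "cured"): 90, ("treat", "not_cured"): -10,
    ("monitor", "cured"): 100, ("monitor", "not_cured"): 0,
}

best = max(probs, key=lambda a: expected_utility(a, probs, utility))
print(best, {a: expected_utility(a, probs, utility) for a in probs})
```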

13 citations

Proceedings Article
21 Nov 2007
TL;DR: A simple method for language identification based on an adaptive resonance theory (ART) neural network is applied; experimental results show that the decision tree model achieved higher accuracy than the ARTMAP model.
Abstract: Automatic language identification (LID) is a topic of great significance in intelligence and security, where the language of any related material needs to be identified before its information can be processed. When the recognition elements of content are dynamic and obtained directly from written text, the language associated with each grammar item has to be identified from that text. Many methods proposed in the literature focus on Roman and Asian languages. This paper describes text-based language identification approaches for Arabic script and compares two different approaches. The decision tree method, commonly used in many application domains, is first reviewed. We also apply a simple method for language identification based on an adaptive resonance theory (ART) neural network. Experimental results show that the decision tree model achieved higher accuracy than the ARTMAP model. However, the decision tree model may be less reliable than the ARTMAP model if the languages used are extended to other Arabic-script languages. It is expected that a hybrid of both models would perform better and merits further development.
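A minimal sketch of text-based language identification with a decision tree over character n-gram counts, using an invented two-language toy corpus rather than the paper's Arabic-script data; the ARTMAP comparison is not reproduced.

```python
# Text-based language ID sketch: character n-gram counts feed a
# decision tree classifier; the toy corpus below is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

texts = ["hello how are you", "good morning to you", "see you tomorrow",
         "bonjour comment allez vous", "bonne nuit a demain", "merci beaucoup"]
labels = ["en", "en", "en", "fr", "fr", "fr"]

model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(2, 3)),  # character n-grams
    DecisionTreeClassifier(random_state=0),
)
model.fit(texts, labels)
print(model.predict(["comment vous appelez vous", "how do you do"]))
```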

13 citations


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Artificial neural network: 207K papers, 4.5M citations, 78% related
Fuzzy logic: 151.2K papers, 2.3M citations, 77% related
The Internet: 213.2K papers, 3.8M citations, 77% related
Deep learning: 79.8K papers, 2.1M citations, 77% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 10
2022: 24
2021: 101
2020: 163
2019: 158
2018: 121