Topic

Decision tree model

About: Decision tree model is a research topic. Over its lifetime, 2,256 publications have been published on this topic, receiving 38,142 citations.


Papers
Journal ArticleDOI
TL;DR: An IVDT for categorical input attributes was developed and evaluated with 20 subjects to test three hypotheses about its potential advantages; the results suggest that the IVDT process improves modeling effectiveness, producing trees with relatively high classification accuracy and small size.
Abstract: The loosely coupled relationships between visualization and analytical data mining (DM) techniques represent the majority of the current state of the art in visual data mining; DM modeling is typically an automatic process with very limited forms of guidance from users. A conceptual model of visualization support for the DM modeling process and a novel interactive visual decision tree (IVDT) classification process have been proposed in this paper, with the aim of exploiting humans' pattern recognition ability and domain knowledge to facilitate the knowledge discovery process. An IVDT for categorical input attributes has been developed and evaluated with 20 subjects to test three hypotheses regarding its potential advantages. The experimental results suggested that, compared to the automatic modeling process typically applied in current decision tree modeling tools, the IVDT process can improve the effectiveness of modeling in terms of producing trees with relatively high classification accuracies and small sizes, enhance users' understanding of the algorithm, and give them greater satisfaction with the task.

32 citations
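
The interactive step described in the abstract above is easiest to see in code. The sketch below is not the authors' IVDT tool; it is a minimal Python illustration of the kind of loop such a tool might expose: ranking candidate categorical splits by information gain so a user can choose one. The attribute names and labels are invented for illustration.

```python
# Minimal sketch of interactive split selection for categorical attributes.
# Not the paper's IVDT implementation; toy data only.
import math
from collections import Counter

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attribute):
    """Expected entropy reduction from splitting on one categorical attribute."""
    base = entropy(labels)
    splits = {}
    for row, label in zip(rows, labels):
        splits.setdefault(row[attribute], []).append(label)
    remainder = sum(len(part) / len(labels) * entropy(part) for part in splits.values())
    return base - remainder

def suggest_splits(rows, labels, attributes):
    """Rank attributes so a user can pick the split interactively."""
    ranked = sorted(attributes, key=lambda a: information_gain(rows, labels, a), reverse=True)
    for a in ranked:
        print(f"{a}: gain = {information_gain(rows, labels, a):.3f}")
    return ranked

rows = [{"outlook": "sunny", "windy": "false"},
        {"outlook": "sunny", "windy": "true"},
        {"outlook": "overcast", "windy": "false"},
        {"outlook": "rainy", "windy": "true"}]
labels = ["no", "no", "yes", "yes"]
suggest_splits(rows, labels, ["outlook", "windy"])
```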

Journal ArticleDOI
TL;DR: This paper proposes a memory-constrained tree search (MCTS) algorithm that bridges the gap between the sphere decoding (SD) and stack algorithms and proposes novel ordering schemes that can be easily embedded in the QR decomposition.
Abstract: Hardware implementations of tree search-based multiple-input multiple-output (MIMO) detection often have limited performance due to the large memory requirements or high computational complexity of sophisticated MIMO detection algorithms. In this paper, we propose new tree search-based detection algorithms that achieve maximum-likelihood (ML) performance under any given memory constraints and with reduced computational complexity. To this end, we make two main contributions. First, we propose a memory-constrained tree search (MCTS) algorithm that bridges the gap between the sphere decoding (SD) and stack algorithms. Our MCTS algorithm dynamically adapts to any pre-specified memory constraint and offers a graceful tradeoff between computational complexity and memory requirement while maintaining the ML performance. When the memory size is set to the minimum, our MCTS algorithm is similar to the SD algorithm. As the memory size increases, the average computational complexity of our MCTS algorithm decreases. When the memory size becomes large, our MCTS algorithm is similar to the stack algorithm, having similar average computational complexity but requiring significantly less memory. To further reduce the computational complexity of tree search-based ML detection algorithms, we propose novel ordering schemes that can be easily embedded in the QR decomposition and take into account both the channel matrix and the received signal (noise); simulation results show that our ordering schemes lead to reduced average computational complexity for the SD and MCTS algorithms, and the reduction is significant in the low-to-medium signal-to-noise-ratio region.

32 citations
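
To make the memory/complexity tradeoff concrete, here is a rough Python sketch of tree-search ML detection over the QR-decomposed channel with a cap on how many partial candidates may be stored. It is not the paper's MCTS algorithm; in particular, simply discarding the worst stored nodes can forfeit exact ML optimality, which the paper's algorithm preserves. The channel, constellation, and `max_nodes` value are assumptions for illustration only.

```python
# Hedged sketch: best-first tree search for MIMO detection with a memory cap.
# Small caps behave more like depth-first sphere decoding; large caps behave
# more like the stack algorithm. Not the paper's MCTS algorithm.
import heapq
import itertools
import numpy as np

def tree_search_detect(y, H, constellation, max_nodes=64):
    """Detect x from y = Hx + n by searching the symbol tree layer by layer."""
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    n = H.shape[1]

    tie = itertools.count()                       # tie-breaker for the heap
    heap = [(0.0, 0, next(tie), ())]              # (metric, depth, tie, partial symbols)
    best_metric, best_sym = np.inf, None

    while heap:
        metric, depth, _, partial = heapq.heappop(heap)
        if metric >= best_metric:
            continue                              # cannot beat the best full candidate
        if depth == n:
            best_metric, best_sym = metric, partial
            continue
        layer = n - 1 - depth                     # expand from the last layer upward
        for s in constellation:
            sym = (s,) + partial
            residual = z[layer] - R[layer, layer:] @ np.array(sym)
            child = metric + abs(residual) ** 2   # partial Euclidean distance
            if child < best_metric:
                heapq.heappush(heap, (child, depth + 1, next(tie), sym))
        if len(heap) > max_nodes:                 # enforce the memory constraint
            heap = heapq.nsmallest(max_nodes, heap)
            heapq.heapify(heap)

    return np.array(best_sym), best_metric

# Example with a 2x2 channel and QPSK symbols (illustrative values only).
rng = np.random.default_rng(1)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
qpsk = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
x = np.array([1 + 1j, -1 - 1j])
y = H @ x + 0.1 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
print(tree_search_detect(y, H, qpsk))
```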

Proceedings ArticleDOI
04 Sep 2019
TL;DR: This paper adopts the Shapley additive explanation (SHAP) for interpreting a gradient-boosting decision tree model trained on hospital data and proposes two novel techniques: a new metric of feature importance based on SHAP, and a technique termed feature packing, which packs multiple similar features into one grouped feature to allow an easier understanding of the model without reconstructing the model.
Abstract: When using machine learning techniques in decision-making processes, the interpretability of the models is important. In the present paper, we adopted the Shapley additive explanation (SHAP), which is based on fair profit allocation among many stakeholders depending on their contribution, for interpreting a gradient-boosting decision tree model using hospital data. For better interpretability, we propose two novel techniques as follows: (1) a new metric of feature importance using SHAP and (2) a technique termed feature packing, which packs multiple similar features into one grouped feature to allow an easier understanding of the model without reconstruction of the model.

32 citations
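
For readers who want to try the general approach, the sketch below uses the open-source shap package's TreeExplainer on a scikit-learn gradient-boosting model, then sums SHAP values over a hand-picked group of columns as a crude stand-in for the paper's feature-packing idea. The dataset, model settings, and grouping rule are assumptions for illustration; they are not the authors' hospital data or exact method.

```python
# Illustrative sketch (not the paper's code): SHAP values for a gradient-boosting
# decision tree model, plus a simple grouped-contribution aggregation.
import numpy as np
import pandas as pd
import shap
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = GradientBoostingClassifier(n_estimators=100).fit(X, y)

explainer = shap.TreeExplainer(model)          # Tree SHAP for tree ensembles
shap_values = explainer.shap_values(X)         # shape: (n_samples, n_features)

# Global importance: mean absolute SHAP value per feature.
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False).head())

# Crude stand-in for "feature packing": aggregate SHAP values of similar
# features (here, all "mean ..." columns) into one grouped contribution.
mean_cols = [c for c in X.columns if c.startswith("mean")]
packed = shap_values[:, [X.columns.get_loc(c) for c in mean_cols]].sum(axis=1)
print("grouped contribution of 'mean *' features, first 5 samples:", packed[:5])
```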

Journal ArticleDOI
Abstract: A survey is given of techniques for evaluation of risk in individual capital investment projects. The paper identifies the four types of relationships affecting project uncertainty: (1) Accounting-type relationships defining cash flow; (2) Statistical relationships among variables in a given time period; (3) Autocorrelation relationships among cash flows over time; and (4) Uncertainty about project life. Two types of decisions also can affect project profitability and uncertainty: (1) Strategy decisions; and (2) Abandonment decisions. Four types of models for risk evaluation are identified: (1) Certainty model; (2) Hillier model; (3) Monte Carlo model; and (4) Decision Tree model. These four types of models are compared and evaluated in terms of how easily they can incorporate the relationships and decisions mentioned above. Computational issues are also discussed. Suggestions are made for further research.

32 citations
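
Two of the model families surveyed above, the Monte Carlo model and the decision tree model with an abandonment decision, can be illustrated with a short Python sketch. All figures here (cash-flow distribution, discount rate, demand probability, salvage value) are made up for illustration and are not taken from the paper.

```python
# Hedged sketch of project risk evaluation: Monte Carlo NPV simulation and a
# one-stage decision tree with an abandonment option. Illustrative numbers only.
import numpy as np

rng = np.random.default_rng(0)
discount = 0.10
initial_outlay = 1_000.0

def npv(cash_flows, rate):
    """Present value of a sequence of end-of-year cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Monte Carlo model: 5 annual cash flows, each normally distributed.
sims = np.array([
    npv(rng.normal(loc=300, scale=80, size=5), discount) - initial_outlay
    for _ in range(10_000)
])
print(f"Monte Carlo: mean NPV {sims.mean():.1f}, P(NPV < 0) = {(sims < 0).mean():.2%}")

# Decision tree model: after year 1 the firm observes demand (good/bad) and may
# abandon the project for a salvage value instead of continuing.
p_good = 0.6
continue_good = npv([400] * 4, discount)   # value at year 1 if demand is good
continue_bad = npv([150] * 4, discount)    # value at year 1 if demand is bad
salvage = 600.0
year1_value = p_good * max(continue_good, salvage) + (1 - p_good) * max(continue_bad, salvage)
tree_npv = (300 + year1_value) / (1 + discount) - initial_outlay
print(f"Decision tree with abandonment option: expected NPV {tree_npv:.1f}")
```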

Journal ArticleDOI
01 Mar 2014
TL;DR: It is shown that associative classifiers consisting of an ordered rule set can be represented as a tree model, i.e., a condition-based tree (CBT), which has competitive accuracy and a significantly smaller number of rules than well-known associative classifiers such as CBA and GARC.
Abstract: Associative classifiers have been proposed to achieve an accurate model with each individual rule being interpretable. However, existing associative classifiers often consist of a large number of rules and, thus, can be difficult to interpret. We show that associative classifiers consisting of an ordered rule set can be represented as a tree model. From this view, it is clear that these classifiers are restricted in that at least one child node of a non-leaf node is never split. We propose a new tree model, i.e., the condition-based tree (CBT), to relax this restriction. Furthermore, we also propose an algorithm to transform a CBT to an ordered rule set with concise rule conditions. This ordered rule set is referred to as a condition-based classifier (CBC). Thus, the interpretability of an associative classifier is maintained, but more expressive models are possible. The rule transformation algorithm can also be applied to regular binary decision trees to extract an ordered set of rules with simple rule conditions. Feature selection is applied to a binary representation of conditions to simplify and improve the models further. Experimental studies show that CBC has competitive accuracy and a significantly smaller number of rules (median of 10 rules per data set) than well-known associative classifiers such as CBA (median of 47) and GARC (median of 21). CBC with feature selection has an even smaller number of rules.

32 citations
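
The ordered rule set that the paper represents as a tree is essentially a first-match-wins rule list with a default class. The Python sketch below shows that evaluation scheme on invented rules and records; it is not the CBT/CBC algorithm itself, only the kind of classifier it operates on.

```python
# Hedged sketch: evaluating an ordered rule set (first matching rule fires,
# otherwise a default class is returned). Rules and records are invented.
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict      # attribute -> required value
    label: str

    def matches(self, record: dict) -> bool:
        return all(record.get(a) == v for a, v in self.conditions.items())

ordered_rules = [
    Rule({"outlook": "sunny", "humidity": "high"}, "no"),
    Rule({"outlook": "overcast"}, "yes"),
    Rule({"windy": "true"}, "no"),
]
default_label = "yes"

def classify(record: dict) -> str:
    """Fire the first matching rule; fall through to the default class."""
    for rule in ordered_rules:
        if rule.matches(record):
            return rule.label
    return default_label

print(classify({"outlook": "sunny", "humidity": "high", "windy": "false"}))   # "no"
print(classify({"outlook": "rainy", "humidity": "normal", "windy": "false"}))  # "yes"
```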


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Artificial neural network: 207K papers, 4.5M citations, 78% related
Fuzzy logic: 151.2K papers, 2.3M citations, 77% related
The Internet: 213.2K papers, 3.8M citations, 77% related
Deep learning: 79.8K papers, 2.1M citations, 77% related
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  10
2022  24
2021  101
2020  163
2019  158
2018  121