scispace - formally typeset
Topic

Decision tree model

About: Decision tree model is a research topic. Over the lifetime, 2256 publications have been published within this topic receiving 38142 citations.


Papers
Journal ArticleDOI
TL;DR: Since the decision tree model, which offers better explainability than the neural network model, predicts the LoS of appendicitis patients well and can also be used to select input variables, it is recommended that hospitals make more active use of decision tree techniques.
Abstract: For the efficient management of hospital sickbeds, it is important to predict the length of stay (LoS) of appendicitis patients. This study analyzed patient data to find factors that show a high positive correlation with LoS, built LoS prediction models using neural network and decision tree models, and compared their performance. To increase prediction accuracy, we applied ensemble techniques such as bagging and boosting. Experimental results show that the decision tree model, built with fewer variables, achieves prediction accuracy almost equal to that of the neural network model, and that bagging outperforms boosting. In conclusion, since the decision tree model, which offers better explainability than the neural network model, predicts the LoS of appendicitis patients well and can also be used to select the input variables, it is recommended that hospitals make more active use of decision tree techniques.

2 citations
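The comparison the abstract describes can be sketched as follows: a single decision tree versus a bagged ensemble of trees on a length-of-stay regression task. This is a minimal illustration, not the paper's method; the data is synthetic, and the predictor names (age, WBC count, surgery delay) are hypothetical stand-ins for the patient variables the paper used.

```python
# Minimal sketch: single decision tree vs. bagged trees for LoS regression.
# Data is synthetic; the paper used real appendicitis patient records.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 500
# Hypothetical predictors: age, white-blood-cell count, hours of delay
X = np.column_stack([
    rng.integers(5, 80, n),        # age (years)
    rng.normal(12.0, 3.0, n),      # WBC count
    rng.exponential(10.0, n),      # delay before surgery (hours)
])
# Hypothetical LoS in days: longer for older patients and delayed surgery
y = 2.0 + 0.03 * X[:, 0] + 0.1 * X[:, 2] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One tree, versus 50 bootstrap-resampled trees averaged (bagging)
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)
bagged = BaggingRegressor(
    DecisionTreeRegressor(max_depth=4), n_estimators=50, random_state=0,
).fit(X_tr, y_tr)

print("single tree MAE:", mean_absolute_error(y_te, tree.predict(X_te)))
print("bagged MAE:     ", mean_absolute_error(y_te, bagged.predict(X_te)))
```

Bagging reduces the variance of a single tree by averaging over bootstrap resamples, which is one plausible reason it beat boosting on the paper's noisy clinical data.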

Patent
11 Jul 2019
TL;DR: In this paper, a generalized linear model structure definition is generated by building a gradient boosted tree model and separating each decision tree into a plurality of indicator variables upon which a dependent variable of the generalized linear model depends.
Abstract: An apparatus is provided for generating a generalized linear model structure definition by generating a gradient boosted tree model and separating each decision tree into a plurality of indicator variables upon which a dependent variable of the generalized linear model depends. A first number of plurality of decision tree structures each having a maximum tree depth of one (1) is formed, where the first number represents a number of decision tree structures necessary to exhaust all main effects of a plurality of predictor variables on a dependent variable. Successive pluralities of decision tree structures each having a maximum tree depth increased by one (1) as compared to its immediately preceding plurality of decision tree structures are iteratively formed. Each successive plurality of decision tree structures comprises a second number of decision tree structures necessary to exhaust all interactions between the plurality of predictor variables.

2 citations
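The patent's core idea, trees turned into indicator variables for a linear model, can be sketched with standard tooling: fit a gradient boosted model of depth-1 trees (stumps, which can only capture main effects), map each sample to its leaf in each tree, one-hot encode those leaf memberships as indicators, and fit a linear model over them. The data and parameter choices below are illustrative, not from the patent.

```python
# Sketch: gradient boosted stumps -> leaf indicator variables -> linear model.
# Illustrative only; the patent iterates over increasing tree depths to
# exhaust interactions, which is not reproduced here.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))
y = 1.5 * X[:, 0] - 2.0 * (X[:, 1] > 0) + rng.normal(0, 0.1, 400)

# Depth-1 trees: each tree splits on a single predictor, so the ensemble
# represents main effects only (no interactions).
gbm = GradientBoostingRegressor(max_depth=1, n_estimators=30, random_state=0)
gbm.fit(X, y)

# Leaf index of every sample in every tree -> one-hot indicator variables.
leaves = gbm.apply(X).reshape(X.shape[0], -1)   # (n_samples, n_trees)
indicators = OneHotEncoder().fit_transform(leaves).toarray()

# A linear model over the indicators reproduces the ensemble's
# piecewise-constant fit in GLM form.
glm = LinearRegression().fit(indicators, y)
print("in-sample R^2 of linear model on tree indicators:", glm.score(indicators, y))
```

This "tree leaves as features" construction is a known technique for making tree ensembles consumable by linear models; the patent's contribution is the structured, depth-by-depth generation of the trees.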

Journal ArticleDOI
TL;DR: By designing an efficient tree structure, depending on the picture content, the performance of a video coder can be improved by up to 2.0 dB, while reducing the computational complexity by 45-60% and the memory requirements by almost 29-35%.
Abstract: In this paper we analyze the impact of tree structures on the performance of zerotree-based wavelet video codecs. Since the zerotree approach is based on aggregating insignificant coefficients into trees, the design of the tree structure is the key issue for good performance. We consider six different tree structures, ranging from simple to relatively complex and composite, to code the luminance-chrominance components of a video sequence. Their performances are compared in terms of the average number of bits generated per bitplane, the number of coded bitplanes for a given bit budget, rate-distortion performance, memory requirements, and computational complexity. We observe that, in general, more complex and longer trees do not necessarily improve coding efficiency; however, tree structures encapsulating more elements per tree are memory efficient. The rate-distortion performance, memory requirements, and computational complexity therefore need to be traded off when selecting a particular tree structure. The additional improvement due to optional entropy coding is also tree-structure dependent. Further, the simulation results show that by designing an efficient tree structure, depending on the picture content, the performance of a video coder can be improved by up to 2.0 dB, while reducing the computational complexity by 45-60% and the memory requirements by almost 29-35%. Compared to the standard JPEG2000 (for intra-frame coding), tree-based coders are found to be efficient in terms of coding and complexity, particularly at lower bit rates.

2 citations
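The zerotree concept the paper builds on can be shown in a few lines: a coefficient is the root of a zerotree at threshold T if it and all of its descendants across finer scales are insignificant (below T in magnitude). The sketch below assumes the standard dyadic parent-child layout, where the children of coefficient (i, j) at one level are the 2x2 block at (2i, 2j) in the next finer level; the coefficient values are made up, not an actual wavelet transform.

```python
# Toy zerotree significance test over a 3-level coefficient pyramid.
# Layout assumption: children of (i, j) are (2i+di, 2j+dj), di, dj in {0, 1}.
import numpy as np

def is_zerotree(levels, lvl, i, j, T):
    """levels: list of 2-D arrays, coarsest first. True if the tree rooted
    at levels[lvl][i, j] contains no coefficient with magnitude >= T."""
    if abs(levels[lvl][i, j]) >= T:
        return False
    if lvl + 1 == len(levels):          # finest scale: no children
        return True
    return all(
        is_zerotree(levels, lvl + 1, 2 * i + di, 2 * j + dj, T)
        for di in (0, 1) for dj in (0, 1)
    )

# Three scales of made-up "wavelet" coefficients: 2x2 -> 4x4 -> 8x8,
# with energy concentrated at the coarse scale, as in real transforms.
rng = np.random.default_rng(1)
levels = [rng.normal(0, s, (n, n)) for s, n in [(8, 2), (2, 4), (1, 8)]]

T = 4.0
roots = [(i, j) for i in range(2) for j in range(2)
         if is_zerotree(levels, 0, i, j, T)]
print("zerotree roots at threshold", T, ":", roots)
```

A coder exploits this by emitting a single zerotree symbol for the whole subtree instead of coding each insignificant descendant, which is why the choice of tree structure drives both bit cost and memory use in the paper's comparison.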

Journal ArticleDOI
TL;DR: A novel and effective decision tree algorithm is introduced, based on the core of the discernibility matrix in rough set theory and the degree of consistent dependence, to improve node selection in decision tree construction.
Abstract: Rough set theory is a popular mathematical framework for resolving problems involving vagueness and uncertainty, and it has been used to eliminate redundancy in attributes and data. The decision tree is widely used in data mining because it is efficient, fast, and easy to understand for data classification. Many approaches have been used to generate a decision tree. In this paper, a novel and effective decision tree algorithm is introduced. The algorithm is based on the core of the discernibility matrix in rough set theory and the degree of consistent dependence, and it improves node selection in the decision tree. This approach reduces the time and space complexity of decision tree production compared with ID3. At the end of the article, an example demonstrates the algorithm's superiority.

2 citations
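The rough-set machinery the algorithm relies on can be sketched for a tiny decision table: the discernibility matrix collects, for each pair of objects with different decisions, the set of condition attributes on which they differ, and the core is the set of attributes that appear as singleton entries (hence indispensable for classification). The table below is made up for illustration and is not from the paper.

```python
# Discernibility matrix and core for a toy decision table (rough set theory).
# Each row: (condition attribute values, decision value).
rows = [
    # (outlook, windy) -> play
    (("sunny", "no"),  "yes"),
    (("sunny", "yes"), "no"),
    (("rainy", "no"),  "yes"),
    (("rainy", "yes"), "no"),
]
attrs = ["outlook", "windy"]

# Matrix entry for each pair of objects with different decisions:
# the attributes on which the two objects differ.
matrix = []
for a in range(len(rows)):
    for b in range(a + 1, len(rows)):
        (xa, da), (xb, db) = rows[a], rows[b]
        if da != db:
            diff = {attrs[k] for k in range(len(attrs)) if xa[k] != xb[k]}
            matrix.append(diff)

# Core: union of all singleton entries; these attributes are indispensable,
# since removing one would make some object pair indiscernible.
core = set().union(*[d for d in matrix if len(d) == 1])
print("discernibility entries:", matrix)
print("core:", core)   # here only "windy" separates every yes/no pair alone
```

An algorithm like the paper's would prefer core attributes (here, "windy") when selecting splitting nodes, instead of ID3's entropy-based gain.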


Network Information
Related Topics (5)
Cluster analysis
146.5K papers, 2.9M citations
80% related
Artificial neural network
207K papers, 4.5M citations
78% related
Fuzzy logic
151.2K papers, 2.3M citations
77% related
The Internet
213.2K papers, 3.8M citations
77% related
Deep learning
79.8K papers, 2.1M citations
77% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    10
2022    24
2021    101
2020    163
2019    158
2018    121