
Decision tree model

About: Decision tree model is a research topic. Over the lifetime, 2,256 publications have been published within this topic, receiving 38,142 citations.


Papers
Journal Article (DOI)
TL;DR: This work develops a sound Multiplicative Factorization (MF) of multi-valued NIN-AND tree models for inference in Bayesian Networks (BNs) and experimentally demonstrates significant gains in inference efficiency in both space and time.

8 citations

Journal Article (DOI)
01 Jan 2021
TL;DR: A historical student academic dataset is used to facilitate the task of building an academic prediction model; the main aim is to build prediction models from different families of machine learning techniques on the selected dataset.
Abstract: Data mining is a field in which hidden information is extracted from large databases by applying algorithms that fall into categories such as classification, clustering, and association rule mining, according to the kind of information to be extracted. Data mining is applied across areas such as telecommunications, marketing, operations, hospitals, the hotel industry, and education. Predicting students' academic performance and progress has attracted the attention of young researchers. To facilitate the task of building an academic prediction model, a historical student academic dataset is used. The contributions of this paper are presented in two folds. In the first fold, the main aim is to build prediction models using different families of machine learning techniques on the selected dataset. In the second fold, implementations of different ensemble meta-based models are presented by combining them with different machine learning classification algorithms. The ensemble meta-based models considered are Bagging, AdaBoostM1, and RandomSubSpace. The results demonstrate that the ensemble meta-based technique AdaBoostM1, combined with the MultilayerPerceptron machine learning technique, achieves the highest accuracy, reaching up to 80.33%.
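As a rough illustration of the ensemble meta-learning workflow described above, the following sketch trains AdaBoost and Bagging meta-ensembles with scikit-learn. The paper works in Weka and pairs AdaBoostM1 with a MultilayerPerceptron base learner; this sketch instead relies on the library's default decision-tree base learners, and the dataset file name and its "pass" label column are hypothetical placeholders.

```python
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical dataset: "student_records.csv" with numeric feature columns
# and a binary "pass" label; file name and column are placeholders.
df = pd.read_csv("student_records.csv")
X, y = df.drop(columns=["pass"]), df["pass"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# AdaBoost (the scikit-learn analogue of Weka's AdaBoostM1); its default
# base learner is a depth-1 decision tree, whereas the paper pairs
# boosting with a multilayer perceptron.
boosted = AdaBoostClassifier(n_estimators=100, random_state=42)
boosted.fit(X_train, y_train)

# Bagging, another meta-ensemble evaluated in the paper; its default base
# learner is an unpruned decision tree.
bagged = BaggingClassifier(n_estimators=50, random_state=42)
bagged.fit(X_train, y_train)

for name, model in [("AdaBoost", boosted), ("Bagging", bagged)]:
    print(name, "accuracy:", accuracy_score(y_test, model.predict(X_test)))
```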

8 citations

Proceedings Article (DOI)
01 Jan 2020
TL;DR: In this paper, a measure-aware feature reuse mechanism is proposed to reuse good representations from the previous layer, guided by confidence, and a measure-aware layer growth mechanism is designed to gradually increase model complexity according to the performance measure.
Abstract: In multi-label learning, each instance is associated with multiple labels, and the crucial task is how to leverage label correlations when building models. Deep neural network methods usually embed the feature and label information jointly into a latent space to exploit label correlations. However, the success of these methods depends heavily on the precise choice of model depth. Deep forest is a recent deep learning framework based on tree-model ensembles that does not rely on backpropagation. We consider deep forest models to be very well suited to multi-label problems. We therefore design the Multi-Label Deep Forest (MLDF) method with two mechanisms: measure-aware feature reuse and measure-aware layer growth. The measure-aware feature reuse mechanism reuses the good representation from the previous layer, guided by confidence. The measure-aware layer growth mechanism ensures that MLDF gradually increases model complexity according to the performance measure. MLDF handles two challenging problems at the same time: one is restricting model complexity to ease the overfitting issue; the other is optimizing the performance measure of the user's choice, since multi-label evaluation involves many different measures. Experiments show that our proposal not only beats the compared methods on six measures over benchmark datasets but also enjoys label correlation discovery and other desired properties of multi-label learning.
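The sketch below is a much-simplified stand-in for MLDF, not the cascade itself: it shows only a single tree-ensemble layer handling a multi-label target on synthetic scikit-learn data, and the measure-aware feature reuse and layer growth mechanisms are omitted.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import hamming_loss, f1_score

# Synthetic multi-label data: each instance carries several of 5 labels.
X, Y = make_multilabel_classification(n_samples=1000, n_features=20,
                                      n_classes=5, n_labels=2,
                                      random_state=0)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.25,
                                                    random_state=0)

# A random forest accepts a label-indicator matrix directly, so a single
# tree ensemble covers all labels; MLDF would stack such layers and grow
# them guided by a chosen performance measure.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, Y_train)
pred = forest.predict(X_test)

# Two of the many multi-label measures the paper optimizes against.
print("Hamming loss:", hamming_loss(Y_test, pred))
print("Micro-F1:", f1_score(Y_test, pred, average="micro"))
```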

8 citations

Journal Article (DOI)
TL;DR: In this article, a new decision tree (DT) based approach for fast voltage contingency screening and ranking for on-line applications in energy management systems is presented; the hybrid model is developed to learn all the selected contingencies simultaneously, so fewer DTs are required.
Abstract: This paper presents a new decision tree (DT) based approach for fast voltage contingency screening and ranking for on-line applications in energy management systems. The hybrid decision tree model is developed to learn all the selected contingencies simultaneously, so fewer DTs are required. To reduce the size and improve the accuracy of the decision tree, the K-class problem is converted into a set of K two-class problems, and separate decision tree modules are trained for each two-class problem. All the selected contingencies are presented to a filter module, which is trained to separate them into critical and non-critical contingency classes, reducing the burden on the ranking modular DT. The critical contingencies screened out by the filter module are passed to the ranking modular decision tree for further ranking. To measure the severity of contingencies, a scalar performance index based on bus voltage violations is used. A full AC load flow is performed to generate the training and testing patterns for the proposed hybrid decision tree under each contingency. The effectiveness of the proposed approach is tested on IEEE test systems. Once trained, the hybrid decision tree method gives fast and accurate screening and ranking of contingencies for unknown load patterns.
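The core idea of decomposing one K-class ranking problem into K two-class problems, each handled by its own decision tree, can be sketched with scikit-learn's one-vs-rest wrapper. The synthetic features below merely stand in for the load-flow quantities used in the paper, and the class count is an arbitrary assumption.

```python
from sklearn.datasets import make_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for load-flow-derived features; 4 severity classes.
X, y = make_classification(n_samples=2000, n_features=15, n_informative=8,
                           n_classes=4, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=1)

# One binary decision tree per class, mirroring the paper's conversion of
# the K-class problem into K two-class problems.
ovr = OneVsRestClassifier(DecisionTreeClassifier(max_depth=6, random_state=1))
ovr.fit(X_train, y_train)
print(classification_report(y_test, ovr.predict(X_test)))
```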

8 citations

Journal Article (DOI)
01 May 1996 - Networks
TL;DR: In this article, the authors consider the problem of finding an optimal spanning tree with respect to objective functions that depend on the set of leaves of the tree; they address 18 such problems and determine their computational complexity.
Abstract: We consider the problem of finding an optimal spanning tree with respect to objective functions which depend on the set of leaves of the tree. We address 18 different such problems and determine their computational complexity. Only a few of the problems examined have been given attention in the existing literature.
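To make the object of study concrete, the following sketch uses networkx to build a spanning tree of a small weighted graph and extract its set of leaves (degree-1 vertices), the set on which the paper's objective functions depend. The graph and weights are arbitrary illustrative choices.

```python
import networkx as nx

# A small weighted graph; edge weights are arbitrary.
G = nx.Graph()
G.add_weighted_edges_from([
    ("a", "b", 1), ("b", "c", 2), ("c", "d", 1),
    ("a", "d", 3), ("b", "d", 2), ("c", "e", 1),
])

T = nx.minimum_spanning_tree(G)            # one particular spanning tree of G
leaves = {v for v in T if T.degree(v) == 1}  # degree-1 vertices of the tree

print("tree edges:", sorted(T.edges()))
print("leaves:", sorted(leaves))
# A leaf-dependent objective might, for example, minimize or maximize the
# number of leaves; the paper classifies the complexity of 18 such problems.
```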

8 citations


Network Information
Related Topics (5)
Cluster analysis
146.5K papers, 2.9M citations
80% related
Artificial neural network
207K papers, 4.5M citations
78% related
Fuzzy logic
151.2K papers, 2.3M citations
77% related
The Internet
213.2K papers, 3.8M citations
77% related
Deep learning
79.8K papers, 2.1M citations
77% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    10
2022    24
2021    101
2020    163
2019    158
2018    121