Topic

Decision tree model

About: Decision tree model is a research topic. Over its lifetime, 2,256 publications have been published within this topic, receiving 38,142 citations.


Papers
Journal Article
TL;DR: This paper gives a comparative study of attribute selection measures for top-down induction of decision trees and presents splitting criteria such as Information Gain, Gain Ratio, Gini Index, Jaccard Coefficient, and Least Probable Intersections.
Abstract: Decision tree techniques are used to build classification models in data mining. A decision tree is a sequential, hierarchical tree structure composed of decision nodes corresponding to attributes, and a decision tree model is based on an attribute selection measure. The paper presents splitting criteria such as Information Gain, Gain Ratio, Gini Index, Jaccard Coefficient, and Least Probable Intersections. In decision tree construction, the splitting criterion is a heuristic for selecting the best attribute with which to partition a node's dataset. The attribute with the best score is chosen as the splitting attribute for a node; the score is based on either impurity reduction or purity gain. This paper gives a comparative study of such attribute selection measures for top-down induction of decision trees.

5 citations
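
The criteria compared in the paper above are standard impurity-based scores. As a minimal sketch (not taken from the paper), the following Python snippet shows how information gain and Gini-based purity gain can be computed for a candidate split; the example data is purely illustrative.

```python
# Minimal sketch of two common splitting criteria: information gain and Gini gain.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(parent_labels, child_label_groups):
    """Impurity reduction: parent entropy minus the weighted entropy of the children."""
    n = len(parent_labels)
    weighted = sum(len(g) / n * entropy(g) for g in child_label_groups)
    return entropy(parent_labels) - weighted

def gini_gain(parent_labels, child_label_groups):
    """Purity gain under the Gini index, as used by CART-style trees."""
    n = len(parent_labels)
    weighted = sum(len(g) / n * gini(g) for g in child_label_groups)
    return gini(parent_labels) - weighted

# Example: splitting ['yes','yes','no','no'] into two pure children maximizes both scores.
parent = ['yes', 'yes', 'no', 'no']
children = [['yes', 'yes'], ['no', 'no']]
print(information_gain(parent, children))  # 1.0
print(gini_gain(parent, children))         # 0.5
```

The attribute whose split yields the highest such score would be chosen as the splitting attribute for the node.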

Book ChapterDOI
20 Apr 2011
TL;DR: This paper proposes a Complex Tree model designed from the ground up for integration tasks and capable of representing most tree structures; the model is defined on both the schema and instance levels so that it works better in practical situations.
Abstract: The common approach to integrating XML documents is based on existing formal structures that were not originally designed for integration tasks. In this paper we propose a Complex Tree model designed from the ground up for integration tasks and capable of representing most tree structures. The Complex Tree model is defined on both the schema and instance levels so that it works better in practical situations, and the integration task for Complex Trees is likewise defined on both levels. A set of explicitly stated integration criteria is given to guide the design of future integration algorithms with respect to the desired aim of the integration process. Finally, a simple integration algorithm based on selected criteria is presented.

5 citations
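
As a rough illustration of the schema/instance separation described above (the paper's formal Complex Tree definition is richer than this), a toy two-level tree could be represented as follows; all class and element names here are assumptions, not the paper's notation.

```python
# Toy illustration of keeping a tree's schema level separate from its instance level.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SchemaNode:
    """Schema level: node labels/types and the structure they allow."""
    label: str
    children: List["SchemaNode"] = field(default_factory=list)

@dataclass
class InstanceNode:
    """Instance level: concrete values attached to schema nodes."""
    schema: SchemaNode
    value: str = ""
    children: List["InstanceNode"] = field(default_factory=list)

# Schema: a <person> element with <name> and <email> children.
person_schema = SchemaNode("person", [SchemaNode("name"), SchemaNode("email")])

# Instance: one document conforming to that schema.
person = InstanceNode(person_schema, children=[
    InstanceNode(person_schema.children[0], value="Ada Lovelace"),
    InstanceNode(person_schema.children[1], value="ada@example.org"),
])
```

Integration would then be defined once over schemas and once over instances, matching the paper's two-level setting.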

01 Jan 2012
TL;DR: In this paper, the decision tree technique is applied to a small set of network data to build a decision tree model, and the model's logic is incorporated into Snort signatures or firewall rules.
Abstract: The security of computers and the networks that connect them is of increasing significance. Intrusion detection is a mechanism for providing security to computer networks. Although some mechanisms for intrusion detection already exist, their performance needs to be improved. Data mining techniques, such as decision tree analysis, offer a semi-automated approach to detecting threats. In this paper, the decision tree technique is applied to a small set of network data to build a decision tree model, and the model's logic is incorporated into Snort signatures or firewall rules.

5 citations
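
A hedged sketch of the workflow described above, using scikit-learn's DecisionTreeClassifier: fit a tree on a small set of labeled connection records, then read off its rules so an analyst can translate them into Snort signatures or firewall rules. The feature names and toy records are illustrative assumptions, not the paper's data.

```python
# Fit a small decision tree on labeled connection records and export its rules.
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy connection records: [dst_port, bytes_sent, failed_logins]
X = [
    [22,    200, 5],   # brute-force-looking SSH traffic
    [22,    180, 6],
    [80,   5000, 0],   # ordinary web traffic
    [443,  7000, 0],
    [3389,  150, 4],   # suspicious RDP probing
    [80,   4500, 0],
]
y = ["attack", "attack", "normal", "normal", "attack", "normal"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The exported rules (e.g. "failed_logins > 2 -> attack") are the model logic
# one would encode by hand as a Snort signature or firewall rule.
print(export_text(clf, feature_names=["dst_port", "bytes_sent", "failed_logins"]))
```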

Patent
27 Aug 2009
TL;DR: In this paper, the authors present a method of generating a representation of a storage network based on a tree model, which includes obtaining a request from a client system to view the contents of a node in the tree model.
Abstract: An exemplary embodiment of the present invention provides a method of generating a representation of a storage network. The method includes obtaining a request from a client system to view the contents of a node in a tree model. The method also includes receiving tree information corresponding to the node and adding that tree information to the tree model.

5 citations
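
The following sketch mirrors the claimed flow (obtain a request to view a node, receive tree information for it, add that information to the tree model); every class, function, and path name is a hypothetical stand-in, not from the patent.

```python
# Lazy expansion of a storage-network tree model in response to client requests.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TreeNode:
    name: str
    children: List["TreeNode"] = field(default_factory=list)
    loaded: bool = False

def fetch_tree_info(path: str) -> List[str]:
    """Stand-in for querying the storage system; returns child names for a node."""
    fake_storage: Dict[str, List[str]] = {
        "/san": ["array-1", "array-2"],
        "/san/array-1": ["lun-0", "lun-1"],
    }
    return fake_storage.get(path, [])

def expand(node: TreeNode, path: str) -> None:
    """Handle a client's request to view a node: fetch its contents and add them to the model."""
    if not node.loaded:
        node.children = [TreeNode(name) for name in fetch_tree_info(path)]
        node.loaded = True

root = TreeNode("san")
expand(root, "/san")                      # client asks to view the root node
expand(root.children[0], "/san/array-1")  # then drills into array-1
```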

Patent
Benjamin Hoan Le, Saurabh Kataria, Nadia Fawaz, Aman Grover, Guoyin Wang
14 Feb 2019
TL;DR: In this article, the features in a boosting decision tree model are initialized to zero and remain zero while the deep neural network collaborative filtering model is trained; the trees are then boosted, the prediction layer is trained, and the process repeats until a set of convergence criteria is met.
Abstract: In an example, the features in a boosting decision tree model are initialized to zero; the boosting decision tree model is located in a GLMM and connected to a deep neural network collaborative filtering model via a prediction layer. While the features in the boosting decision tree model remain zero, the deep neural network collaborative filtering model is trained. One or more trees in the boosting decision tree model are then boosted using the logits produced by this training as a margin. The prediction layer is trained using features from the deep neural network collaborative filtering model and features from the boosting decision tree model. It is then determined whether a set of convergence criteria is met; if not, the deep neural network collaborative filtering model is retrained using the features and the process is repeated until the criteria are met.

5 citations
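
A structural sketch of the claimed training loop follows; every class, update rule, and the convergence test below is a hypothetical stand-in, not the patent's actual models or API.

```python
# Structural sketch: alternate DNN training, tree boosting, and prediction-layer
# training until convergence, with tree features starting at zero.
import random

class DeepCFModel:
    """Stand-in for the deep neural network collaborative filtering model."""
    def train(self, extra_features=None):
        self.logits = [random.gauss(0, 1) for _ in range(10)]  # placeholder logits

class BoostedTreeModel:
    """Stand-in for the boosting decision tree model inside the GLMM."""
    def __init__(self):
        self.features = [0.0] * 10  # initialized to zero, as in the claim

    def boost(self, margin):
        # Boost trees using the DNN's logits as a margin (placeholder update).
        self.features = [m + random.gauss(0, 0.1) for m in margin]

class PredictionLayer:
    """Stand-in for the layer combining DNN features and tree features."""
    def train(self, dnn_logits, tree_features):
        self.loss = sum(abs(a - b) for a, b in zip(dnn_logits, tree_features))

dnn, trees, head = DeepCFModel(), BoostedTreeModel(), PredictionLayer()

max_rounds, tol = 5, 0.5
for _ in range(max_rounds):
    dnn.train(extra_features=trees.features)   # tree features are zero on the first pass
    trees.boost(margin=dnn.logits)             # boost trees with the DNN logits as margin
    head.train(dnn.logits, trees.features)     # train the prediction layer on both
    if head.loss < tol:                        # placeholder convergence criterion
        break
```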


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations (80% related)
Artificial neural network: 207K papers, 4.5M citations (78% related)
Fuzzy logic: 151.2K papers, 2.3M citations (77% related)
The Internet: 213.2K papers, 3.8M citations (77% related)
Deep learning: 79.8K papers, 2.1M citations (77% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    10
2022    24
2021    101
2020    163
2019    158
2018    121