scispace - formally typeset
Topic

Decision tree model

About: Decision tree model is a research topic. Over the lifetime, 2256 publications have been published within this topic receiving 38142 citations.


Papers
Proceedings ArticleDOI
05 Dec 2005
TL;DR: Two QRD-based tree search algorithms, likely candidates for implementation due to their relatively low computational complexity, are compared; while both achieve near-ML performance, the QRD-stack algorithm is shown to be more efficient, with much lower computational complexity than QRD-M.
Abstract: Due to the capacity they can achieve, multiple-input/multiple-output (MIMO) systems have gained popularity in recent years. While several data detection algorithms are available for MIMO systems, simple algorithms usually perform unsatisfactorily, while more complex ones are infeasible for hardware implementation. In this paper, we compare two QRD-based tree search algorithms which are likely candidates for implementation purposes due to their relatively low computational complexity. The QRD-M algorithm proposed in (J. Yue, et al., 2003) yields near-ML performance while only requiring a fraction of the computational complexity of an ML receiver. The QRD-stack algorithm displays similar performance. Comparing the two, we find that while both algorithms achieve near-ML performance, the QRD-stack algorithm is more efficient, with much lower computational complexity than QRD-M.
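The QRD-M idea described above can be sketched as a breadth-first tree search over the upper-triangular system obtained from a QR decomposition of the channel. The following is a minimal illustration only (real-valued channel, BPSK symbols, and all function names are our own assumptions), not the implementation evaluated in the paper:

```python
import numpy as np

def qrd_m_detect(H, y, symbols=(-1.0, 1.0), M=4):
    """Sketch of QRD-M detection: QR-decompose the channel, then search
    the symbol tree layer by layer, keeping only the M best partial
    candidates (ranked by accumulated Euclidean metric) at each layer."""
    Q, R = np.linalg.qr(H)
    z = Q.T @ y                      # rotated receive vector: z = R x + noise
    n = H.shape[1]
    # Each survivor: (symbols for layers i..n-1, accumulated metric)
    survivors = [([], 0.0)]
    for i in range(n - 1, -1, -1):   # detect from the last layer upward
        expanded = []
        for path, metric in survivors:
            for s in symbols:
                cand = [s] + path    # cand[k] is the symbol for layer i+k
                interference = sum(R[i, i + k] * cand[k]
                                   for k in range(len(cand)))
                expanded.append((cand, metric + (z[i] - interference) ** 2))
        expanded.sort(key=lambda t: t[1])
        survivors = expanded[:M]     # prune: keep the M best partial paths
    best_path, _ = survivors[0]
    return np.array(best_path)
```

With M equal to the full number of leaf candidates the search is exhaustive (exact ML); shrinking M trades detection accuracy for complexity, which is the tradeoff both QRD-M and QRD-stack exploit.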

62 citations

Journal ArticleDOI
TL;DR: For a large class of read-once formulae, this trivial speed-up is the best that a Monte Carlo algorithm can achieve; the result follows from a general lower bound derived on the Monte Carlo complexity of these formulae.
Abstract: In the boolean decision tree model there is at least a linear gap between the Monte Carlo and the Las Vegas complexity of a function depending on the error probability. We prove for a large class of read‐once formulae that this trivial speed‐up is the best that a Monte Carlo algorithm can achieve. For every formula F belonging to that class we show that the Monte Carlo complexity of F with two‐sided error p is (1 − 2p)R(F), and with one‐sided error p is (1 − p)R(F), where R(F) denotes the Las Vegas complexity of F. The result follows from a general lower bound that we derive on the Monte Carlo complexity of these formulae. This bound is analogous to the lower bound due to Saks and Wigderson on their Las Vegas complexity. © 1995 Wiley Periodicals, Inc.

62 citations

Journal ArticleDOI
TL;DR: It is proved that the probabilistic communication complexity of the identity function on n-bit inputs in a 3-computer model is O(√n).
Abstract: It is proved that the probabilistic communication complexity of the identity function in a 3-computer model is O(√n).

62 citations

Journal ArticleDOI
TL;DR: In this paper, the mode choice behaviour of commuters in Delhi is modelled using the Random Forest (RF) Decision Tree (DT) method, one of the most efficient DT methods for solving classification problems.
Abstract: Mode choice analysis forms an integral part of the transportation planning process, as it gives complete insight into the mode choice preferences of commuters and is also used as an instrument for evaluating the introduction of new transport systems. Mode choice analysis studies the factors in the commuter's decision-making process when choosing the mode that renders the highest utility to them. This study aims at modelling the mode choice behaviour of commuters in Delhi using the Random Forest (RF) Decision Tree (DT) method. The random forest model is one of the most efficient DT methods for solving classification problems. For model development, about 5000 stratified household samples were collected in Delhi through a household interview survey. A comparative evaluation has been carried out between the traditional Multinomial Logit (MNL) model and the Decision Tree model to demonstrate the suitability of RF models in mode choice modelling. The results show that the Random Forest based DT model is superior, with higher prediction accuracy (98.96%) than the Logit model (77.31%).
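As a concrete illustration of the modelling idea only (not the authors' model, features, or data), a random forest for a binary mode choice can be sketched as bootstrap-aggregated decision trees; the sketch below uses depth-1 trees (stumps) and synthetic features, all of which are our own simplifying assumptions:

```python
import numpy as np

def fit_stump(X, y):
    """Best single-feature threshold split by misclassification error."""
    best = (0, 0.0, 0, 1, 1.0)           # (feature, threshold, left label, right label, error)
    n = len(y)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            for ll in (0, 1):
                rl = 1 - ll
                err = (np.sum(y[left] != ll) + np.sum(y[~left] != rl)) / n
                if err < best[4]:
                    best = (j, t, ll, rl, err)
    return best

def predict_stump(stump, X):
    j, t, ll, rl, _ = stump
    return np.where(X[:, j] <= t, ll, rl)

def fit_forest(X, y, n_trees=25, seed=0):
    """Tiny random forest: each tree is trained on a bootstrap resample."""
    rng = np.random.default_rng(seed)
    return [fit_stump(X[idx], y[idx])
            for idx in (rng.integers(0, len(y), len(y)) for _ in range(n_trees))]

def predict_forest(forest, X):
    """Majority vote over the trees' predictions."""
    votes = np.mean([predict_stump(s, X) for s in forest], axis=0)
    return (votes >= 0.5).astype(int)
```

A full RF implementation grows deeper, feature-subsampled trees, but the bagging-plus-voting structure that makes the method robust for classification is the same.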

61 citations

Proceedings ArticleDOI
16 Jul 2006
TL;DR: This is the first algorithm that can learn arbitrary monotone Boolean functions to high accuracy, using random examples only, in time polynomial in a reasonable measure of the complexity of f.
Abstract: We give an algorithm that learns any monotone Boolean function f: {-1, 1}^n → {-1, 1} to any constant accuracy, under the uniform distribution, in time polynomial in n and in the decision tree size of f. This is the first algorithm that can learn arbitrary monotone Boolean functions to high accuracy, using random examples only, in time polynomial in a reasonable measure of the complexity of f. A key ingredient of the result is a new bound showing that the average sensitivity of any monotone function computed by a decision tree of size s must be at most √(log s). This bound has already proved to be of independent utility in the study of decision tree complexity (Schramm et al., 2005). We generalize the basic inequality and learning result described above in various ways; specifically, to partition size (a stronger complexity measure than decision tree size), p-biased measures over the Boolean cube (rather than just the uniform distribution), and real-valued (rather than just Boolean-valued) functions.
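The average-sensitivity bound in the abstract can be checked by brute force on small monotone functions. The sketch below (helper names are our own, and reading the bound with log base 2 is our assumption) verifies it for 3-bit majority, a monotone function with a natural 6-leaf decision tree:

```python
from itertools import product

def avg_sensitivity(f, n):
    """Brute-force average sensitivity: the expected number of coordinates
    whose flip changes f(x), over a uniformly random input x."""
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            x_flipped = x[:i] + (1 - x[i],) + x[i + 1:]
            if f(x_flipped) != fx:
                total += 1
    return total / 2 ** n

# 3-bit majority: monotone, computable by a decision tree with 6 leaves.
maj3 = lambda x: int(sum(x) >= 2)
```

Here avg_sensitivity(maj3, 3) is 1.5 (each of the 6 inputs of weight 1 or 2 has two pivotal coordinates), consistent with the bound √(log₂ 6) ≈ 1.61.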

60 citations


Network Information
Related Topics (5)
- Cluster analysis: 146.5K papers, 2.9M citations (80% related)
- Artificial neural network: 207K papers, 4.5M citations (78% related)
- Fuzzy logic: 151.2K papers, 2.3M citations (77% related)
- The Internet: 213.2K papers, 3.8M citations (77% related)
- Deep learning: 79.8K papers, 2.1M citations (77% related)
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2023  10
2022  24
2021  101
2020  163
2019  158
2018  121