Random Forests
Citations
2,616 citations
Cites methods from "Random Forests"
...130. rforest R creates a random forest (Breiman, 2001) ensemble, using the R function randomForest in the randomForest package, with parameters ntree = 500 (number of trees in the forest) and mtry = √#inputs....
[...]
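The parameter choices quoted above (ntree = 500 trees, mtry = √#inputs candidate features per split) are the defaults of the R randomForest package for classification. A minimal stdlib-only Python sketch of the mtry mechanism (the function name is illustrative, not part of any package):

```python
import math
import random

def mtry_candidates(n_features, rng):
    """Pick the features to evaluate at one split of one tree.

    Breiman's default for classification is mtry = sqrt(#inputs):
    a fresh random subset of that size is drawn at every split.
    """
    m = max(1, math.isqrt(n_features))
    return rng.sample(range(n_features), m)

rng = random.Random(0)
candidates = mtry_candidates(100, rng)  # 10 of the 100 features
```

Because the subset is re-drawn at every split (not once per tree), even strong predictors are regularly left out of contention, which decorrelates the trees.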
2,610 citations
Cites methods from "Random Forests"
...Using simulated and nine microarray data sets we show that random forest has comparable performance to other classification methods, including DLDA, KNN, and SVM, and that the new gene selection procedure yields very small sets of genes (often smaller than alternative methods) while preserving…
[...]
2,466 citations
Cites background from "Random Forests"
...Results: We identify two mechanisms responsible for this finding: (i) a preference for the selection of correlated predictors in the tree building process and (ii) an additional advantage for correlated predictor variables induced by the unconditional permutation scheme that is employed in the…
[...]
References
[...]
7,601 citations
"Random Forests" refers background or methods in this paper
...But none of these three forests do as well as Adaboost (Freund & Schapire, 1996) or other algorithms that work by adaptive reweighting (arcing) of the training set (see Breiman, 1998b; Dietterich, 1998; Bauer & Kohavi, 1999)....
[...]
...In its original version, Adaboost (Freund & Schapire, 1996) is a deterministic algorithm that selects the weights on the training set for input to the next classifier based on the misclassifications in the previous classifiers....
[...]
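The adaptive reweighting step described in that snippet can be sketched in a few lines. This is a minimal illustration of the Freund & Schapire weight update, not their exact pseudocode; the function name is ours:

```python
import math

def adaboost_reweight(weights, mistakes):
    # One boosting round: compute the weighted error of the current
    # classifier, then up-weight the misclassified points and
    # down-weight the rest, so the next classifier focuses on the
    # hard cases.  Assumes 0 < err < 0.5.
    err = sum(w for w, wrong in zip(weights, mistakes) if wrong)
    alpha = 0.5 * math.log((1.0 - err) / err)
    new = [w * math.exp(alpha if wrong else -alpha)
           for w, wrong in zip(weights, mistakes)]
    z = sum(new)  # renormalize to a probability distribution
    return [w / z for w in new]

w = adaboost_reweight([0.25] * 4, [True, False, False, False])
```

A standard property of this update: after renormalization, the points the current classifier got wrong carry exactly half of the total weight.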
5,984 citations
"Random Forests" refers background in this paper
...Ho (1998) has written a number of papers on “the random subspace” method, which does a random selection of a subset of features to use to grow each tree....
[...]
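Ho's scheme differs from a random forest in that the feature subset is fixed once per tree rather than re-drawn at every split. A stdlib-only sketch of that per-tree selection (names are illustrative):

```python
import random

def random_subspaces(n_features, subspace_size, n_trees, seed=0):
    # Ho's random subspace method: each tree in the ensemble is grown
    # on its own fixed, randomly chosen subset of the features
    # (contrast with random forests, which draw a fresh mtry-sized
    # subset at every split of every tree).
    rng = random.Random(seed)
    return [sorted(rng.sample(range(n_features), subspace_size))
            for _ in range(n_trees)]

subspaces = random_subspaces(n_features=20, subspace_size=5, n_trees=10)
```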
...Keywords: classification, regression, ensemble...
[...]
2,686 citations
"Random Forests" refers background or methods in this paper
...But none of these three forests do as well as Adaboost (Freund & Schapire, 1996) or other algorithms that work by adaptive reweighting (arcing) of the training set (see Breiman, 1998b; Dietterich, 1998; Bauer & Kohavi, 1999)....
[...]
...The second is that bagging can be used to give ongoing estimates of the generalization error (PE∗) of the combined ensemble of trees, as well as estimates for the strength and correlation....
[...]
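The out-of-bag mechanism behind those ongoing error estimates can be sketched with any base learner; here a 1-nearest-neighbour stand-in is used for brevity (the learner and all names are illustrative assumptions, not Breiman's code):

```python
import random
from collections import Counter

def fit_1nn(X, y):
    # Stand-in base learner: predict the label of the nearest training point.
    def predict(x):
        j = min(range(len(X)),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(X[k], x)))
        return y[j]
    return predict

def oob_error(X, y, fit=fit_1nn, n_trees=30, seed=0):
    # For each bootstrap sample, the points never drawn are "out-of-bag".
    # Each point is classified only by the models for which it was
    # out-of-bag, and the majority vote is compared with its true label;
    # this yields a running estimate of the ensemble's generalization
    # error without a held-out test set.
    rng = random.Random(seed)
    n = len(X)
    votes = [Counter() for _ in range(n)]
    for _ in range(n_trees):
        in_bag = [rng.randrange(n) for _ in range(n)]
        model = fit([X[i] for i in in_bag], [y[i] for i in in_bag])
        for i in set(range(n)) - set(in_bag):
            votes[i][model(X[i])] += 1
    scored = [i for i in range(n) if votes[i]]
    wrong = sum(votes[i].most_common(1)[0][0] != y[i] for i in scored)
    return wrong / len(scored)
```

On a cleanly separable toy problem the estimate is near zero; on real data it tracks test-set error, which is why bagging gives the "ongoing estimates" the snippet describes.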