Open Access
Classification and Regression by randomForest
Andy Liaw, Matthew C. Wiener
TL;DR: Random forests are proposed, which add an additional layer of randomness to bagging and are robust against overfitting; the randomForest package provides an R interface to the Fortran programs by Breiman and Cutler.
Abstract:
Recently there has been a lot of interest in “ensemble learning”, methods that generate many classifiers and aggregate their results. Two well-known methods are boosting (see, e.g., Schapire et al., 1998) and bagging (Breiman, 1996) of classification trees. In boosting, successive trees give extra weight to points incorrectly predicted by earlier predictors, and a weighted vote is taken for the final prediction. In bagging, successive trees do not depend on earlier trees; each is independently constructed from a bootstrap sample of the data set, and a simple majority vote is taken for the final prediction. Breiman (2001) proposed random forests, which add a further layer of randomness to bagging. In addition to constructing each tree from a different bootstrap sample of the data, random forests change how the classification or regression trees are constructed: in standard trees, each node is split using the best split among all variables, whereas in a random forest each node is split using the best among a subset of predictors randomly chosen at that node. This somewhat counterintuitive strategy turns out to perform very well compared with many other classifiers, including discriminant analysis, support vector machines, and neural networks, and it is robust against overfitting (Breiman, 2001). It is also user-friendly in the sense that it has only two parameters (the number of variables in the random subset at each node and the number of trees in the forest) and is usually not very sensitive to their values. The randomForest package provides an R interface to the Fortran programs by Breiman and Cutler (available at http://www.stat.berkeley.edu/users/breiman/). This article provides a brief introduction to the usage and features of the R functions.
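The core idea described above (bootstrap samples, a random subset of predictors at each split, and a majority vote over the trees) can be illustrated without the package. Below is a minimal pure-Python sketch: the stump learner, function names, and toy data are illustrative assumptions, not the package's API, and the random subset is drawn once per stump rather than at every node of a full tree.

```python
import random
from collections import Counter

def best_stump(X, y, feat_subset):
    """Best single-feature threshold split (by misclassification count)
    among the features in feat_subset; a depth-1 stand-in for a full tree."""
    maj = Counter(y).most_common(1)[0][0]
    best = (len(y), feat_subset[0], float("inf"), maj, maj)  # degenerate fallback
    for j in feat_subset:
        for t in sorted({row[j] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[j] <= t]
            right = [yi for row, yi in zip(X, y) if row[j] > t]
            if not left or not right:
                continue
            ll = Counter(left).most_common(1)[0][0]
            rl = Counter(right).most_common(1)[0][0]
            errs = sum(v != ll for v in left) + sum(v != rl for v in right)
            if errs < best[0]:
                best = (errs, j, t, ll, rl)
    return best

def fit_forest(X, y, ntree=25, mtry=1, seed=0):
    """Bagging plus a random feature subset for each stump."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    forest = []
    for _ in range(ntree):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample of the rows
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        feats = rng.sample(range(p), mtry)           # random subset of predictors
        forest.append(best_stump(Xb, yb, feats))
    return forest

def predict(forest, row):
    """Simple majority vote over the trees."""
    votes = [ll if row[j] <= t else rl for _, j, t, ll, rl in forest]
    return Counter(votes).most_common(1)[0][0]

# Toy data: two well-separated classes, both features informative.
X = [[0, 1], [1, 0], [2, 2], [7, 8], [8, 9], [9, 7]]
y = [0, 0, 0, 1, 1, 1]
forest = fit_forest(X, y)
print(predict(forest, [0, 0]), predict(forest, [9, 9]))
```

Here `ntree` and `mtry` mirror the two tuning parameters the abstract mentions: the number of trees and the size of the random predictor subset.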
Citations
Journal Article
Machine Learning Prediction of Cancer Cell Sensitivity to Drugs Based on Genomic and Chemical Properties
Michael P. Menden, Francesco Iorio, Mathew J. Garnett, Ultan McDermott, Cyril H. Benes, Pedro J. Ballester, Julio Saez-Rodriguez, et al.
TL;DR: Machine learning models are developed to predict the response of cancer cell lines to drug treatment, quantified through IC50 values, based on both the genomic features of the cell lines and the chemical properties of the considered drugs, providing a computational framework to identify new drug repositioning opportunities.
Journal Article
Global patterns of terrestrial nitrogen and phosphorus limitation
Enzai Du, César Terrer, Adam F. A. Pellegrini, Anders Ahlström, Caspar J. van Lissa, Xia Zhao, Nan Xia, Xinhui Wu, Robert B. Jackson, et al.
TL;DR: In this article, the authors examined global N and P limitation using the ratio of site-averaged leaf resorption efficiencies of the dominant species across 171 sites and evaluated their predictions using a global database of N- and P-limitation experiments based on nutrient additions at 106 and 53 sites, respectively.
Journal Article
A Deep CNN-LSTM Model for Particulate Matter (PM2.5) Forecasting in Smart Cities
Chiou-Jye Huang, Ping-Huan Kuo
TL;DR: A deep neural network model integrating the CNN and LSTM architectures is developed; using historical data such as cumulative hours of rain, cumulative wind speed, and PM2.5 concentration, the proposed CNN-LSTM model (APNet) is shown to achieve the highest forecasting accuracy among the compared methods.
Journal Article
Cancer of the esophagus and esophagogastric junction: data-driven staging for the seventh edition of the American Joint Committee on Cancer/International Union Against Cancer Cancer Staging Manuals.
TL;DR: AJCC/UICC stage groupings for esophageal cancer had previously not been data-driven or harmonized with stomach cancer; the authors develop a data-driven, harmonized esophageal staging system for the seventh edition of the AJCC and UICC cancer staging manuals.
Journal Article
Global diversity and biogeography of bacterial communities in wastewater treatment plants
Linwei Wu, Daliang Ning, Bing Zhang, Yong Li, Ping Zhang, Xiaoyu Shan, Qiuting Zhang, Mathew R. Brown, Zhenxin Li, Joy D. Van Nostrand, Fangqiong Ling, Naijia Xiao, Ya Zhang, Julia Vierheilig, George Wells, Yunfeng Yang, Ye Deng, Qichao Tu, Aijie Wang, Tong Zhang, Zhili He, Jurg Keller, Per Halkjær Nielsen, Pedro J. J. Alvarez, Craig S. Criddle, Michael Wagner, James M. Tiedje, Qiang He, Thomas P. Curtis, David A. Stahl, Lisa Alvarez-Cohen, Bruce E. Rittmann, Xianghua Wen, Jizhong Zhou, et al.
TL;DR: Global sampling of microbial communities associated with wastewater treatment plants and application of ecological theory revealed a small, core bacterial community associated with performance and provides insights into the community dynamics in this environment.
References
Modern Applied Statistics With S
Proceedings Article
Boosting the margin: A new explanation for the effectiveness of voting methods
TL;DR: In this paper, the authors show that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero.
Journal Article
Estimating Generalization Error on Two-Class Datasets Using Out-of-Bag Estimates
TL;DR: For two-class datasets, a method is provided for estimating the generalization error of a bagged classifier using out-of-bag estimates; most of the bias is eliminated and accuracy is increased by incorporating a correction based on the distribution of the out-of-bag votes.
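The out-of-bag mechanics this reference builds on are simple to sketch: each observation is predicted only by the trees whose bootstrap sample did not contain it, and those votes yield an error estimate without a separate test set. A minimal pure-Python illustration follows, with a 1-nearest-neighbour base learner standing in for a tree (an assumption made for brevity, not the paper's method):

```python
import random

def one_nn(Xb, yb, row):
    """Label of the nearest in-bag point (squared Euclidean distance)."""
    return min(
        (sum((a - b) ** 2 for a, b in zip(xi, row)), yi)
        for xi, yi in zip(Xb, yb)
    )[1]

def oob_error(X, y, ntree=50, seed=1):
    """Bag 1-NN learners; score each observation only with the learners
    whose bootstrap sample left it out (its out-of-bag learners)."""
    rng = random.Random(seed)
    n = len(X)
    votes = [[] for _ in range(n)]
    for _ in range(ntree):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        inbag = set(idx)
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        for i in range(n):
            if i not in inbag:                       # i is out-of-bag for this learner
                votes[i].append(one_nn(Xb, yb, X[i]))
    wrong = got_votes = 0
    for i in range(n):
        if votes[i]:
            got_votes += 1
            pred = max(set(votes[i]), key=votes[i].count)
            wrong += pred != y[i]
    return wrong / got_votes

# Two well-separated classes: the out-of-bag vote for each point should be correct.
X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]
print(oob_error(X, y))
```

Each observation is out-of-bag for roughly a third of the learners (the bootstrap omits any given point with probability about 1/e), so every observation accumulates votes for free as the ensemble is built.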