RUSBoost: A Hybrid Approach to Alleviating Class Imbalance
Citations (2,228)
Excerpts citing background or methods from "RUSBoost: A Hybrid Approach to Alleviating Class Imbalance":
...Likewise, UnderBagging is computationally more demanding than RUSBoost: although it obtains trees of comparable size, it uses four times as many classifiers....
[...]
...The Boosting-based ensembles that are considered in our study are RUSBoost, SMOTEBoost and MSMOTEBoost....
[...]
...On the other hand, with regard to ensemble learning methods, a large number of different approaches have been proposed in the literature, including but not limited to SMOTEBoost [44], RUSBoost [45], IIVotes [46], EasyEnsemble [47], and SMOTEBagging [55]....
[...]
...Particularly noteworthy is the performance of RUSBoost, which is the computationally least complex among the best performers....
[...]
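Several of the excerpts above single out RUSBoost's low computational cost: random undersampling (RUS) shrinks each boosting round's training set instead of enlarging it as SMOTE-based hybrids do. The following is a minimal sketch of that idea, assuming binary labels in {-1, +1} and a homemade decision stump as the weak learner; both are illustrative choices for the sketch, not the paper's exact experimental setup.

```python
import numpy as np

def undersample_indices(y, rng):
    """Indices of a sample in which every class is cut down to the
    size of the smallest (minority) class -- the RUS step."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = [rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
            for c in classes]
    return np.concatenate(keep)

class DecisionStump:
    """Weighted one-feature threshold classifier (the weak learner)."""
    def fit(self, X, y, w):
        best = (np.inf, 0, 0.0, 1)
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for sign in (1, -1):
                    pred = np.where(X[:, f] < t, -sign, sign)
                    err = w[pred != y].sum()
                    if err < best[0]:
                        best = (err, f, t, sign)
        _, self.f_, self.t_, self.sign_ = best
        return self

    def predict(self, X):
        return np.where(X[:, self.f_] < self.t_, -self.sign_, self.sign_)

def rusboost(X, y, n_rounds=10, seed=0):
    """RUS inside an AdaBoost-style loop: undersample, fit the weak
    learner on the reduced set, then reweight using the full set."""
    rng = np.random.default_rng(seed)
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(n_rounds):
        keep = undersample_indices(y, rng)        # train on a small, balanced set
        ws = w[keep] / w[keep].sum()              # renormalized example weights
        stump = DecisionStump().fit(X[keep], y[keep], ws)
        pred = stump.predict(X)                   # but evaluate on ALL data
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * pred)         # AdaBoost-style reweighting
        w /= w.sum()
        ensemble.append((alpha, stump))

    def predict_fn(Xq):
        score = sum(a * s.predict(Xq) for a, s in ensemble)
        return np.sign(score).astype(int)
    return predict_fn
```

Because each weak learner is fit on only about twice the minority-class size, per-round training cost drops as imbalance grows, which is consistent with the low complexity the citing works note.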
...have arisen as a possible solution to the class imbalance problem attracting great interest among researchers [45], [47], [50], [62]....
[...]
...In recent years, ensembles of classifiers have arisen as a possible solution to the class imbalance problem [77,85,112,117,127,131]....
[...]
...AdaBoost.M1 (AdaB-M1) [110], AdaBoost with costs outside the exponent (AdaC2) [117], RUSBoost (RUSB) [112], SMOTEBagging (SBAG) [130], and EasyEnsemble (EASY) [85]....
[...]
...Ensemble methods [101,108] are also frequently adapted to imbalanced domains, either by modifying the ensemble learning algorithm at the data level, preprocessing the data before the learning stage of each classifier [17,30,112], or by embedding a cost-sensitive framework in the ensemble learning process [44,117,122]....
[...]
...Following this idea, many approaches have been developed by modifying the standard boosting weight-update mechanism in order to improve the performance on the minority class and the small disjuncts [30,44,61,69,74,112,117,122]....
[...]
...In this last section of the experimental analysis on the behavior of the methodologies for addressing classification with imbalanced datasets, we will perform a cross-family comparison for the approaches previously selected as the representatives for each case, namely preprocessing (SMOTE and SMOTE+ENN), cost-sensitive learning (CS-Weighted and MetaCost) and ensemble techniques (RUSB and SBAG)....
[...]
...Namely, it was combined with boosting [Seiffert et al. 2010] and bagging [Chang et al. 2003; Tao et al. 2006; Wang and Yao 2009; Wallace et al. 2011] and was applied to both classes in random forests in a method named Balanced Random Forest (BRF) [Chen et al. 2004]....
[...]
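The excerpt above groups RUS with bagging-style ensembles such as Balanced Random Forest, where each member trains on a class-balanced sample rather than a reweighted one. A rough sketch of that sampling scheme follows, assuming integer class labels and a deliberately simple nearest-centroid base learner as a stand-in for the decision trees these methods actually use; one common variant is shown, not any single cited algorithm.

```python
import numpy as np

class NearestCentroid:
    """Tiny illustrative base learner: predict the class whose mean is closest."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d = ((X[:, None, :] - self.means_[None]) ** 2).sum(axis=2)
        return self.classes_[d.argmin(axis=1)]

def balanced_bagging(X, y, n_estimators=11, seed=0):
    """Each ensemble member trains on a balanced sample: bootstrap the
    minority class and undersample every other class to the same size."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    members = []
    for _ in range(n_estimators):
        idx = np.concatenate([
            # minority class: sample WITH replacement (a bootstrap);
            # majority classes: sample WITHOUT replacement (undersampling)
            rng.choice(np.flatnonzero(y == c), size=n_min,
                       replace=(counts[i] == n_min))
            for i, c in enumerate(classes)])
        members.append(NearestCentroid().fit(X[idx], y[idx]))

    def predict(Xq):
        votes = np.stack([m.predict(Xq) for m in members])
        # plurality vote across the ensemble members
        return np.array([np.bincount(col).argmax() for col in votes.T])
    return predict
```

Keeping every minority example available to every member while varying the majority-class draw is what gives these bagging variants their diversity without discarding scarce minority information.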
References
"RUSBoost: A Hybrid Approach to Alleviating Class Imbalance" refers to methods in this paper:
...The remaining data sets were obtained from the popular University of California–Irvine repository [25], and they represent various application domains....
[...]