Using AUC and accuracy in evaluating learning algorithms
Citations
2,358 citations
2,228 citations
Cites background or methods from "Using AUC and accuracy in evaluating learning algorithms"
...[67] J. Huang and C. X. Ling, “Using AUC and accuracy in evaluating learning algorithms,” IEEE Trans....
[...]
...APPENDIX DETAILED RESULTS TABLE In this appendix, we present the AUC test results for all the algorithms in all data-sets....
[...]
...Before starting with the analysis, we show the overall train and test AUC results (± for standard deviation) in Table VI....
[...]
...The AUC measure is computed just by obtaining the area of the graphic: AUC = (1 + TPrate − FPrate) / 2....
[...]
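The formula in the excerpt is the AUC of a single discrete classifier: its ROC "curve" is the polyline joining (0, 0), (FPrate, TPrate), and (1, 1), and the area under that polyline reduces to (1 + TPrate − FPrate)/2. A minimal sketch (the function name is mine, not from the cited paper):

```python
def single_point_auc(tp_rate: float, fp_rate: float) -> float:
    """AUC of one (FPrate, TPrate) operating point: the area under the
    polyline (0,0) -> (fp_rate, tp_rate) -> (1,1)."""
    return (1 + tp_rate - fp_rate) / 2

# e.g. a classifier with TP rate 0.8 and FP rate 0.2
print(single_point_auc(0.8, 0.2))  # 0.8
```

A perfect classifier (TPrate = 1, FPrate = 0) gives AUC 1, and a chance-level one (TPrate = FPrate) gives 0.5, matching the usual interpretation of the measure.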
...We have obtained the AUC metric estimates by means of a 5-fold cross-validation....
[...]
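A k-fold AUC estimate like the one in the excerpt computes AUC on each held-out fold and averages the k values. The cited work does not give its fold-assignment code; a minimal stratified sketch (helper name mine, assuming binary 0/1 labels) that keeps both classes in every fold, so a per-fold AUC is always defined:

```python
def stratified_folds(labels, k=5):
    # Assign example indices to k folds round-robin within each class,
    # so every fold contains both positives and negatives.
    folds = [[] for _ in range(k)]
    for cls in (0, 1):
        members = [i for i, y in enumerate(labels) if y == cls]
        for j, i in enumerate(members):
            folds[j % k].append(i)
    return folds

labels = [1] * 10 + [0] * 10
folds = stratified_folds(labels, k=5)
print([len(f) for f in folds])  # [4, 4, 4, 4, 4]
```

Each fold here receives two positives and two negatives; in a full run, fold i would serve as the test set while the remaining folds train the model.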
1,292 citations
Cites background or methods from "Using AUC and accuracy in evaluating learning algorithms"
...The Area Under the ROC Curve (AUC) [70] corresponds to the probability of correctly identifying which one of the two...
[...]
...Finally, with respect to the evaluation metric, we use the Area Under the ROC Curve (AUC) [19,70] as evaluation criteria....
[...]
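The probabilistic reading of AUC quoted above, i.e. the chance that a randomly drawn positive example is scored above a randomly drawn negative one, can be computed directly over all positive/negative pairs, counting ties as half a win (this is the Wilcoxon–Mann–Whitney statistic). A sketch, with the function name my own:

```python
def pairwise_auc(scores, labels):
    # AUC = P(score of a random positive > score of a random negative),
    # ties counted as 0.5, computed over every positive/negative pair.
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(pairwise_auc([0.9, 0.3, 0.6, 0.4], [1, 1, 0, 0]))  # 0.5
```

In the example, one positive outranks both negatives and the other outranks neither, so exactly half of the four pairs are ordered correctly.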
1,084 citations
653 citations
References
40,826 citations
26,531 citations
"Using AUC and accuracy in evaluating learning algorithms" refers background in this paper
...5 to the recently developed SVM [4], [7], [36] on the data sets from the UCI repository....
[...]
21,674 citations
"Using AUC and accuracy in evaluating learning algorithms" refers methods in this paper
...In this paper, we establish formal criteria for comparing two different measures for learning algorithms and we show theoretically and empirically that AUC is a better measure (defined precisely) than accuracy....
[...]
19,398 citations