Journal Article

Bias of Apparent Error Rate in Discriminant Analysis

Geoffrey J. McLachlan
01 Jan 1976 · Biometrika · Vol. 63, Iss. 2, pp. 239-244
About
This article was published in Biometrika on 1976-01-01 and has received 9 citations to date. It focuses on the topics Linear discriminant analysis and Word error rate.
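To make the topic concrete, here is a minimal simulation sketch, not taken from the paper, written in Python with NumPy and scikit-learn (both are assumptions of this illustration): it fits a linear discriminant rule on a small training sample from two Gaussian classes and compares the apparent (resubstitution) error rate, computed on the training data itself, with the error rate on a large independent test sample. On average the apparent rate comes out optimistically low, which is the bias the article studies.

```python
# Minimal sketch (not from the paper): the apparent (resubstitution) error
# rate of a sample linear discriminant rule, computed on the training data
# itself, is compared with the error rate on a large independent test set.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def sample_two_gaussians(n_per_class, dim=2, shift=1.0):
    """Draw n_per_class observations from each of two Gaussian classes."""
    x0 = rng.normal(0.0, 1.0, size=(n_per_class, dim))
    x1 = rng.normal(shift, 1.0, size=(n_per_class, dim))
    return np.vstack([x0, x1]), np.repeat([0, 1], n_per_class)

apparent_errors, test_errors = [], []
for _ in range(200):                                   # 200 replications
    X_train, y_train = sample_two_gaussians(15)        # small training sample
    X_test, y_test = sample_two_gaussians(5000)        # large test sample
    rule = LinearDiscriminantAnalysis().fit(X_train, y_train)
    apparent_errors.append(1.0 - rule.score(X_train, y_train))
    test_errors.append(1.0 - rule.score(X_test, y_test))

print(f"mean apparent error rate: {np.mean(apparent_errors):.3f}")
print(f"mean test error rate:     {np.mean(test_errors):.3f}")  # typically larger
```

Averaged over the replications, the gap between the two estimates gives an empirical view of the optimistic bias of the apparent error rate.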


Citations
Book

A Probabilistic Theory of Pattern Recognition

TL;DR: The Bayes Error and Vapnik-Chervonenkis theory are applied as a guide for empirical classifier selection on the basis of explicit specification and explicit enforcement of the maximum likelihood principle.

Wrappers for Performance Enhancements and Oblivious Decision Graphs.

TL;DR: This doctoral dissertation concludes that repeated runs of five-fold cross-validation give a good tradeoff between bias and variance for the problem of model selection used in later chapters; a minimal sketch of this procedure appears after the citation list below.
Journal Article

Automatic pattern recognition: a study of the probability of error

TL;DR: The Vapnik-Chervonenkis method can be used to choose the smoothing parameter in kernel-based rules, to choose k in the k-nearest neighbor rule, and to choose between parametric and nonparametric rules.
Journal Article

An automated method to analyze language use in patients with schizophrenia and their first-degree relatives

TL;DR: An automated and objective approach to modeling discourse that detects very subtle deviations between probands, their first-degree relatives and unrelated healthy controls is presented.
Journal Article

Model selection for linear classifiers using Bayesian error estimation

TL;DR: Model selection with the new Bayesian error estimator is experimentally shown to improve classification accuracy, especially in small-sample situations, and to avoid the excess variability inherent in traditional cross-validation approaches.
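
The repeated five-fold cross-validation mentioned in the dissertation entry above can be sketched as follows. This is a hedged illustration in Python with scikit-learn (an assumption, not the tooling of any of the cited works), and the data set is a synthetic placeholder; repeating the five-fold split over several random shufflings and averaging reduces the variance of the error estimate at the cost of extra computation.

```python
# Hedged sketch of repeated five-fold cross-validation as a lower-variance
# alternative to a single five-fold split for estimating the error rate.
from sklearn.datasets import make_classification       # synthetic placeholder data
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import RepeatedKFold, cross_val_score

X, y = make_classification(n_samples=100, n_features=5, random_state=0)

cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=0)   # 10 x 5-fold
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=cv)

# Accuracy is averaged over the 50 folds; the error rate is one minus accuracy.
print(f"cross-validated error rate: {1.0 - scores.mean():.3f} "
      f"(std across folds: {scores.std():.3f})")
```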