Journal Article

On the Discriminant Vector Method of Feature Selection

Kittler
01 Jun 1977 - Vol. 26, Iss. 6, pp. 604-606
TLDR
The correspondence discusses the relationship between the discriminant vector method of feature selection and the method of Kittler and Young, showing that the latter method is, from the point of view of dimensionality reduction, more powerful and also computationally more efficient.
Abstract
The correspondence discusses the relationship between the discriminant vector method of feature selection [1] and the method of Kittler and Young [5]. Although both methods determine the feature space coordinate axes by maximizing the generalized Fisher criterion of discriminatory power, with the exception of the two-class case the resulting feature spaces are considerably different because of the difference in the constraints each method imposes on the axes. It is shown that the latter method is, from the point of view of dimensionality reduction, more powerful and also computationally more efficient.
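To make the comparison concrete, the sketch below (not taken from the paper; the function names scatter_matrices, kittler_young_axes and foley_sammon_axes are illustrative, and a nonsingular within-class scatter matrix is assumed) computes axes that maximize the Fisher criterion J(w) = (w' Sb w) / (w' Sw w) under the two kinds of constraints the abstract contrasts: axes kept orthogonal in the original space, in the spirit of the discriminant vector (Foley-Sammon) method, versus Sw-conjugate axes obtained from a single generalized eigendecomposition, which is one common reading of the Kittler-Young construction. The Foley-Sammon part uses a simple deflation scheme for illustration rather than the original closed-form recursion.

import numpy as np
from scipy.linalg import eigh, null_space

def scatter_matrices(X, y):
    # Within-class scatter Sw and between-class scatter Sb of data X with labels y.
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    return Sw, Sb

def kittler_young_axes(X, y, n_axes):
    # Axes maximizing the Fisher criterion under an Sw-conjugacy (uncorrelatedness)
    # constraint: the leading generalized eigenvectors of Sb v = lambda Sw v,
    # all obtained from one eigendecomposition (assumes Sw is positive definite).
    Sw, Sb = scatter_matrices(X, y)
    evals, evecs = eigh(Sb, Sw)
    order = np.argsort(evals)[::-1]
    return evecs[:, order[:n_axes]]

def foley_sammon_axes(X, y, n_axes):
    # Discriminant vectors in the Foley-Sammon spirit: each new axis maximizes the
    # Fisher criterion subject to orthogonality (in the input space) to the axes
    # already found. Illustrative deflation scheme, not the original recursion.
    Sw, Sb = scatter_matrices(X, y)
    d = Sw.shape[0]
    axes = []
    for _ in range(n_axes):
        Q = null_space(np.array(axes)) if axes else np.eye(d)  # basis of the complement
        evals, evecs = eigh(Q.T @ Sb @ Q, Q.T @ Sw @ Q)        # maximize J over w = Q z
        w = Q @ evecs[:, np.argmax(evals)]
        axes.append(w / np.linalg.norm(w))
    return np.column_stack(axes)

# Toy usage: three classes in five dimensions, so at most L - 1 = 2 informative axes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, size=(50, 5)) for m in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 50)
V = kittler_young_axes(X, y, n_axes=2)
D = foley_sammon_axes(X, y, n_axes=2)

With two classes there is only one informative direction and the two constructions coincide up to scale, consistent with the abstract's remark that the feature spaces differ except in the two-class case; with more classes the Sw-conjugate axes are obtained in one step and span the (L - 1)-dimensional discriminant subspace, which illustrates the dimensionality-reduction and computational points made above.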


Citations
Journal Article

A generalized Foley-Sammon transform based on generalized fisher discriminant criterion and its application to face recognition

TL;DR: The concept of the generalized Fisher discriminant criterion is presented, the generalized Foley-Sammon transform (GFST) is proposed, and the experimental results show that the present method is superior to the existing methods in terms of correct classification rate.
Journal Article

A theorem on the uncorrelated optimal discriminant vectors

TL;DR: It is proved that the classical optimal discriminant vectors are equivalent to the uncorrelated optimal discriminant vectors (UODV), which can be used to extract (L−1) uncorrelated discriminant features for L-class problems without losing any discriminant information in the sense of the Fisher discriminant criterion function.
Journal Article

A generalized optimal set of discriminant vectors

TL;DR: The experimental results show that the present method is superior to the Foley-Sammon method, the positive pseudoinverse method, and the matrix rank decomposition method in terms of correct classification rate.
Journal Article

On an Extended Fisher Criterion for Feature Selection

TL;DR: This correspondence considers the extraction of features as a task of linear transformation of an initial pattern space into a new space, optimal with respect to discriminating the data.
Journal Article

Recent Developments in Pattern Recognition

TL;DR: A very brief survey of recent developments in basic pattern recognition techniques is presented.
References
Journal Article

Application of the Karhunen-Loève Expansion to Feature Selection and Ordering

TL;DR: A method is developed herein to use the Karhunen-Loeve expansion to extract features relevant to classification of a sample taken from one of two pattern classes.
Journal Article

An Optimal Set of Discriminant Vectors

TL;DR: A new method for the extraction of features in a two-class pattern recognition problem is derived that is based entirely upon discrimination or separability as opposed to the more common approach of fitting.
Book

Knowing and Guessing

Journal Article

A new approach to feature selection based on the Karhunen-Loeve expansion

TL;DR: A new K-L technique is described that overcomes some of the limitations of the earlier procedures, and it is suggested that it is particularly useful for pattern recognition when combined with classification procedures based upon discriminant functions obtained by recursive least squares analysis.
Journal Article

On the generalized Karhunen-Loeve expansion (Corresp.)

TL;DR: The purpose of this correspondence is to show that the cited optimum properties can be retained by defining a generalized Karhunen-Loeve expansion which considers the possibility of two or more stochastic processes generating the random functions.