Journal ArticleDOI

Nonparametric Discriminant Analysis

TLDR
In this article, a nonparametric method of discriminant analysis is proposed, based on nonparametric extensions of commonly used scatter matrices that remain effective for non-Gaussian data sets; using the same framework, a procedure is proposed to test the structural similarity of two distributions.
Abstract
A nonparametric method of discriminant analysis is proposed. It is based on nonparametric extensions of commonly used scatter matrices. Two advantages result from the use of the proposed nonparametric scatter matrices. First, they are generally of full rank. This provides the ability to specify the number of extracted features desired. This is in contrast to parametric discriminant analysis, which for an L class problem typically can determine at most L − 1 features. Second, the nonparametric nature of the scatter matrices allows the procedure to work well even for non-Gaussian data sets. Using the same basic framework, a procedure is proposed to test the structural similarity of two distributions. The procedure works in high-dimensional space. It specifies a linear decomposition of the original data space in which a relative indication of dissimilarity along each new basis vector is provided. The nonparametric scatter matrices are also used to derive a clustering procedure, which is recognized as a k-nearest neighbor version of the nonparametric valley seeking algorithm. The form which results provides a unified view of the parametric nearest mean reclassification algorithm and the nonparametric valley seeking algorithm.
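As a concrete illustration of the abstract, here is a minimal two-class sketch of the nonparametric between-class scatter idea: each sample is compared against the mean of its k nearest neighbors in the other class, and discriminant directions are taken as eigenvectors of Sw⁻¹Sb. The function names, the choice of k, and the use of a simple parametric within-class scatter are assumptions for illustration, not the paper's exact procedure.

```python
# Sketch of nonparametric discriminant analysis (NDA), two-class case.
# Assumed simplification: parametric within-class scatter, unweighted samples.
import numpy as np

def knn_mean(point, other, k):
    """Mean of the k nearest neighbors of `point` among the samples in `other`."""
    d = np.linalg.norm(other - point, axis=1)
    return other[np.argsort(d)[:k]].mean(axis=0)

def nda_directions(X1, X2, k=3):
    """Discriminant directions from a k-NN based between-class scatter matrix."""
    d = X1.shape[1]
    Sb = np.zeros((d, d))  # nonparametric between-class scatter
    for X, other in ((X1, X2), (X2, X1)):
        for x in X:
            diff = (x - knn_mean(x, other, k)).reshape(-1, 1)
            Sb += diff @ diff.T
    Sw = np.cov(X1.T) + np.cov(X2.T)  # within-class scatter (parametric here)
    # Eigenvectors of Sw^{-1} Sb, sorted by decreasing eigenvalue, give the
    # extracted features; Sb is generally full rank, so up to d directions exist.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs[:, order].real

rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, (50, 2))
X2 = rng.normal(3.0, 1.0, (50, 2))
W = nda_directions(X1, X2, k=3)  # columns are discriminant directions
```

Projecting both classes onto the first column of `W` separates them along the direction of greatest k-NN between-class scatter.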


Citations
Journal ArticleDOI

Nonparametric weighted feature extraction for classification

TL;DR: The new method provides greater weight to samples near the expected decision boundary, which tends to provide for increased classification accuracy and to reduce the effect of the singularity problem.
Journal ArticleDOI

Feature Mining for Hyperspectral Image Classification

TL;DR: An overview of both conventional and advanced feature reduction methods, with details on a few techniques that are commonly used for analysis of hyperspectral data.
Patent

Automotive occupant sensor system and method of operation by sensor fusion

TL;DR: This patent describes a system and method for sensing the presence, position, and type of an occupant in a passenger seat of a vehicle, as well as the presence of a rear-facing child seat, for use in controlling a related air bag activator control system to enable, disable, or control the inflation rate or amount of inflation of an air bag.
Proceedings ArticleDOI

Automatic Eye Detection and Its Validation

TL;DR: The impact of eye locations on face recognition accuracy is studied, and an automatic technique for eye detection is introduced, and the face recognition performance is shown to be comparable to that of using manually given eye positions.
Journal ArticleDOI

Maximum likelihood estimation of a multi‐dimensional log‐concave density

TL;DR: In this article, the log-concave maximum likelihood estimator is used in conjunction with the expectation-maximization algorithm to fit finite mixtures of log-concave densities, and is shown to have smaller mean integrated squared error than kernel-based methods.
References
Journal ArticleDOI

The estimation of the gradient of a density function, with applications in pattern recognition

TL;DR: Applications of gradient estimation to pattern recognition are presented using clustering and intrinsic dimensionality problems, with the ultimate goal of providing further understanding of these problems in terms of density gradients.
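The density-gradient theme of this reference (which underlies the valley seeking clustering discussed in the abstract) is commonly illustrated with a mean-shift style update that follows an estimated density gradient toward a mode. The sketch below is a generic Gaussian-kernel version assumed for illustration, not the paper's exact estimator.

```python
# Sketch: iterating a mean-shift style update, a standard way to follow an
# estimated density gradient toward a mode (illustrative, not the paper's
# exact gradient estimator).
import numpy as np

def mean_shift_step(x, data, bandwidth=1.0):
    """Move x toward the local density mode using a Gaussian kernel."""
    w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
    return (w[:, None] * data).sum(axis=0) / w.sum()

rng = np.random.default_rng(2)
data = rng.normal(5.0, 0.5, (200, 2))  # one cluster centered near (5, 5)
x = np.array([3.0, 3.0])
for _ in range(20):
    x = mean_shift_step(x, data, bandwidth=1.0)
# x has climbed the estimated density gradient toward the cluster mode
```

Repeating this step from every sample and grouping points that converge to the same mode yields a gradient-based clustering in the spirit of valley seeking.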
Journal ArticleDOI

An Optimal Set of Discriminant Vectors

TL;DR: A new method for the extraction of features in a two-class pattern recognition problem is derived that is based entirely upon discrimination or separability as opposed to the more common approach of fitting.
Journal ArticleDOI

The optimal distance measure for nearest neighbor classification

TL;DR: A local distance measure is shown to optimize the performance of the nearest neighbor two-class classifier for a finite number of samples using the difference between the finite sample error and the asymptotic error as the criterion of improvement.
Journal ArticleDOI

k-nearest-neighbor Bayes-risk estimation

TL;DR: Nonparametric estimation of the Bayes risk R* using a k-nearest-neighbor (k-NN) approach is investigated, and the mean-squared error of the conditional Bayes error estimate is reduced significantly.
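In the spirit of the k-NN risk estimation summarized above, a simple leave-one-out k-NN error-rate estimate can be sketched as follows. This is a generic stand-in for illustration, not the paper's exact R* estimator.

```python
# Sketch: leave-one-out k-NN error rate, a basic nonparametric stand-in
# for the k-NN Bayes-risk estimators discussed above.
import numpy as np

def loo_knn_error(X, y, k=3):
    """Leave-one-out k-nearest-neighbor error rate on labeled data (X, y)."""
    errors = 0
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                    # exclude the held-out sample itself
        nn = y[np.argsort(d)[:k]]        # labels of the k nearest neighbors
        vote = np.bincount(nn).argmax()  # majority vote among neighbors
        errors += vote != y[i]
    return errors / len(X)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(4.0, 1.0, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
err = loo_knn_error(X, y, k=3)  # well-separated classes give a low error rate
```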