Journal ArticleDOI

Application of state variable techniques to optimal feature extraction

D.G. Lainiotis, +1 more
Vol. 56, Iss. 12, pp. 2175-2176
TLDR
State variable techniques are utilized here to yield efficient computer-implementable procedures for obtaining the double orthogonal expansion of the observable random process under hypothesis H_i, i = 1, 2.
Abstract
Optimal continuous linear feature extraction for the binary Gaussian pattern recognition or detection problem necessitates finding the double orthogonal expansion of the observable random process under hypothesis H_i, i = 1, 2. State variable techniques are utilized here to yield efficient computer-implementable procedures for obtaining the double orthogonal expansion.
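
For reference, the double (simultaneous) orthogonal expansion has the following standard form for zero-mean processes observed on [0, T] with covariance kernels K_1(t,s) under H_1 and K_2(t,s) under H_2 (notation assumed here, not drawn from the paper): one seeks a single family of functions \varphi_i satisfying

\int_0^T \int_0^T \varphi_i(t) K_1(t,s) \varphi_j(s) \, dt \, ds = \delta_{ij},
\qquad
\int_0^T \int_0^T \varphi_i(t) K_2(t,s) \varphi_j(s) \, dt \, ds = \lambda_i \delta_{ij},

so that the coordinates x_i = \int_0^T x(t) \varphi_i(t) \, dt of the observed process are uncorrelated under both hypotheses, with variance 1 under H_1 and \lambda_i under H_2. The \varphi_i solve the generalized eigenproblem \int_0^T K_2(t,s) \varphi_i(s) \, ds = \lambda_i \int_0^T K_1(t,s) \varphi_i(s) \, ds.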


Citations
Journal ArticleDOI

A class of upper bounds on probability of error for multihypotheses pattern recognition (Corresp.)

TL;DR: A class of upper bounds on the probability of error for the general multihypotheses pattern recognition problem is obtained, and an upper bound is shown to be a linear functional of the pairwise Bhattacharyya coefficients.
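
One standard bound of this type (stated here for priors \pi_i and class-conditional densities p_i; the exact class of bounds derived in the cited paper may differ) is

P_e \le \sum_{i < j} \sqrt{\pi_i \pi_j} \, \rho_{ij},
\qquad
\rho_{ij} = \int \sqrt{p_i(x) \, p_j(x)} \, dx,

which is linear in the pairwise Bhattacharyya coefficients \rho_{ij} and reduces to the familiar two-class bound P_e \le \sqrt{\pi_1 \pi_2} \, \rho_{12} when only two hypotheses are present.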
Journal ArticleDOI

Partitioned Riccati solutions and integration-free doubling algorithms

TL;DR: Generalized partitioned solutions (GPS) of Riccati equations (RE) are presented in terms of forward and backward time differential equations that are theoretically interesting, possibly computationally advantageous, and that provide interesting interpretations of these results.
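
As background on "doubling", the sketch below shows a generic structure-preserving doubling iteration for a discrete algebraic Riccati equation. It is offered only to illustrate the integration-free flavor of such algorithms; it is not the partitioned (GPS) scheme of the cited paper, and the matrices used are arbitrary illustrations.

import numpy as np

def doubling_dare(A, G, H, iters=25):
    # Generic structure-preserving doubling iteration for the DARE
    #   X = A' X (I + G X)^{-1} A + H,
    # shown as an illustration of integration-free doubling; this is NOT the
    # partitioned (GPS) algorithm of the cited paper.
    n = A.shape[0]
    I = np.eye(n)
    Ak, Gk, Hk = A.copy(), G.copy(), H.copy()
    for _ in range(iters):
        W = np.linalg.inv(I + Gk @ Hk)
        # Each step doubles the effective horizon: k iterations cover 2**k steps.
        Ak, Gk, Hk = Ak @ W @ Ak, Gk + Ak @ W @ Gk @ Ak.T, Hk + Ak.T @ Hk @ W @ Ak
    return Hk  # converges (quadratically, under standard conditions) to X

# Small illustrative check that the returned X satisfies the equation above.
A = np.array([[0.9, 0.1], [0.0, 0.8]])
G = 0.1 * np.eye(2)   # e.g. B R^{-1} B' in a filtering/control setting
H = np.eye(2)         # e.g. the process-noise (or state-cost) matrix
X = doubling_dare(A, G, H)
residual = X - (A.T @ X @ np.linalg.inv(np.eye(2) + G @ X) @ A + H)
print(np.linalg.norm(residual))   # should be ~0

Because each iteration squares the effective transition, a handful of matrix operations replaces step-by-step numerical integration of the Riccati equation, which is the usual sense of "integration-free" for doubling methods.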

Probability of Error Bounds

TL;DR: The bounds, applicable to general non-Gaussian densities and especially the mixture densities encountered in adaptive pattern recognition, are simple to calculate and hence valuable for on-line performance evaluation of pattern recognition systems.
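
As a rough illustration of how such a bound can be evaluated numerically for mixture densities, the sketch below estimates the two-class Bhattacharyya coefficient by Monte Carlo and plugs it into the standard bound P_e <= sqrt(pi_1 pi_2) rho; the densities, priors, and sample size are hypothetical, and the cited paper's specific bounds may differ.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class problem with 1-D class-conditional densities.
def p1(x):  # class-1 density: mixture 0.6*N(-1, 1) + 0.4*N(2, 0.5^2)
    return 0.6 * np.exp(-0.5 * (x + 1) ** 2) / np.sqrt(2 * np.pi) \
         + 0.4 * np.exp(-0.5 * ((x - 2) / 0.5) ** 2) / (0.5 * np.sqrt(2 * np.pi))

def p2(x):  # class-2 density: N(1, 1)
    return np.exp(-0.5 * (x - 1) ** 2) / np.sqrt(2 * np.pi)

# Monte Carlo estimate of the Bhattacharyya coefficient
# rho = integral sqrt(p1(x) p2(x)) dx, sampling from p2:
# E_{p2}[sqrt(p1/p2)] = rho.
x = rng.normal(1.0, 1.0, size=200_000)
rho = np.mean(np.sqrt(p1(x) / p2(x)))

pi1, pi2 = 0.5, 0.5   # assumed equal priors
print("Bhattacharyya coefficient rho ~", rho)
print("Error bound sqrt(pi1*pi2)*rho ~", np.sqrt(pi1 * pi2) * rho)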
Proceedings ArticleDOI

A class of upper-bounds on probability of error for multi-hypotheses pattern recognition

TL;DR: A class of upper bounds on the probability of error for the general multihypotheses pattern recognition problem is obtained, and an upper bound is shown to be a linear functional of the pairwise Bhattacharyya coefficients.
Journal ArticleDOI

On a general relationship between estimation, detection, and the Bhattacharyya coefficient (Corresp.)

TL;DR: It is shown that a functional of the likelihood ratio necessary for Bayes-optimal detection/pattern recognition is the Bayes-optimal mean-square estimate of the indicator variable \beta, where \beta = 1, 0 for hypotheses H_{1}, H_{0}, respectively.
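
The identity behind this statement is standard (written here with priors \pi_0, \pi_1 and likelihood ratio \Lambda(x) = p_1(x)/p_0(x); notation assumed, not taken from the paper):

E[\beta \mid x] = P(H_1 \mid x)
= \frac{\pi_1 p_1(x)}{\pi_0 p_0(x) + \pi_1 p_1(x)}
= \frac{\pi_1 \Lambda(x)}{\pi_0 + \pi_1 \Lambda(x)},

so the minimum mean-square estimate of \beta is a monotone function of \Lambda(x), and comparing it to the threshold 1/2 reproduces the minimum-probability-of-error (Bayes) detector.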
References
Journal ArticleDOI

On the best finite set of linear observables for discriminating two Gaussian signals

TL;DR: The problem of discriminating two Gaussian signals by using only a finite number of linear observables is considered, and it is found that the set of observables that minimizes H is a set of coefficients of the simultaneously orthogonal expansions of the two signals.
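
In the finite-dimensional (sampled-observation) analogue, the simultaneous orthogonal expansion amounts to a generalized eigenvalue problem on the two covariance matrices. A minimal sketch, with the covariance matrices, dimensions, and ranking heuristic chosen purely for illustration:

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)

# Hypothetical covariance matrices of the observation vector under H1 and H2.
n = 5
M1 = rng.standard_normal((n, n)); K1 = M1 @ M1.T + n * np.eye(n)
M2 = rng.standard_normal((n, n)); K2 = M2 @ M2.T + n * np.eye(n)

# Generalized eigenproblem K2 v = lambda K1 v.  scipy.linalg.eigh normalizes
# the eigenvectors so that Phi.T @ K1 @ Phi = I, hence Phi.T @ K2 @ Phi =
# diag(lambda): the same linear features are uncorrelated under both
# hypotheses (unit variance under H1, variance lambda_i under H2).
lam, Phi = eigh(K2, K1)

assert np.allclose(Phi.T @ K1 @ Phi, np.eye(n), atol=1e-8)
assert np.allclose(Phi.T @ K2 @ Phi, np.diag(lam), atol=1e-8)

# A feature extractor keeping k coordinates could rank them by how far
# lambda_i is from 1, e.g. lambda_i - 1 - log(lambda_i) (proportional to a
# per-coordinate Gaussian KL divergence) -- shown only as one plausible
# ranking, not the criterion of the cited paper.
k = 2
score = lam - 1 - np.log(lam)
keep = np.argsort(score)[::-1][:k]
W = Phi[:, keep]   # maps an observation x to the k features W.T @ x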