
Showing papers on "Dimensionality reduction published in 1977"


Journal ArticleDOI
TL;DR: The correspondence discusses the relationship of the discriminant vector method of feature selection and the method of Kittler and Young, showing that the latter method is, from the point of view of dimensionality reduction, more powerful and also computationally more efficient.
Abstract: The correspondence discusses the relationship of the discriminant vector method of feature selection [1] and the method of Kittler and Young [5]. Although both methods determine the feature space coordinate axes by maximizing the generalized Fisher criterion of discriminatory power, with the exception of the two-class case the resulting feature spaces are considerably different because of the differing constraints the individual methods impose on the axes. It is shown that the latter method is, from the point of view of dimensionality reduction, more powerful and also computationally more efficient.
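Both methods start from the same objective. As a point of reference, here is a minimal sketch (not the constraint scheme of either paper) of extracting axes that maximize the generalized Fisher criterion J(w) = (w'Sb w)/(w'Sw w), obtained as the leading eigenvectors of inv(Sw) Sb:

```python
import numpy as np

def fisher_axes(X, y, n_axes):
    """Return feature-space axes maximizing the generalized Fisher
    criterion J(w) = (w' Sb w) / (w' Sw w), computed as the leading
    eigenvectors of inv(Sw) @ Sb.  Generic sketch only; the two papers
    differ precisely in the extra constraints placed on these axes."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb w = lambda Sw w
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_axes]]
```

Projecting data onto the returned axes gives the reduced feature space; the two methods compared above diverge in how many such axes are meaningful and how they are orthogonalized.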

22 citations


Journal ArticleDOI
01 Oct 1977
TL;DR: A multivariate statistical pattern recognition system for reactor noise analysis is presented; its abilities to learn normal patterns and to recognize deviations from these patterns were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor.
Abstract: A multivariate statistical pattern recognition system for reactor noise analysis is presented. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, updating, and dimensionality reduction capabilities. System design emphasizes control of the false-alarm rate. Its abilities to learn normal patterns and to recognize deviations from these patterns were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor. Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were readily detected by the pattern recognition system.
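The "transformation for decoupling correlated variables" is, in its generic form, a whitening transform. A minimal sketch under that assumption (eigendecomposition of the sample covariance; not the ORNL system's actual implementation):

```python
import numpy as np

def decorrelate(X):
    """Whitening transform: eigendecompose the sample covariance and
    rescale each principal direction to unit variance, so the
    transformed variables are uncorrelated.  Generic sketch of the
    decoupling step described in the abstract."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)   # symmetric eigendecomposition
    return Xc @ evecs / np.sqrt(evals)   # rotate, then rescale
```

After this step, per-variable probability density functions can be inferred independently, which is what makes the subsequent learning and false-alarm-rate control tractable.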

22 citations



Journal ArticleDOI
TL;DR: A minimum-error linear transformation was applied to reduce the dimensionality of reactor noise signatures, which improves the computational efficiency of processing algorithms and decreases bulk data storage requirements.
Abstract: A minimum-error linear transformation was applied to reduce the dimensionality of reactor noise signatures. The procedure improves the computational efficiency of processing algorithms and decreases bulk data storage requirements. The method was tested with noise signatures from the High-Flux Isotope Reactor at the Oak Ridge National Laboratory, and a dimensionality reduction in excess of 90 percent was achieved without loss in the average characteristics of the signatures.
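A minimum-mean-square-error linear reduction of this kind is classically the Karhunen-Loève (principal component) transform. A short sketch under that assumption, with a hypothetical `signatures` array standing in for the reactor noise signatures:

```python
import numpy as np

def reduce_dim(signatures, k):
    """Minimum-mean-square-error linear reduction (Karhunen-Loeve /
    PCA sketch): project centered signatures onto the k leading
    principal axes.  Assumed form of the paper's transformation, not
    its exact procedure."""
    mean = signatures.mean(axis=0)
    Xc = signatures - mean
    # SVD of the centered data matrix yields the principal axes in Vt
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                       # (d, k) projection matrix
    reduced = Xc @ W                   # compact representation
    reconstructed = reduced @ W.T + mean
    return reduced, reconstructed
```

Because the projection is mean-preserving, the average characteristics of the signatures survive even a large reduction in dimension, consistent with the >90 percent figure reported above.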

5 citations


Journal ArticleDOI
TL;DR: It was found that classification using the optimal single linear feature yielded a value for the probability of misclassification on the order of 30% less than that obtained by using the best single untransformed feature.
Abstract: A computational algorithm is presented for the extraction of an optimal single linear feature from several Gaussian pattern classes. The algorithm minimizes the increase in the probability of misclassification in the transformed (feature) space. The general approach used in this procedure was developed in a recent paper by R. J. P. de Figueiredo [1]. Numerical results on the application of this procedure to the remotely sensed data from the Purdue C1 flight line as well as Landsat data are presented. It was found that classification using the optimal single linear feature yielded a value for the probability of misclassification on the order of 30% less than that obtained by using the best single untransformed feature. The optimal single linear feature gave performance results comparable to those obtained by using the two features which maximized the average divergence. Also discussed are improvements in classification results using this method when the size of the training set is small.
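For the special case of two Gaussian classes sharing a covariance matrix, the optimal single linear feature has a well-known closed form, w = inv(Σ)(μ₁ − μ₀). De Figueiredo's algorithm addresses the general multi-class, unequal-covariance problem, so the following is only an illustrative sketch of the idea:

```python
import numpy as np

def optimal_linear_feature(mu0, mu1, cov):
    """Closed-form special case: for two Gaussian classes with a
    shared covariance, the single linear feature that preserves the
    Bayes decision is w = inv(cov) @ (mu1 - mu0).  The paper's
    algorithm generalizes beyond this two-class, equal-covariance
    setting."""
    return np.linalg.solve(cov, mu1 - mu0)
```

Projecting each pattern x onto w (i.e., computing w @ x) reduces the classifier to a one-dimensional threshold test, which is the sense in which a single linear feature can approach the performance of several untransformed features.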

2 citations