
Showing papers on "Feature selection published in 1981"


Journal ArticleDOI
TL;DR: This is the first part of a two-part article presenting a statistical approach to the sensitivity analysis of computer models.
Abstract: This is the first part of a two-part article presenting a statistical approach to the sensitivity analysis of computer models. Part I defines the objectives of sensitivity analysis and presents a computer model that is used for purposes of illustration.

538 citations


Journal ArticleDOI
TL;DR: This correspondence considers the extraction of features as a task of linear transformation of an initial pattern space into a new space, optimal with respect to discriminating the data.
Abstract: This correspondence considers the extraction of features as a task of linear transformation of an initial pattern space into a new space that is optimal with respect to discriminating the data. A solution of the feature extraction problem is given for two multivariate normally distributed pattern classes using an extended Fisher criterion as the distance measure. The introduced distance measure consists of two terms: the first estimates the distance between classes from the difference of the class mean vectors, and the second from the difference of the class covariance matrices. The proposed method is compared with two of the more popular alternatives: the Fukunaga-Koontz method and the Foley-Sammon method.

84 citations
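As an illustration of such a two-term distance measure, the sketch below scores class separability with one term from the difference of mean vectors and one from the difference of covariance matrices. It is a sketch of the general idea only: the pooled-covariance normalization and the unweighted sum of the two terms are assumptions, not the paper's exact extended Fisher criterion.

```python
import numpy as np

def two_term_separability(X1, X2):
    """Illustrative two-term class-separability score: one term from the
    difference of class means, one from the difference of class covariance
    matrices. (Assumed form; not the paper's exact criterion.)"""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sw = 0.5 * (S1 + S2)                        # pooled within-class scatter
    d = m1 - m2
    # Term 1: Mahalanobis-type distance between the class means
    mean_term = float(d @ np.linalg.solve(Sw, d))
    # Term 2: separation contributed by differing covariance structure
    D = S1 - S2
    cov_term = float(np.trace(D @ np.linalg.solve(Sw, D.T)))
    return mean_term + cov_term
```

Both terms are nonnegative, so classes with identical means can still score as separable when their covariance matrices differ.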


Journal ArticleDOI
TL;DR: The feature selection procedure of ALLOC is compared with the SELECT procedure in the ARTHUR software package and with a procedure based on statistical tests in the SPSS software package; the ALLOC selection procedure performs very well in the two applications considered here.

23 citations


Journal ArticleDOI
TL;DR: The computer program INDEP-SELECT has been developed for selection of an optimal subset from a set of possibly informative diagnostic or prognostic variables, and is equally useful for other discriminant analysis or pattern recognition problems involving variable selection.

21 citations



Journal ArticleDOI
TL;DR: Two important problems in the analysis of categorical questionnaire data are considered: assessment of question worth and variable selection, and discrete discriminant analysis when the data is nonordinal with many states and few respondents.
Abstract: Two important problems in the analysis of categorical questionnaire data are considered: assessment of question worth and variable selection, and discrete discriminant analysis when the data is nonordinal with many states and few respondents. The unifying approach used throughout is the concept of information theoretic distance measures. Simulations and applications to real data are presented.

8 citations
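One simple information-theoretic distance measure of the kind the article builds on is the symmetric (Jeffreys) KL divergence between the class-conditional distributions of a categorical question. The sketch below scores a single question for a two-class problem; the smoothing constant `eps` and the two-class restriction are assumptions for illustration, not the authors' procedure.

```python
import math
from collections import Counter

def question_worth(values, labels, eps=1e-9):
    """Score one categorical question by the Jeffreys (symmetric KL)
    divergence between its two class-conditional distributions.
    Higher scores mean the question discriminates the classes better."""
    classes = sorted(set(labels))
    assert len(classes) == 2, "two-class sketch only"
    states = sorted(set(values))
    dists = []
    for c in classes:
        counts = Counter(v for v, y in zip(values, labels) if y == c)
        n = sum(counts.values())
        # eps-smoothed class-conditional distribution over the states
        dists.append({s: (counts.get(s, 0) + eps) / (n + eps * len(states))
                      for s in states})
    p, q = dists
    return sum((p[s] - q[s]) * math.log(p[s] / q[s]) for s in states)
```

A question whose answer distribution is identical in both classes scores zero; one whose answers perfectly split the classes scores large.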


01 Dec 1981
TL;DR: An algorithm is proposed that predicts the optimal features at every node of a binary tree procedure; it estimates the probability of error by approximating the area under the likelihood-ratio function for two classes, taking into account the number of training samples used in estimating each class.
Abstract: An algorithm is proposed which predicts the optimal features at every node in a binary tree procedure. The algorithm estimates the probability of error by approximating the area under the likelihood ratio function for two classes, taking into account the number of training samples used in estimating each of the two classes. Some results on feature selection techniques, particularly in the presence of a very limited set of training samples, are presented. Probabilities of error predicted by the proposed algorithm as a function of dimensionality are compared with experimental observations for aircraft and LANDSAT data. Results are obtained for both real and simulated data. Finally, two binary tree examples which use the algorithm are presented to illustrate the usefulness of the procedure.

7 citations


Proceedings ArticleDOI
01 Apr 1981
TL;DR: The results indicate that the parameters comprising the chosen optimal set are speaker-dependent; a technique using dynamic programming was used to select a subset of the k best features among the entire set of N.
Abstract: The main objective of this work was to investigate the effectiveness of long-term averages of the orthogonal linear prediction parameters in text-independent speaker recognition. To investigate the possibility of feature selection, a technique using dynamic programming (1) was used to select a subset of the k best features among the entire set of N. The results indicate that the parameters comprising the chosen optimal set are speaker-dependent. Verification accuracies of 96.5% were obtained using the selected optimal 8-parameter (out of 12) feature set for each speaker in a verification scheme in which the reference parameters were generated from 100 seconds of time-spaced voiced speech and the test parameters were generated from 5 seconds of voiced speech.

7 citations


Journal ArticleDOI
TL;DR: In this article, a variable selection method is introduced in factor analysis with the aim of preserving the configuration of factor scores in p (= the number of factors) dimensional Euclidean space.
Abstract: A new variable selection method is introduced in factor analysis with the aim of preserving the configuration of factor scores in p (= the number of factors) dimensional Euclidean space. Its performance is studied numerically in comparison with four conventional methods, and it is shown that the proposed method is superior to the other four.

7 citations


Journal ArticleDOI
TL;DR: A satisfactory and consistent overall classification accuracy was achieved by using sequential selection algorithms that choose continuous features by maximizing the Mahalanobis distance at each step of the selection process.

7 citations
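The sequential strategy described, adding at each step the feature that most increases the two-class Mahalanobis distance, can be sketched as below. The pooled-covariance estimate and stopping at a fixed subset size `k` are assumptions for the sketch; the paper's exact algorithm is not reproduced.

```python
import numpy as np

def forward_select(X1, X2, k):
    """Sequential forward selection for two classes: at each step, add the
    feature that maximizes the Mahalanobis distance between the classes
    computed on the selected subset. (Generic sketch of the strategy.)"""
    n_features = X1.shape[1]
    selected = []

    def mahalanobis(idx):
        A, B = X1[:, idx], X2[:, idx]
        d = A.mean(axis=0) - B.mean(axis=0)
        # pooled covariance of the selected features (assumed estimate)
        Sw = 0.5 * (np.cov(A, rowvar=False) + np.cov(B, rowvar=False))
        Sw = np.atleast_2d(Sw)
        return float(d @ np.linalg.solve(Sw, d))

    while len(selected) < k:
        best = max((f for f in range(n_features) if f not in selected),
                   key=lambda f: mahalanobis(selected + [f]))
        selected.append(best)
    return selected
```

Because the distance is re-evaluated on the whole candidate subset each step, a feature that is individually weak can still be chosen for its joint contribution.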


Journal ArticleDOI
01 Apr 1981
TL;DR: A nonparametric feature selection method applicable to pattern recognition problems based on mixed features is presented; it can select a feature subset based on higher-order discriminating information.
Abstract: A nonparametric feature selection method applicable to pattern recognition problems based on mixed features is presented. In the pattern space, each pattern class is represented by multiple subregions according to local interclass structure. Then, in each of the subregions, feature selection is performed in a simple nonparametric way. Our feature selection method can select a feature subset based on higher-order discriminating information. Some basic properties of our approach are presented theoretically and experimentally.

C. B. Chittineni
01 Jan 1981
TL;DR: In this article, a set of bandpass filters is proposed as feature extractors for inspection of web-type products; the required characteristics of the filters are determined through digital simulation, using feature selection methods.
Abstract: Details of a system designed to inspect web-type products are presented. The sensing device used is a laser scanner, and a brief description of the scanner is given. A set of bandpass filters is proposed as feature extractors. The required characteristics of the filters are determined through digital simulation, using feature selection methods. The linear classifier is designed from a set of training signals. Contextual information is used in the classification of signals into good product or into various defective categories. In addition, results of a study on inspection of magnetic tapes and abrasive sheets are described.

Journal ArticleDOI
TL;DR: In this article, a Bayesian approach to variable selection is used that includes an additional subset of variables for future classification if the additional measurement costs for this subset are lower than the resulting reduction in expected misclassification costs.
Abstract: In discriminant analysis it is often desirable to find a small subset of the variables that were measured on the individuals of known origin, to be used for classifying individuals of unknown origin. In this paper a Bayesian approach to variable selection is used that includes an additional subset of variables for future classification if the additional measurement costs for this subset are lower than the resulting reduction in expected misclassification costs.
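The selection rule stated in the abstract reduces to a simple cost comparison. In the sketch below the costs are plain numbers standing in for the expected values a full Bayesian analysis would supply; the function names are illustrative, not from the paper.

```python
def include_extra_subset(measurement_cost, risk_without, risk_with):
    """Decision rule from the abstract: include the extra variables only if
    their measurement cost is less than the resulting reduction in expected
    misclassification cost. Inputs stand in for Bayesian expected costs."""
    reduction_in_risk = risk_without - risk_with
    return measurement_cost < reduction_in_risk
```

For example, a subset costing 2 units to measure that cuts expected misclassification cost from 10 to 5 is included; at a measurement cost of 6 it is not.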

Journal ArticleDOI
TL;DR: Numerical values for the eigenvalues of the matrix S_W^{-1}S_B (product of the inverse within-class and the between-class scatter matrices) are investigated, and an analytic expression for their minimum value, representing the minimum effectiveness, is derived.
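With the standard scatter-matrix definitions (an assumption; the paper's exact setup is not shown in the summary), the eigenvalues in question can be computed as:

```python
import numpy as np

def scatter_eigenvalues(class_samples):
    """Eigenvalues of S_W^{-1} S_B for a list of per-class sample matrices,
    using the standard within-class and between-class scatter definitions."""
    all_X = np.vstack(class_samples)
    gm = all_X.mean(axis=0)                 # grand mean
    d = all_X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for X in class_samples:
        m = X.mean(axis=0)
        Sw += (X - m).T @ (X - m)           # within-class scatter
        Sb += len(X) * np.outer(m - gm, m - gm)  # between-class scatter
    return np.linalg.eigvals(np.linalg.solve(Sw, Sb)).real
```

For c classes, S_B has rank at most c - 1, so a two-class problem yields a single nonzero eigenvalue, the rest being (numerically) zero.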

Journal ArticleDOI
TL;DR: The triangulation method, recently proposed in the cluster analysis literature for mapping points from l-space to 2-space, is used to yield a simple and efficient algorithm for feature selection by interactive clustering.

Proceedings ArticleDOI
B. V. Dasarathy
05 Apr 1981
TL;DR: The proposed approach is the development of a set of target range and orientation independent features descriptive of the target geometries underlying the sensed point ensembles, which facilitates clustering of like targets and the corresponding point ensemble in the multidimensional feature space wherein each ensemble is represented by a single point.
Abstract: Recognition of targets characterized by point ensembles, for example, a set of FLIR-sensed hot spots or radar-detected reflectors, represents the topic of this study. Basic to the proposed approach is the development of a set of target range and orientation independent features descriptive of the target geometries underlying the sensed point ensembles. This facilitates clustering of like targets and the corresponding point ensembles in the multidimensional feature space, wherein each ensemble is represented by a single point, thereby leading to clusters of like ensembles. This then permits deployment of traditional pattern recognition tools for identification of unknown targets. Details of the feature set selection process and test implementation results are presented to bring out the scope and potential of the new methodology developed in this study. Prior work in this area has been reported with little or no detail offered openly, the excuse being the obvious defense application potential. The ensuing sections present our approach to this feature selection problem and the experimental evidence acquired, which supports the methodology developed here.
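One plausible construction of range- and orientation-independent features for a point ensemble (an illustration of the general idea only; the paper's actual feature set is not specified in the abstract) is the vector of pairwise distances, sorted and normalized by the largest separation:

```python
import numpy as np

def invariant_features(points, k=5):
    """Range- and orientation-independent features for a point ensemble:
    pairwise distances, sorted (invariant to point labeling and rotation)
    and divided by the largest separation (invariant to range/scale).
    The feature count k is an arbitrary choice for this sketch."""
    P = np.asarray(points, dtype=float)
    n = len(P)
    dists = [np.linalg.norm(P[i] - P[j])
             for i in range(n) for j in range(i + 1, n)]
    dists = np.sort(dists)[::-1]     # largest separation first
    feats = dists / dists[0]         # normalize out range/scale
    return feats[:k]
```

Two ensembles with the same underlying geometry, sensed at different ranges and orientations, map to the same feature vector and hence to the same point in feature space.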

Journal ArticleDOI
TL;DR: A technique for the selection of the best set of test features for checkout or go-no-go test of a complex electro-hydraulic servo system from input-output measurements is presented.