
Showing papers on "Feature selection published in 1979"


Journal ArticleDOI
TL;DR: It is shown that the Bayesian-probability-based measures used for feature selection, namely distance measures and information measures, are basically of two types; this clarifies some properties of these measures for the two-class problem and for the multiclass problem.
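
The measures themselves are not reproduced in this listing. As a rough illustration of how a probabilistic distance measure can drive feature selection, the sketch below ranks features by a Bhattacharyya distance estimated under a Gaussian assumption; the choice of distance, the Gaussian assumption, and all function names are illustrative, not taken from the paper.

```python
import numpy as np

def bhattacharyya_distance_1d(x0, x1):
    """Bhattacharyya distance between two classes for one feature, assuming
    univariate Gaussian class-conditional densities (an assumption made for
    this sketch, not stated in the paper)."""
    m0, m1 = x0.mean(), x1.mean()
    v0, v1 = x0.var(ddof=1), x1.var(ddof=1)
    return (0.25 * (m0 - m1) ** 2 / (v0 + v1)
            + 0.5 * np.log((v0 + v1) / (2.0 * np.sqrt(v0 * v1))))

def rank_features(X, y):
    """Rank the columns of X (n_samples, n_features) by class separability
    for binary labels y in {0, 1}; a larger distance means a more useful feature."""
    scores = np.array([bhattacharyya_distance_1d(X[y == 0, j], X[y == 1, j])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores

# Toy check: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0.0, 0.0], 1.0, size=(50, 2)),
               rng.normal([2.0, 0.0], 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
order, scores = rank_features(X, y)
print(order, scores.round(3))
```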

60 citations


Journal ArticleDOI
TL;DR: This paper selectively surveys contributions in linear feature selection which have been developed for the analysis of multipass LANDSAT data in conjunction with the Large Area Crop Inventory Experiment.

21 citations


Journal ArticleDOI
TL;DR: This article describes a method for feature selection from infrared spectra, intended for identification of organic compounds by computer-aided retrieval of reference data contained in small files, that takes into account the a priori probabilities of spectral features and their correlations.
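
The paper's exact criterion is not given in this summary. The sketch below is one hedged reading of the idea: binary spectral features (peak present/absent) are scored by the information content implied by their a priori probabilities in the reference file, and features strongly correlated with already chosen ones are skipped. The function names and the correlation threshold are assumptions made for illustration.

```python
import numpy as np

def feature_information(p):
    """Shannon entropy (bits) of a binary spectral feature with a priori
    probability p of occurring in the reference file; features whose peaks
    are almost always or almost never present discriminate poorly."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def select_spectral_features(F, k, max_corr=0.8):
    """Greedy selection from a binary peak/no-peak matrix F (n_spectra, n_features):
    take the k most informative features while skipping any feature whose
    correlation with an already chosen one exceeds max_corr.
    (Illustrative scheme; the paper's actual criterion may differ.)"""
    prior = F.mean(axis=0)                 # a priori probability of each feature
    info = feature_information(prior)
    corr = np.corrcoef(F, rowvar=False)
    chosen = []
    for j in np.argsort(info)[::-1]:
        if all(abs(corr[j, c]) < max_corr for c in chosen):
            chosen.append(j)
        if len(chosen) == k:
            break
    return chosen

# Toy reference file of 200 spectra with 30 binary peak indicators.
F = (np.random.default_rng(1).random((200, 30)) < 0.3).astype(float)
print(select_spectral_features(F, 5))
```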

14 citations


Journal ArticleDOI
TL;DR: A new method for linear feature selection whose underlying theme is the preservation of actual distances between training data points in the lower-dimensional space; this places the method closer to the principal components (Karhunen-Loeve) approach than to methods based on an approach through statistical pattern recognition.
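
As a rough sketch of the distance-preservation idea and its stated kinship with the Karhunen-Loeve approach, the code below projects data to a lower dimension by eigen-decomposition of the sample covariance, which for Euclidean distances coincides with a principal components projection; the paper's own construction may differ in detail.

```python
import numpy as np

def distance_preserving_projection(X, d):
    """Project X (n_samples, n_features) onto d dimensions while preserving the
    pairwise Euclidean distances between training points as well as possible.
    Classical scaling of the centered data is used, which coincides with a
    Karhunen-Loeve (principal components) projection."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)        # Karhunen-Loeve expansion
    order = np.argsort(vals)[::-1][:d]      # keep the d largest-variance directions
    return Xc @ vecs[:, order]

# Distances in the 2-D projection approximate the original 5-D distances,
# because most of the variance lies in the first two directions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) @ np.diag([5.0, 3.0, 0.5, 0.2, 0.1])
Z = distance_preserving_projection(X, 2)
```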

10 citations


Journal ArticleDOI
TL;DR: A hybrid of the slant transform (ST) and the Haar transform (HT), called the slant-Haar transform (SHT), is developed; based on standard performance criteria, its utility and effectiveness are compared with other discrete transforms in applications such as feature selection, Wiener filtering and data compression.
Abstract: A hybrid version of the slant transform (ST) and the Haar transform (HT), called the slant-Haar transform (SHT), is developed, together with its properties and fast algorithms. Based on some standard performance criteria, its utility and effectiveness are compared with those of other discrete transforms in applications such as feature selection, Wiener filtering and data compression.
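
The slant-Haar hybrid itself is not constructed here. As a point of reference for the kind of fast unitary transform being hybridized, the sketch below builds the orthonormal Haar transform matrix recursively and applies it to a short signal; the recursion and names are standard textbook material, not the paper's algorithm.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix of size n x n (n a power of two).
    Shown only as a reference point; the slant-Haar hybrid of the paper
    replaces some Haar basis vectors with slant (linearly decreasing) ones."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                    # averaging rows
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])   # local difference rows
    return np.vstack([top, bottom]) / np.sqrt(2.0)

H = haar_matrix(8)
x = np.arange(8, dtype=float)              # test signal
coeffs = H @ x                             # forward transform
assert np.allclose(H @ H.T, np.eye(8))     # unitary, so x == H.T @ coeffs
```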

8 citations


Journal ArticleDOI
TL;DR: A simple variable selection procedure, similar in nature to the “F-to-enter” criterion used in stepwise multiple regression and suitable for multidimensional contingency tables having one criterion variable, is discussed.
Abstract: This study discusses a simple variable selection procedure, similar in nature to the “F-to-enter” criterion used in stepwise multiple regression, suitable for multidimensional contingency tables having one criterion variable. The procedure is applied to an illustrative set of marketing data and contrasted with other, better-known methods.
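
A hedged sketch of such an entry-criterion procedure follows: at each step the remaining categorical predictor with the largest likelihood-ratio chi-square (G^2) against the criterion variable, conditional on the predictors already entered, is added, and selection stops when no candidate exceeds a threshold, in analogy to "F-to-enter". The G^2 statistic, the conditioning scheme and the default threshold are illustrative assumptions, not the paper's exact rule.

```python
import numpy as np

def g2(table):
    """Likelihood-ratio chi-square (G^2) for independence in a two-way table."""
    table = np.asarray(table, dtype=float)
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    mask = table > 0
    return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

def crosstab(x, y):
    """Two-way frequency table of two categorical vectors."""
    cats_x, cats_y = np.unique(x), np.unique(y)
    return np.array([[np.sum((x == a) & (y == b)) for b in cats_y] for a in cats_x])

def forward_select(X, y, g2_to_enter=3.84):
    """Forward selection of categorical predictors for a categorical criterion y.
    At each step the remaining predictor whose association with y, conditional on
    the predictors already entered, gives the largest summed G^2 is added; the
    procedure stops when no candidate exceeds the entry threshold."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining:
        # Strata are the joint levels of the variables already selected.
        strata = np.array(['|'.join(map(str, row)) for row in X[:, selected]])
        best, best_score = None, -np.inf
        for j in remaining:
            score = 0.0
            for s in np.unique(strata):
                idx = strata == s
                score += g2(crosstab(X[idx, j], y[idx]))
            if score > best_score:
                best, best_score = j, score
        if best_score < g2_to_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```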

7 citations


Journal ArticleDOI
TL;DR: The authors compared alternative methods of variable selection using data from a real estate assessment project and found that the most expensive and time-consuming method, all possible regressions, does not yield the best predictions, and that the variables selected judgmentally produced predictions not appreciably different from those of the other methods.
Abstract: This article compares alternative methods of variable selection using data from a real estate assessment project. Various stepwise methods are compared with the all possible regressions solution and a variable set selected on a priori considerations. The methods are evaluated by the predictive performance of the estimated regression equation in a holdout sample. One of the major conclusions of the study is that the most expensive and time-consuming method, all possible regressions, does not yield the best predictions. In addition, the variables selected judgmentally produced predictions that were not appreciably different from the other methods.
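
The sketch below mimics this kind of comparison on synthetic data: an all-possible-regressions search and a forward stepwise search each pick a subset on the training sample (here by adjusted R^2, an assumed criterion), an a priori subset is fixed in advance, and every subset is then judged by its holdout mean squared error. The data, criterion and subset sizes are illustrative only.

```python
import numpy as np
from itertools import combinations

def fit_ols(X, y, cols):
    A = np.column_stack([np.ones(len(X)), X[:, cols]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(X, cols, beta):
    return np.column_stack([np.ones(len(X)), X[:, cols]]) @ beta

def adjusted_r2(X, y, cols):
    resid = y - predict(X, cols, fit_ols(X, y, cols))
    n, k = len(y), len(cols)
    return 1 - (resid @ resid / (n - k - 1)) / y.var(ddof=1)

def all_possible(X, y):
    """'All possible regressions': exhaustively pick the subset with the best
    adjusted R^2 on the training data (the expensive method)."""
    p = X.shape[1]
    subsets = [list(c) for r in range(1, p + 1) for c in combinations(range(p), r)]
    return max(subsets, key=lambda c: adjusted_r2(X, y, c))

def forward_stepwise(X, y, k):
    """Greedy forward selection of up to k variables by adjusted R^2 improvement."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda j: adjusted_r2(X, y, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Evaluate each selected subset by predictive performance in a holdout sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 2 * X[:, 0] - X[:, 1] + rng.normal(size=200)
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]
for name, cols in [("all possible", all_possible(X_tr, y_tr)),
                   ("forward stepwise", forward_stepwise(X_tr, y_tr, 3)),
                   ("a priori", [0, 1])]:
    mse = np.mean((y_te - predict(X_te, cols, fit_ols(X_tr, y_tr, cols))) ** 2)
    print(f"{name:17s} cols={cols} holdout MSE={mse:.3f}")
```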

6 citations


Journal ArticleDOI
TL;DR: It is shown that for the widest possible class of approximants, the criterion reduces to Devijver's Bayesian distance.
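
For reference, Devijver's Bayesian distance is the expected sum of squared posterior probabilities, B = E[ sum_i P(class_i | x)^2 ]. The snippet below estimates it by Monte Carlo for a two-class problem with unit-variance Gaussian class-conditional densities; the example densities and priors are assumptions for illustration.

```python
import numpy as np
from math import sqrt, pi, exp

def posterior_two_gaussians(x, mu0, mu1, prior0=0.5):
    """Posterior P(class 0 | x) for two unit-variance Gaussian classes."""
    def pdf(x, mu):
        return exp(-0.5 * (x - mu) ** 2) / sqrt(2 * pi)
    p0 = prior0 * pdf(x, mu0)
    p1 = (1 - prior0) * pdf(x, mu1)
    return p0 / (p0 + p1)

def bayesian_distance(samples, mu0=0.0, mu1=2.0):
    """Monte Carlo estimate of Devijver's Bayesian distance
    B = E[ sum_i P(class_i | x)^2 ]; B grows as the classes separate."""
    q = np.array([posterior_two_gaussians(x, mu0, mu1) for x in samples])
    return np.mean(q ** 2 + (1 - q) ** 2)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 5000), rng.normal(2.0, 1.0, 5000)])
print(bayesian_distance(x))   # closer to 1 when the feature separates the classes well
```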

5 citations


Journal ArticleDOI
TL;DR: In this article, two variable selection procedures are evaluated for classification problems: a forward stepwise discrimination procedure, and a stepwise procedure preceded by a preliminary screening of variables on the basis of individual t statistics.
Abstract: Two variable selection procedures are evaluated for classification problems: a forward stepwise discrimination procedure, and a stepwise procedure preceded by a preliminary screening of variables on the basis of individual t statistics. Expected probability of correct classification is used as the measure of performance. A comparison is made of the procedures using samples from multivariate normal populations and from several nonnormal populations. The study demonstrated some situations where the use of all variables is preferable to the use of a stepwise discriminant procedure stopping after a few steps, though usually the latter procedure was superior in performance. However, where the stepwise procedure performed better than using all variables, the modified stepwise procedure performed still better. The use of modified stepwise procedures in which not all the covariances of the problem need be estimated seems promising.
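
A minimal sketch of the modified procedure follows: variables are first screened by their individual two-sample t statistics, and the survivors are then entered forward-stepwise into a Fisher linear discriminant. Resubstitution accuracy is used here as a stand-in for the paper's expected probability of correct classification, and the thresholds are illustrative.

```python
import numpy as np

def t_statistics(X, y):
    """Two-sample t statistic of each variable, used for preliminary screening."""
    X0, X1 = X[y == 0], X[y == 1]
    se = np.sqrt(X0.var(axis=0, ddof=1) / len(X0) + X1.var(axis=0, ddof=1) / len(X1))
    return (X0.mean(axis=0) - X1.mean(axis=0)) / se

def lda_accuracy(X, y, cols):
    """Resubstitution accuracy of Fisher's linear discriminant on the chosen
    columns (a simple stand-in for expected probability of correct classification)."""
    Z0, Z1 = X[y == 0][:, cols], X[y == 1][:, cols]
    pooled = ((len(Z0) - 1) * np.cov(Z0, rowvar=False) +
              (len(Z1) - 1) * np.cov(Z1, rowvar=False)) / (len(X) - 2)
    w = np.linalg.solve(np.atleast_2d(pooled), Z0.mean(0) - Z1.mean(0))
    c = 0.5 * w @ (Z0.mean(0) + Z1.mean(0))
    pred = (X[:, cols] @ w < c).astype(int)       # assign class 1 below the midpoint
    return np.mean(pred == y)

def screen_then_stepwise(X, y, t_min=2.0, k=3):
    """Screen out variables with |t| below t_min, then forward-select up to k of
    the survivors by the accuracy gain each provides when added."""
    candidates = [j for j, t in enumerate(t_statistics(X, y)) if abs(t) >= t_min]
    selected = []
    while candidates and len(selected) < k:
        best = max(candidates, key=lambda j: lda_accuracy(X, y, selected + [j]))
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy data: only variable 0 actually separates the two groups.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))
X[:60, 0] += 1.5
y = np.array([0] * 60 + [1] * 60)
print(screen_then_stepwise(X, y))
```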

3 citations


01 Sep 1979
TL;DR: The author has identified the following significant results: total computer time to conduct a complete pattern recognition process is reduced, and subjective as well as statistically derived information is made available to the analyst/user much earlier in the analysis process.
Abstract: The author has identified the following significant results. The minimum entropy model may provide several useful advantages over traditional techniques for processing LANDSAT data. Total computer time to conduct a complete pattern recognition process is reduced. Subjective (transformed image), as well as statistically derived information is made available to the analyst/user much earlier in the analysis process. A rapid feedback loop in which numerous training set combinations can be tested for difference and representativeness is available. Additional tests of LANDSAT data processing using the minimum entropy model are clearly justified.

1 citation


01 May 1979
TL;DR: In this article, a stepwise procedure based on information theoretic considerations is proposed to select a best subset of variables which incorporates as much information for discriminating as possible, when obtaining and processing the numerous variables is expensive.
Abstract: Often the scientist is faced with a large number of categorical variates which are of potential use in discriminating between two pregiven groups of objects. For example, an investor may wish to assign a particular firm to one of two possible risk groups based upon certain known characteristics of the firm (liquid to fixed asset ratio, etc.), or an engineer might wish to determine which of two models best describes a particular situation based upon the observed characteristics of the situation. This is the general problem of variable selection in discriminant analysis. When obtaining and processing the numerous variables is expensive, one must select a best subset of variables which incorporates as much information for discriminating as possible. If time is also a factor, a stepwise procedure is mandated. We propose such a stepwise procedure here based upon information theoretic considerations.
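
A hedged sketch of the information-theoretic idea follows: at each step the categorical variate whose inclusion most increases an estimated divergence (here the symmetric Kullback-Leibler J-divergence) between the two groups' cell distributions is added. The divergence estimate, the smoothing and the stopping rule are illustrative assumptions, not the report's exact statistic.

```python
import numpy as np

def j_divergence(X, groups, cols, eps=1e-9):
    """Estimated symmetric Kullback-Leibler (J) divergence between the two groups'
    distributions over the joint cells of the chosen categorical columns."""
    cells = [tuple(row) for row in X[:, cols]]
    levels = sorted(set(cells))
    p = np.array([sum(1.0 for c, g in zip(cells, groups) if c == lv and g == 0) for lv in levels])
    q = np.array([sum(1.0 for c, g in zip(cells, groups) if c == lv and g == 1) for lv in levels])
    p = (p + eps) / (p + eps).sum()        # smoothed cell probabilities, group 0
    q = (q + eps) / (q + eps).sum()        # smoothed cell probabilities, group 1
    return float(np.sum((p - q) * np.log(p / q)))

def stepwise_information_select(X, groups, k):
    """Greedy stepwise selection: at each step add the categorical variate whose
    inclusion most increases the estimated divergence between the two groups."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda j: j_divergence(X, groups, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy example: group membership depends mainly on variate 2.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(300, 6))
groups = (X[:, 2] + rng.integers(0, 2, size=300) > 2).astype(int)
print(stepwise_information_select(X, groups, 2))
```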