
Showing papers on "Dimensionality reduction published in 1985"


Journal ArticleDOI
TL;DR: A new discriminant analysis with orthonormal coordinate axes of the feature space is proposed; it is more powerful than the traditional method with respect to both the discriminatory power and the mean error probability of the coordinate axes.
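The constraint described here is that successive discriminant axes be mutually orthonormal, rather than merely conjugate with respect to the within-class scatter as in classical discriminant analysis. The sketch below is a rough illustration of that idea, not the paper's exact algorithm: it greedily maximizes the Fisher quotient within the orthogonal complement of the axes already chosen. The function name and the scatter-matrix inputs Sb, Sw are assumed notation.

```python
import numpy as np
from scipy.linalg import eigh, null_space

def orthonormal_discriminant_axes(Sb, Sw, n_axes):
    """Greedily choose axes maximizing the Fisher quotient w'Sb w / w'Sw w,
    forcing each new axis to be orthonormal to the ones already chosen."""
    d = Sb.shape[0]
    W = np.zeros((d, 0))
    for _ in range(n_axes):
        # basis of the orthogonal complement of the axes found so far
        C = null_space(W.T) if W.shape[1] else np.eye(d)
        # Fisher's generalized eigenproblem restricted to that complement
        vals, vecs = eigh(C.T @ Sb @ C, C.T @ Sw @ C)
        w = C @ vecs[:, -1]              # direction with the largest quotient
        W = np.column_stack([W, w / np.linalg.norm(w)])
    return W                             # d x n_axes, orthonormal columns
```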

153 citations


Proceedings ArticleDOI
12 Jun 1985
TL;DR: Diffraction pattern sampling provides a feature space suitable for object classification, orientation and inspection; it allows significant dimensionality reduction and can be realized with considerable flexibility, reduced size and improved performance by the use of computer-generated holograms.
Abstract: Diffraction pattern sampling provides a feature space suitable for object classification, orientation and inspection. It allows significant dimensionality reduction. These properties are best achieved by the use of specifically-shaped Fourier transform plane detector elements and this can be realized with considerable flexibility, reduced size and improved performance by the use of computer generated holograms.
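The "specifically-shaped Fourier transform plane detector elements" the abstract refers to are classically wedge- and ring-shaped: ring sums capture scale information, wedge sums capture orientation. A minimal numerical sketch of that sampling geometry follows; the detector counts and the function name are illustrative assumptions, not the paper's design.

```python
import numpy as np

def wedge_ring_features(img, n_rings=8, n_wedges=8):
    """Sample the Fourier (diffraction) plane with ring and wedge detectors."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    cy, cx = h // 2, w // 2
    r = np.hypot(y - cy, x - cx)
    # the power spectrum of a real image is symmetric, so wedges
    # over a half plane suffice
    theta = np.mod(np.arctan2(y - cy, x - cx), np.pi)
    r_edges = np.linspace(0, r.max(), n_rings + 1)
    t_edges = np.linspace(0, np.pi, n_wedges + 1)
    rings = [power[(r >= r_edges[i]) & (r < r_edges[i + 1])].sum()
             for i in range(n_rings)]
    wedges = [power[(theta >= t_edges[i]) & (theta < t_edges[i + 1])].sum()
              for i in range(n_wedges)]
    # a 2-D image is reduced to n_rings + n_wedges numbers
    return np.array(rings + wedges)
```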

29 citations


01 Nov 1985
TL;DR: In this article, Fisher's linear discriminant was combined with the Fukunaga-Koontz transform to give a useful technique for reduction of feature space from many to two or three dimensions.
Abstract: This Memorandum describes how Fisher's Linear Discriminant can be combined with the Fukunaga-Koontz transform to give a useful technique for reduction of feature space from many to two or three dimensions. Performance is seen to be superior in general to the Foley-Sammon extension to Fisher's method. The technique is then extended to show how a new radius vector (or pair of radius vectors) can be combined with Fisher's vector to produce a classifier with even more power of discrimination. Illustrations of the technique show that good discrimination can be obtained even if there is considerable overlap of classes in any single projection. Keywords: Dimensionality reduction, Discriminant vectors, Feature selection, Fisher criterion, Linear transformations, Separability.
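As a sketch of the two ingredients being combined (assuming two classes given as row-wise data matrices; the memorandum's exact combination rule may differ), Fisher's direction separates the class means, while the Fukunaga-Koontz axis picks the direction along which the two classes' scatter differs most:

```python
import numpy as np
from scipy.linalg import inv, sqrtm

def fisher_direction(X1, X2):
    """Fisher's linear discriminant between two classes."""
    m1, m2 = X1.mean(0), X2.mean(0)
    Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

def fukunaga_koontz_axis(X1, X2):
    """Axis along which the two classes' scatter differs most: whiten the
    summed scatter, then take an extreme eigenvector of class 1's scatter."""
    S1 = X1.T @ X1 / len(X1)
    S2 = X2.T @ X2 / len(X2)
    T = np.real(sqrtm(inv(S1 + S2)))      # whitening transform for S1 + S2
    vals, vecs = np.linalg.eigh(T @ S1 @ T)
    # eigenvalues of the whitened S2 are 1 - vals, so either extreme
    # (near 0 or near 1) discriminates between the classes
    k = np.argmax(np.abs(vals - 0.5))
    v = T @ vecs[:, k]
    return v / np.linalg.norm(v)
```

Projecting the data onto the pair (Fisher vector, Fukunaga-Koontz axis) yields the kind of two-dimensional display the memorandum evaluates.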

27 citations


Journal ArticleDOI
TL;DR: A dimension reduction method proposed by Odell (1979) and Decell, Odell, and Coberly (1981) for Gaussian models is extended to a general class of density functions known as θ-generalized normal densities.
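The Gaussian starting point being generalized is the classical result that, when classes share a covariance matrix, the Bayes rule depends on an observation only through at most g − 1 linear combinations. A minimal sketch of that special case follows (the θ-generalized extension itself is not reproduced here; the function name is assumed):

```python
import numpy as np

def gaussian_reduction_basis(means, cov):
    """Equal-covariance Gaussian case: Bayes discrimination among g classes
    depends on x only through its projection onto span{cov^-1 (mu_i - mu_1)},
    a subspace of dimension at most g - 1."""
    mu0 = means[0]
    D = np.stack([np.linalg.solve(cov, m - mu0) for m in means[1:]], axis=1)
    Q, R = np.linalg.qr(D)
    rank = int(np.sum(np.abs(np.diag(R)) > 1e-10))
    return Q[:, :rank]        # orthonormal basis; project data as X @ B
```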

5 citations


Journal Article
TL;DR: Euclidean distances are presented as a criterion in feature space for pattern recognition, and it is proved that the coordinates of the dimension-reduction space, namely the eigenvectors with maximum eigenvalues of the intraset covariance matrix of patterns, are not suitable for pattern classification.
Abstract: Euclidean distances are presented as a criterion in feature space for pattern recognition. Classification results are satisfactory if the intraset Euclidean distance is minimized and the interset Euclidean distance is maximized. It can be proved that the coordinates of the dimension-reduction space given by the eigenvectors with maximum eigenvalues of the intraset covariance matrix of patterns, usually called the K-L expansion feature selection method, are not suitable for pattern classification. Three feature selection methods, minimum intraset Euclidean distance feature selection, maximum interset Euclidean distance feature selection and compound Euclidean distance feature selection, are proposed to reduce the dimensionality of measurement vectors from the viewpoint of Euclidean distance, orthogonal transformations and quadratic optimization. Euclidean feature selection is explicit as well as simple. The other feature selection methods based on orthogonal transformations can be unified conceptually by the method presented in this paper.
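A useful identity behind these criteria: for patterns drawn from a set with covariance Σ, the mean squared intraset Euclidean distance equals 2·trace(Σ), so minimizing or maximizing projected distances reduces to an eigenvalue problem. The sketch below is one plausible reading of the three proposed selectors; the exact compound criterion in the paper may differ (here it is taken, as an assumption, to be the top eigenvectors of Sb − Sw).

```python
import numpy as np

def euclidean_feature_selection(class_data, k, mode="compound"):
    """Select k orthonormal axes by Euclidean-distance criteria.
    'intra':    minimize projected intraset distance (smallest eigenvalues of Sw)
    'inter':    maximize projected interset distance (largest eigenvalues of Sb)
    'compound': trade the two off (here: largest eigenvalues of Sb - Sw)."""
    grand = np.vstack(class_data).mean(axis=0)
    Sw = sum(np.cov(X, rowvar=False) for X in class_data) / len(class_data)
    Sb = sum(np.outer(X.mean(0) - grand, X.mean(0) - grand)
             for X in class_data) / len(class_data)
    if mode == "intra":
        vals, vecs = np.linalg.eigh(Sw)
        idx = np.argsort(vals)[:k]
    elif mode == "inter":
        vals, vecs = np.linalg.eigh(Sb)
        idx = np.argsort(vals)[::-1][:k]
    else:
        vals, vecs = np.linalg.eigh(Sb - Sw)
        idx = np.argsort(vals)[::-1][:k]
    return vecs[:, idx]          # project patterns with X @ W
```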

2 citations


Journal ArticleDOI
TL;DR: This paper describes the algorithm implementing the Linear Dependency Analysis procedure, which assesses the existence of linear dependencies in a multivariate data matrix X, and examines the potential for partitioning the n × p matrix X into the n × p1 matrix X1 containing the predictor variables and the n × p2 matrix X2 containing the estimated variables.
Abstract: The importance of detecting statistical dependencies in multivariate data has been discussed many times (e.g., Belsley et al. [1]). Recently, Kane et al. [5] developed a procedure called Linear Dependency Analysis (LDA), which assesses the existence of linear dependencies in a multivariate data matrix X. This paper describes the algorithm implementing the LDA procedure. A brief description of some of the statistical and linear-algebra theory behind the procedure is given below for notational purposes; Kane et al. [5] should be consulted for additional details and discussion of the procedure's theoretical foundations. LDA examines the potential for partitioning the n × p matrix X into the n × p1 matrix X1 containing the predictor variables and the n × p2 matrix X2 containing the estimated variables, and the appropriateness of using the linear relationship between them.
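The actual LDA algorithm is specified in Kane et al. [5]; the sketch below only illustrates the two questions the abstract raises, under assumed names and a hypothetical tolerance: are there near-linear dependencies in X, and how well does the predictor block X1 linearly reproduce the estimated block X2?

```python
import numpy as np

def linear_dependency_check(X, pred_cols, est_cols, tol=1e-8):
    """Count near-linear dependencies in X via its singular values, then
    fit the 'estimated' block X2 from the 'predictor' block X1."""
    s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
    n_dependencies = int(np.sum(s < tol * s[0]))   # near-zero singular values
    X1, X2 = X[:, pred_cols], X[:, est_cols]
    B, *_ = np.linalg.lstsq(X1, X2, rcond=None)    # X2 ~ X1 @ B
    resid = X2 - X1 @ B
    r_squared = 1.0 - resid.var(axis=0) / X2.var(axis=0)  # per-column fit
    return n_dependencies, B, r_squared
```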

1 citation