
Showing papers on "Dimensionality reduction published in 1969"


Journal ArticleDOI
TL;DR: An algorithm for the analysis of multivariate data, based upon a point mapping of N L-dimensional vectors from the L-space to a lower-dimensional space such that the inherent data "structure" is approximately preserved, is presented along with some experimental results.
Abstract: An algorithm for the analysis of multivariate data is presented along with some experimental results. The algorithm is based upon a point mapping of N L-dimensional vectors from the L-space to a lower-dimensional space such that the inherent data "structure" is approximately preserved.

3,460 citations
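The abstract above only states that the N L-dimensional vectors are mapped to a lower-dimensional space so that the inherent data "structure" is approximately preserved. A common way to make that concrete is to place the N points in the low-dimensional space so that pairwise distances match the original ones as closely as possible, and to minimize the resulting stress by gradient descent. The sketch below follows that reading in plain NumPy; the distance weighting, initialization, learning rate, and iteration count are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def distance_preserving_map(X, out_dim=2, iters=1000, lr=0.2, seed=0):
    """Map N L-dimensional vectors to `out_dim` dimensions so that pairwise
    distances are approximately preserved, via gradient descent on a
    distance-weighted stress between original and mapped distances."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # original distances
    eps = 1e-9
    scale = D.sum() / (N * (N - 1))                              # mean original distance
    Y = rng.normal(scale=scale, size=(N, out_dim))               # random initial layout
    w = 1.0 / (D + eps)                                          # emphasize small distances
    np.fill_diagonal(w, 0.0)
    for _ in range(iters):
        diff = Y[:, None, :] - Y[None, :, :]
        d = np.linalg.norm(diff, axis=-1) + eps                  # current distances
        coeff = 2.0 * w * (d - D) / d                            # d(stress)/d(d_ij)
        grad = (coeff[:, :, None] * diff).mean(axis=1)           # chain rule back to Y
        Y -= lr * grad                                           # plain gradient step
    return Y

# usage: embed 100 random 10-dimensional points in the plane
Y = distance_preserving_map(np.random.default_rng(1).random((100, 10)), out_dim=2)
```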


Journal ArticleDOI
George Nagy
TL;DR: A modified version of the Isodata or K-means clustering algorithm is applied to a set of patterns originally proposed by Block, Nilsson, and Duda, and to another artificial alphabet.
Abstract: The objects and methods of automatic feature extraction on binary patterns are briefly reviewed. An intuitive interpretation for geometric features is suggested whereby such a feature is conceived of as a cluster of component vectors in pattern space. A modified version of the Isodata or K-means clustering algorithm is applied to a set of patterns originally proposed by Block, Nilsson, and Duda, and to another artificial alphabet. Results are given in terms of a figure-of-merit which measures the deviation between the original patterns and the patterns reconstructed from the automatically derived feature set.

28 citations
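One way to read "a feature is a cluster of component vectors in pattern space" is to treat each pixel of the binary patterns as a vector of its values across the pattern set, cluster those vectors with K-means, and take each cluster of pixels as one derived feature. The sketch below then reconstructs every pattern from its per-feature mean activations and reports a simple figure-of-merit (mean reconstruction deviation). This clustering of pixel columns, the reconstruction rule, and the figure-of-merit are my assumptions for illustration, not the paper's exact definitions.

```python
import numpy as np
from sklearn.cluster import KMeans

def extract_features(patterns, n_features=8, seed=0):
    """patterns: (n_patterns, n_pixels) binary array.
    Cluster pixel columns (each pixel viewed as a vector in pattern space)
    and return the cluster label of every pixel as the derived feature set."""
    pixel_vectors = patterns.T                       # one vector per pixel component
    km = KMeans(n_clusters=n_features, n_init=10, random_state=seed)
    return km.fit_predict(pixel_vectors)             # feature index per pixel

def figure_of_merit(patterns, pixel_labels, n_features):
    """Reconstruct each pattern from per-feature mean activation and
    return the mean absolute deviation from the originals."""
    recon = np.zeros_like(patterns, dtype=float)
    for f in range(n_features):
        mask = pixel_labels == f
        # every pixel in a feature gets that feature's mean value in the pattern
        recon[:, mask] = patterns[:, mask].mean(axis=1, keepdims=True)
    return np.abs(patterns - recon).mean()

# usage (random binary "alphabet" as a stand-in for real character patterns):
pats = (np.random.default_rng(0).random((26, 64)) > 0.5).astype(float)
labels = extract_features(pats, n_features=8)
print(figure_of_merit(pats, labels, n_features=8))
```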


Journal Article
TL;DR: The proposed algorithm is applied to a decision making problem through a numerical example, which demonstrates that it effectively performs the dimensionality reduction.
Abstract: Dimensionality reduction plays an effective role in downsizing data that contain irregular factors and in acquiring a set of significant factors from the information. Often, many of the attributes in the information are correlated and hence redundant. The process of dimensionality reduction has wide applicability in decision making problems where a large number of factors are involved. To handle the impreciseness in the decision making factors, the information is expressed as Pythagorean fuzzy information in the form of a soft matrix. This representation carries the parameters degree of membership, degree of indeterminacy (neutral), and degree of nonmembership, giving a broader coverage of the information. We first provide a technique for finding a threshold element and threshold value for information given in the form of a Pythagorean fuzzy soft matrix. Further, the proposed definitions of the object-oriented Pythagorean fuzzy soft matrix and the parameter-oriented Pythagorean fuzzy soft matrix are utilized to outline an algorithm for dimensionality reduction in the decision making process. The proposed algorithm is applied to a decision making problem with the help of a numerical example. A comparative analysis against existing methodologies is also presented, with comparative remarks and additional advantages. The example validates the contribution and demonstrates that the proposed algorithm effectively performs the dimensionality reduction. The proposed dimensionality reduction technique may further be applied to enhance the performance of large scale image retrieval.

5 citations
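The abstract leaves the threshold construction and the reduction step unspecified, so the sketch below is only a rough illustration of the general idea: each entry of a Pythagorean fuzzy soft matrix is a (membership, nonmembership) pair with mu^2 + nu^2 <= 1, entries are ranked with the standard Pythagorean fuzzy score function mu^2 - nu^2, and parameters (columns) whose aggregate score falls below a chosen threshold are dropped. The threshold rule and the column-wise aggregation are assumptions, not the paper's object-oriented or parameter-oriented constructions.

```python
import numpy as np

def score(mu, nu):
    """Standard Pythagorean fuzzy score function (requires mu**2 + nu**2 <= 1);
    the indeterminacy degree is pi = sqrt(1 - mu**2 - nu**2)."""
    return mu**2 - nu**2

def reduce_parameters(mu, nu, threshold=None):
    """mu, nu: (n_objects, n_parameters) membership / nonmembership grades of a
    Pythagorean fuzzy soft matrix. Drop parameters whose mean score falls below
    the threshold (default: the overall mean score, an assumed threshold value).
    Returns the indices of the retained parameters."""
    s = score(mu, nu)                         # entry-wise scores
    col_scores = s.mean(axis=0)               # aggregate score per parameter
    if threshold is None:
        threshold = s.mean()                  # assumed threshold value
    return np.flatnonzero(col_scores >= threshold)

# usage: 4 objects evaluated against 5 parameters
rng = np.random.default_rng(1)
mu = rng.uniform(0.2, 0.9, size=(4, 5))
nu = np.sqrt(np.clip(1 - mu**2, 0, 1)) * rng.uniform(0.1, 0.9, size=(4, 5))
print(reduce_parameters(mu, nu))              # parameters kept after reduction
```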


Proceedings ArticleDOI
01 Nov 1969
TL;DR: It is known that R linearly separable classes of multi-dimensional pattern vectors can always be represented in a feature space of at most R dimensions; an approach is developed which can frequently be used to find a non-orthogonal transformation that projects the patterns into a feature space of considerably lower dimensionality.
Abstract: It is known that R linearly separable classes of multi-dimensional pattern vectors can always be represented in a feature space of at most R dimensions. An approach is developed which can frequently be used to find a non-orthogonal transformation to project the patterns into a feature space of considerably lower dimensionality. Examples involving classification of handwritten and printed digits are used to illustrate the technique.

1 citation
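The abstract gives the existence result (R linearly separable classes fit in a feature space of at most R dimensions) but not the particular non-orthogonal transformation. One standard way to realise an R-dimensional representation is to fit one linear discriminant per class and use the stacked weight vectors as a (generally non-orthogonal) projection, so each pattern is described by its R discriminant responses. The least-squares one-vs-rest fit below is an illustrative stand-in for that idea, not the construction from the paper.

```python
import numpy as np

def class_discriminant_projection(X, y):
    """X: (n_samples, L) patterns, y: integer class labels (R classes).
    Fit one linear discriminant per class by least squares (one-vs-rest)
    and project every pattern onto the R weight vectors, giving an
    R-dimensional, generally non-orthogonal feature space."""
    classes = np.unique(y)
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias term
    T = (y[:, None] == classes[None, :]).astype(float)   # one-hot targets
    W, *_ = np.linalg.lstsq(Xa, T, rcond=None)           # (L+1, R) weight vectors
    return Xa @ W, W                                      # R discriminant responses

# usage: project 100 ten-dimensional patterns from 3 classes into 3 features
rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=100)                          # 3 classes
X = rng.normal(size=(100, 10)) + 3.0 * y[:, None]         # roughly separable clusters
features, W = class_discriminant_projection(X, y)
print(features.shape)                                     # (100, 3)
```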