
Showing papers on "Feature vector published in 1972"


Book ChapterDOI
C.W. Swonger
01 Jan 1972
TL;DR: An iterative algorithm for selecting a “consistent subset” of samples for use in a condensed nearest-neighbor decision rule is described, designed to provide several practical advantages over those previously reported.
Abstract: In this paper, an iterative algorithm for selecting a “consistent subset” (in the terminology of Hart) of samples for use in a condensed nearest-neighbor decision rule is described. The algorithm is designed to provide several practical advantages over those previously reported. Results are presented of applying the sample-set condensation algorithm to a set of 275-dimensional binary feature vectors for mixed-quality, mixed-font alphanumeric characters from mail addresses and other machine-imprinted material. The recognition performance experimentally obtained using the algorithm is presented and its behavior is discussed.
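The selection procedure in the Hart tradition can be sketched as a simple absorption loop (a minimal sketch of Hart's condensed nearest-neighbor condensation, not the specific iterative variant this paper proposes; the function and variable names are our own):

```python
import numpy as np

def condensed_nn(X, y):
    """Select a consistent subset in the style of Hart's condensed NN rule.

    Every training sample is classified correctly by a 1-NN rule over the
    returned subset, so the subset is "consistent" with the full sample set.
    """
    subset = [0]            # seed the subset with the first sample
    changed = True
    while changed:          # sweep until no sample is absorbed
        changed = False
        for i in range(len(X)):
            if i in subset:
                continue
            # classify sample i by its nearest neighbor in the current subset
            d = np.linalg.norm(X[subset] - X[i], axis=1)
            nearest = subset[int(np.argmin(d))]
            if y[nearest] != y[i]:
                subset.append(i)   # misclassified: absorb into the subset
                changed = True
    return sorted(subset)
```

On well-separated classes the loop typically keeps only a small fraction of the samples, which is the practical point of condensation: the 1-NN rule then needs far fewer distance computations at recognition time.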

48 citations


Journal ArticleDOI
TL;DR: Feature extraction has been recognized as a useful technique for pattern recognition: a mapping is constructed from the measurement space to a feature space, often by optimizing the parameters of a specified parametric family with respect to a separability criterion.
Abstract: Feature extraction has been recognized as a useful technique for pattern recognition. Feature extraction is accomplished by constructing a mapping from the measurement space to a feature space. Often, the mapping is chosen from an arbitrarily specified parametric family by optimizing the parameters with respect to a separability criterion.
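One classical instance of such a parametric mapping is Fisher's linear discriminant, where the parameter vector of a linear map to a one-dimensional feature space is chosen in closed form to maximize a between-class/within-class scatter ratio — one concrete separability criterion (a minimal two-class sketch, not this paper's own method):

```python
import numpy as np

def fisher_direction(X0, X1):
    """Linear mapping w from measurement space to a 1-D feature space,
    chosen to maximize the between-/within-class scatter ratio
    (Fisher's criterion); two-class case, closed-form solution."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, m1 - m0)   # maximizer of the Fisher criterion
    return w / np.linalg.norm(w)       # scale is irrelevant; normalize
```

Projecting samples with `X @ w` then yields scalar features whose class means are pushed apart relative to the within-class spread.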

41 citations


Book ChapterDOI
K.S. Fu
01 Jan 1972
TL;DR: This chapter discusses syntactic pattern recognition and stochastic languages, with emphasis on the description of noisy and/or distorted patterns and the learning of grammar from the actual pattern samples.
Abstract: This chapter discusses syntactic pattern recognition and stochastic languages. Emphasis is on the description of noisy and/or distorted patterns and the learning of grammars from actual pattern samples. It has been demonstrated by several very simple examples, sometimes with rather heuristic justifications, that the use of probability information in syntactic pattern recognition makes the syntactic approach more flexible and attractive, and it is expected to improve the efficiency and flexibility of the analysis procedure. The many techniques used to solve pattern recognition problems may be grouped into two general approaches: (1) the decision-theoretic or discriminant approach and (2) the syntactic or linguistic approach. In the decision-theoretic approach, a set of characteristic measurements is extracted from the patterns; the recognition of each pattern, i.e., its assignment to a pattern class, is usually made by partitioning the feature space. The area of syntactic pattern recognition, though very promising, is still in its infancy. Many problems, such as primitive selection, flexible and powerful pattern description techniques, and efficient analysis and inference procedures, still need to be solved.
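The idea of attaching probabilities to productions can be illustrated with a toy right-linear stochastic grammar, where a pattern string receives a derivation probability rather than a hard accept/reject — the natural way to score noisy or distorted patterns (the grammar below is invented for illustration):

```python
# Toy stochastic grammar: S -> a S (prob 0.4) | b (prob 0.6).
# Uppercase keys are nonterminals; everything else is a terminal symbol.
RULES = {"S": [(("a", "S"), 0.4), (("b",), 0.6)]}

def string_probability(s, stack=("S",)):
    """Total probability of deriving string s from the stack of symbols.

    Works for this right-linear grammar by summing over all leftmost
    derivations (exponential in general, fine for a toy example)."""
    if not stack:
        return 1.0 if s == "" else 0.0
    head, rest = stack[0], stack[1:]
    if head not in RULES:              # terminal: must match next character
        return string_probability(s[1:], rest) if s[:1] == head else 0.0
    total = 0.0
    for rhs, p in RULES[head]:         # nonterminal: sum over productions
        total += p * string_probability(s, rhs + rest)
    return total
```

A distorted string that the deterministic grammar would simply reject instead gets probability 0, while legal strings are ranked by how likely the grammar is to generate them.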

38 citations



01 May 1972
TL;DR: Machine recognition of handprinted characters is accomplished on a reference alphabet consisting of the 47 characters of the FORTRAN Programming Language (alphanumeric +-=*/()$., and blank).
Abstract: Machine recognition of handprinted characters is accomplished on a reference alphabet consisting of the 47 characters of the FORTRAN Programming Language (alphanumeric +-=*/()$., and blank). The system consists of an optical scanner reader (OSR), which digitizes data with a maximum vertical resolution of 48 lines, and a software measurement-decision system. The OSR operates at 10 characters/second in conjunction with an XDS 930 computer and produces digitized characters on magnetic tape. Recognition is done at rates of 3.5 to 6 characters/second off-line via the measurement-decision system operating on a CDC 6400/6600 computer system. The measurement system generates a directed-graph representation of each character and extracts a primary feature vector consisting of cavity (both open and closed) and spur information. Other specialized measurements are made later in the recognition process under the control of a deterministic decision tree.
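The flavor of a deterministic decision tree over such a primary feature vector can be sketched as follows (the tests, feature values, and character classes below are invented for illustration and are not the paper's actual tree):

```python
def classify(closed_cavities, open_cavities, spurs):
    """Toy deterministic decision tree over a primary feature vector of
    cavity and spur counts (illustrative only; not the paper's tree)."""
    if closed_cavities == 2:
        return "8"                         # e.g. '8' has two closed cavities
    if closed_cavities == 1:
        return "0" if spurs == 0 else "6"  # spur distinguishes '0' from '6'
    # no closed cavities: fall through to open-cavity tests
    if open_cavities >= 1:
        return "C" if spurs == 2 else "U"
    return "1"                             # no cavities at all: a stroke
```

In the paper's system, only characters that such primary tests cannot separate trigger the later, more specialized measurements.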

12 citations


Journal ArticleDOI
TL;DR: The problem of feature selection in multi-class pattern recognition is viewed as that of mapping vector samples from n-dimensional space to m-dimensional space (m < n).
Abstract: The problem of feature selection in multi-class pattern recognition is viewed as that of mapping vector samples from n-dimensional space to m-dimensional space (m < n).
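A common concrete choice for such a mapping from n-dimensional to m-dimensional space (m < n) is a linear projection onto the top principal directions (a minimal sketch using PCA purely as an illustration; the paper's own mapping may differ):

```python
import numpy as np

def linear_feature_map(X, m):
    """Map n-dimensional samples to m dimensions (m < n) with a linear
    transform; the rows of W are the top-m principal directions of the
    centered data (one common choice of dimension-reducing mapping)."""
    Xc = X - X.mean(axis=0)                         # center the samples
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:m]                                      # m x n projection matrix
    return X @ W.T, W                               # mapped samples, mapping
```

The rows of `W` are orthonormal, so the map preserves as much sample variance as any m-dimensional linear projection can.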

5 citations