Showing papers in "Pattern Recognition Letters in 1989"
••
TL;DR: The preliminary results suggest that GA is a powerful means of reducing the time for finding near-optimal subsets of features from large sets.
848 citations
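The GA-based subset search can be sketched roughly as follows. This is a toy illustration only, not the paper's implementation: the feature count, the set of "useful" features, and the fitness function are all invented for the example.

```python
import random

# Toy GA for feature-subset selection (illustrative sketch).
# A subset is a bit mask over N_FEATURES; fitness is a stand-in
# that rewards picking the hypothetical informative features
# while penalizing subset size.
USEFUL = {1, 3, 5, 7}          # hypothetical informative features
N_FEATURES = 10

def fitness(mask):
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & USEFUL) - 0.1 * len(chosen)

def ga_select(pop_size=20, generations=40, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_FEATURES)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:               # occasional bit-flip mutation
                child[rng.randrange(N_FEATURES)] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga_select()
```

The point of the paper's preliminary results is that a search of this style visits far fewer subsets than exhaustive enumeration while still landing near the optimum.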
••
TL;DR: A hierarchical image merging scheme based on a multiresolution contrast decomposition (the ratio of low-pass pyramid) is introduced to preserve those details from the input images that are most relevant to visual perception.
611 citations
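The core of contrast-based merging can be shown in one dimension (a toy sketch; the paper works on full ratio-of-low-pass image pyramids, and the window size here is an arbitrary choice): local contrast is the signal divided by its local mean, and at each position the merged output takes the input sample whose contrast deviates most from 1.

```python
# 1-D sketch of contrast-driven merging (not the paper's pyramid scheme).

def local_mean(sig, r=1):
    # mean over a window of radius r, clipped at the borders
    n = len(sig)
    out = []
    for i in range(n):
        window = sig[max(0, i - r):min(n, i + r + 1)]
        out.append(sum(window) / len(window))
    return out

def merge(a, b):
    la, lb = local_mean(a), local_mean(b)
    out = []
    for i in range(len(a)):
        ca = a[i] / la[i] if la[i] else 1.0   # local contrast of a
        cb = b[i] / lb[i] if lb[i] else 1.0   # local contrast of b
        # keep whichever sample departs more from unit contrast
        out.append(a[i] if abs(ca - 1) >= abs(cb - 1) else b[i])
    return out

fused = merge([1, 1, 9, 1, 1], [1, 1, 1, 1, 1])
```

The spike in the first signal has high local contrast, so it survives into the fused output even though the second signal is flat there.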
••
TL;DR: A multiresolution image representation is presented in which iterative morphological filters of many scales but identical shape serve as basis functions and is well suited for VLSI implementation.
256 citations
••
TL;DR: It is shown that every fuzzy group can be embedded in a fuzzy group of the group of automorphisms of some fuzzy graph.
141 citations
••
TL;DR: An efficient algorithm for determining perspective structures such as vanishing points and horizon lines in indoor scenes, formulated as a hierarchical Hough transform on the pyramidal sphere, is presented.
140 citations
••
TL;DR: An object is numerically retrieved from its hologram, when the distance between them is unknown, by using a focus measure; the self-entropy of the magnitude and that of the phase are both suitable measures.
102 citations
••
TL;DR: A new threshold selection technique using a correlation is presented, which selects an optimum threshold by maximizing the correlation between the original halftone image and the threshold bilevel image.
82 citations
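The criterion is simple enough to sketch directly (an illustrative toy in Python; the pixel values below are invented, and a real implementation would work on histograms rather than raw pixel lists): try each candidate threshold and keep the one whose bilevel image correlates best with the original.

```python
# Threshold selection by maximizing correlation between the
# grey-level image and its thresholded bilevel version (toy sketch).

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0          # a constant image carries no correlation
    return cov / (vx * vy) ** 0.5

def best_threshold(pixels):
    best_t, best_r = None, -2.0
    for t in sorted(set(pixels)):
        bilevel = [1 if p >= t else 0 for p in pixels]
        r = correlation(pixels, bilevel)
        if r > best_r:
            best_t, best_r = t, r
    return best_t

pixels = [10, 12, 11, 200, 210, 205, 13, 198]
t = best_threshold(pixels)
```

On this two-cluster toy image the winning threshold is the one that cleanly separates the dark and bright clusters.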
••
TL;DR: Interval coding of binary images provides a representation in which the mathematical morphology operations of dilation and erosion by an arbitrary structuring element can be naturally and efficiently implemented on a serial computer.
81 citations
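The interval (run-length) idea is easy to see in one dimension (a minimal sketch, not the paper's full scheme for arbitrary structuring elements): each row of a binary image is stored as `(start, end)` runs of foreground pixels, and dilation by a symmetric 1-D structuring element of half-width `r` just widens and merges the runs.

```python
# Run-length dilation sketch: widen each foreground run by r on both
# sides, then merge runs that now touch or overlap.

def dilate_runs(runs, r):
    widened = sorted((s - r, e + r) for s, e in runs)
    merged = []
    for s, e in widened:
        if merged and s <= merged[-1][1] + 1:
            # adjacent or overlapping: extend the previous run
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged
```

Erosion works analogously by shrinking runs and discarding those that vanish, which is why the representation suits serial computers: the work is proportional to the number of runs, not the number of pixels.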
••
TL;DR: Here it is shown that a single plane in parameter space can be employed, with a consequent gain in efficiency, and is especially suitable for ellipses of low eccentricity.
80 citations
••
TL;DR: A method based on clustering is described for estimating the parameters of a finite mixture of normal distributions, using fuzzy hypervolume and density criteria; it incorporates unsupervised tracking of initial cluster centers during its first stage.
74 citations
••
TL;DR: A classification procedure based upon local band descriptors is described; the results show the suitability of the local description method and its ability to visualize the image processing technique at the level of the chromosome image.
••
TL;DR: It is shown that the energy feature detector is a true projection and does not proliferate edges when applied to a line-drawing, whereas several of the conventional operators do.
••
TL;DR: A new concept of Floating Approximation, based on Pawlak's theory of Rough Sets and the existence of ‘hidden attributes’ in knowledge representation systems, is introduced, and a simplified algorithm developed and implemented by the author is described.
••
TL;DR: Algorithms based on minimisation of fuzzy compactness are developed, making it possible to obtain automatically both fuzzy and nonfuzzy skeletons of an image.
••
TL;DR: It is shown that the entropy of a simple polygon is maximal if and only if it is convex and a similar but less computationally burdensome measure is also proffered.
••
TL;DR: A divide-and-conquer Hough transform technique is presented for detecting a given number of straight edges or lines in an image; it requires only O(log n) computational steps for an image of size n × n.
••
TL;DR: A topographic neural network model (Kohonen, 1984) may be used to compress synthetic aperture radar (SAR) images by up to a factor of 8.
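A generic Kohonen-style net can be sketched in miniature (this is not the paper's SAR pipeline; the scalar data, codebook size, learning-rate schedule, and neighbourhood weights are all invented): a small codebook of prototypes self-organizes on sample values, and "compressing" a value then means storing only the index of its best-matching prototype.

```python
import random

# Minimal 1-D self-organizing map used as a vector quantizer.

def train_som(samples, n_units=4, epochs=50, seed=1):
    rng = random.Random(seed)
    lo, hi = min(samples), max(samples)
    weights = [rng.uniform(lo, hi) for _ in range(n_units)]
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)      # decaying learning rate
        for x in samples:
            winner = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(n_units):
                # winner moves most; immediate neighbours move a little
                h = 1.0 if i == winner else (0.3 if abs(i - winner) == 1 else 0.0)
                weights[i] += lr * h * (x - weights[i])
    return weights

def compress(x, weights):
    # store only the index of the nearest prototype
    return min(range(len(weights)), key=lambda i: abs(weights[i] - x))

samples = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0, 20.0, 21.0, 22.0, 30.0, 31.0, 32.0]
weights = train_som(samples)
```

Replacing each sample with a short codebook index is where the compression factor comes from; the quoted factor of 8 depends on the paper's block size and codebook size.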
••
TL;DR: A planar shape normalization method is presented that neutralizes the effect of shape skewing by turning a perceived shape into its most compact form through linear transformations.
••
TL;DR: The problem of normal mixture decomposition is discussed in terms of fuzzy classification variables, introduced as the class assignment probabilities; this makes it possible to develop a formalism for describing an intuitive notion of overlap between classes.
••
TL;DR: The purpose of this paper is to investigate several methods of assigning class memberships to sets of vectors, and to examine the effects of initialization schemes in these fuzzy pattern classifiers.
••
TL;DR: Good pseudo-Euclidean distance transformations on hexagonal grids are derived using only local operations and preferably integer arithmetic.
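The hexagonal case is grid-specific, but the underlying idea of a two-pass distance transform with small integer weights is easy to show on a square grid (a sketch of the classic 3-4 chamfer transform, not the paper's hexagonal masks): weight 3 for edge neighbours and 4 for diagonal ones approximates three times the Euclidean distance using only local operations and integer arithmetic.

```python
# Two-pass 3-4 chamfer distance transform on a square grid (sketch).
INF = 10**9

def chamfer_34(bitmap):
    h, w = len(bitmap), len(bitmap[0])
    # feature pixels start at 0, background at "infinity"
    d = [[0 if bitmap[y][x] else INF for x in range(w)] for y in range(h)]
    fwd = [(-1, -1, 4), (-1, 0, 3), (-1, 1, 4), (0, -1, 3)]
    bwd = [(1, 1, 4), (1, 0, 3), (1, -1, 4), (0, 1, 3)]
    for y in range(h):                       # forward raster pass
        for x in range(w):
            for dy, dx, c in fwd:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    for y in range(h - 1, -1, -1):           # backward raster pass
        for x in range(w - 1, -1, -1):
            for dy, dx, c in bwd:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    return d
```

On a hexagonal grid every pixel has six equidistant neighbours, which is why good integer weight sets exist there too.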
••
TL;DR: A solution method, based on nonlinear programming techniques, is presented for the problem of matching two series of observations of a 2-D contour, in which each sampling of the contour has different hidden parts.
••
TL;DR: The Extended Safe Point Thinning Algorithm (ESPTA) is presented for thinning 3-D images and it is shown that ESPTA preserves the 18-connectivity of the image.
••
TL;DR: A binary Hough transform derived from the conventional Hough transform with slope/intercept parameterization, and a systolic architecture for its efficient implementation using only adders and delay elements, are presented.
••
TL;DR: It is explained how fuzziness is incorporated in the structure of the net and its influence on the matching process is clarified, and interesting aspects of matching and inverse matching procedures are underlined.
••
TL;DR: Experimental results show that this improved algorithm for extracting a simplified skeletal version of digital patterns compares favorably with the one proposed by Abdulla-Saleh-Morad.
••
TL;DR: It has been found that this approach reduces computer classification time at a reasonable cost in classification accuracy.
••
TL;DR: This paper presents a way to match polygon fragments using polygon moments and cross moments to compute a dissimilarity measure between two fragments, and finds the coordinate transform that maps one fragment onto the other.
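The low-order moments involved can be sketched directly (a toy version: the shoelace formulas below give area and centroid, and the dissimilarity shown compares areas only, whereas the paper uses higher-order polygon and cross moments and also recovers the mapping transform):

```python
# Low-order polygon moments via the shoelace formula (sketch).

def polygon_moments(pts):
    # returns (signed area, centroid x, centroid y)
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return a, cx / (6 * a), cy / (6 * a)

def dissimilarity(p, q):
    # toy measure: area difference only; translation-invariant
    ap, _, _ = polygon_moments(p)
    aq, _, _ = polygon_moments(q)
    return abs(ap - aq)
```

Because moments of this kind are invariant to translation (and, suitably normalized, to rotation and scale), matching fragments reduces to comparing short descriptor vectors.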
••
TL;DR: Two optimal (Bayes) strategies for performing the classification at each nonterminal node are derived for a multistage classifier based on a decision tree scheme.
••
TL;DR: A controlled continuity constraint, which provides local spatial control over the smoothness of the solution, enables the problem to be regularized while preserving disparity discontinuities.