
Showing papers by "Azriel Rosenfeld published in 1976"


Book
01 Jan 1976
TL;DR: The rapid rate at which the field of digital picture processing has grown in the past five years has necessitated extensive revisions and the introduction of topics not found in the original edition.
Abstract: The rapid rate at which the field of digital picture processing has grown in the past five years has necessitated extensive revisions and the introduction of topics not found in the original edition.

4,231 citations


Journal ArticleDOI
01 Jun 1976
TL;DR: This paper formulates the ambiguity-reduction process in terms of iterated parallel operations (i.e., relaxation operations) performed on an array of (object, identification) data.
Abstract: Given a set of objects in a scene whose identifications are ambiguous, it is often possible to use relationships among the objects to reduce or eliminate the ambiguity. A striking example of this approach was given by Waltz [13]. This paper formulates the ambiguity-reduction process in terms of iterated parallel operations (i.e., relaxation operations) performed on an array of (object, identification) data. Several different models of the process are developed, convergence properties of these models are established, and simple examples are given.
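
As a concrete illustration of the discrete ("Waltz-style") case, here is a minimal sketch of relaxation by iterated constraint filtering: a label is repeatedly discarded from an object whenever some neighbor offers no compatible label. The function name and the toy compatibility relation are assumptions for illustration; the paper's probabilistic models are not shown.

```python
# A minimal sketch of discrete relaxation labeling (constraint filtering),
# not the paper's full set of models; names and the toy example are assumed.
def discrete_relaxation(labels, neighbors, compatible):
    """
    labels:     dict mapping object -> set of candidate labels
    neighbors:  dict mapping object -> list of neighboring objects
    compatible: compatible(i, li, j, lj) is True when label li at object i
                is consistent with label lj at the neighboring object j
    Repeatedly delete any label that has no compatible partner at some
    neighbor, until no further deletions occur (a fixed point).
    """
    labels = {i: set(ls) for i, ls in labels.items()}
    changed = True
    while changed:
        changed = False
        for i, cand in labels.items():
            for li in list(cand):
                if any(not any(compatible(i, li, j, lj) for lj in labels[j])
                       for j in neighbors[i]):
                    cand.discard(li)   # li is unsupported at some neighbor
                    changed = True
    return labels

# Toy example: two adjacent objects that must receive different labels.
labs = {0: {"A", "B"}, 1: {"A"}}
nbrs = {0: [1], 1: [0]}
print(discrete_relaxation(labs, nbrs, lambda i, li, j, lj: li != lj))
# -> {0: {'B'}, 1: {'A'}}
```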

1,513 citations


Journal ArticleDOI
01 Apr 1976
TL;DR: In this paper, three standard approaches to automatic texture classification make use of features based on the Fourier power spectrum, on second-order gray level statistics, and on first-order statistics of gray level differences, respectively.
Abstract: Three standard approaches to automatic texture classification make use of features based on the Fourier power spectrum, on second-order gray level statistics, and on first-order statistics of gray level differences, respectively. Feature sets of these types, all designed analogously, were used to classify two sets of terrain samples. It was found that the Fourier features generally performed more poorly, while the other feature sets all performed comparably.
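
For concreteness, the third feature family (first-order statistics of gray level differences) can be sketched as follows for a single displacement; the particular features and normalization shown are illustrative assumptions, not the exact set used in the study.

```python
import numpy as np

# A rough sketch of gray-level-difference texture features for one
# displacement (dy, dx); feature names and normalization are illustrative.
def gray_level_difference_features(img, dy=0, dx=1, levels=256):
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    a = img[:h - dy, :w - dx]            # reference pixels
    b = img[dy:, dx:]                    # pixels displaced by (dy, dx)
    diffs = np.abs(a - b).ravel()
    p = np.bincount(diffs, minlength=levels).astype(float)
    p /= p.sum()                         # P(|gray level difference| = d)
    d = np.arange(levels)
    nz = p > 0
    return {
        "mean":     float((d * p).sum()),
        "contrast": float((d ** 2 * p).sum()),
        "entropy":  float(-(p[nz] * np.log2(p[nz])).sum()),
        "angular_second_moment": float((p ** 2).sum()),
    }

# Example on a random 64x64 "texture" with 256 gray levels.
rng = np.random.default_rng(0)
print(gray_level_difference_features(rng.integers(0, 256, (64, 64))))
```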

1,379 citations


BookDOI
01 Jan 1976
TL;DR: Chapters survey automatic remote sensor image processing, radiographic image analysis, image processing in high energy physics, digital picture analysis in cytology, and picture analysis in character recognition.
Abstract: Automatic remote sensor image processing; On radiographic image analysis; Image processing in high energy physics; Digital picture analysis in cytology; Picture analysis in character recognition.

118 citations


Journal ArticleDOI
01 Mar 1976
TL;DR: A method of detecting "noisy" branches of a skeleton, with the aid of the distance transform of the original object, is developed.
Abstract: Implementation of a simple algorithm is described for thinning digital objects down to "skeletons." A method of detecting "noisy" branches of a skeleton, with the aid of the distance transform of the original object, is developed.
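
A minimal sketch of the general idea, assuming SciPy and scikit-image are available; the fixed-threshold pruning used here is an illustrative stand-in for the paper's branch-detection method, not a reproduction of it.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize

# Sketch: thin the object, then use the distance transform of the original
# object to flag skeleton pixels lying close to the boundary, since branches
# caused by small boundary perturbations carry small distance values.
def prune_noisy_skeleton(binary_img, min_width=3.0):
    obj = np.asarray(binary_img, dtype=bool)
    skel = skeletonize(obj)                  # thin the object to a skeleton
    dist = distance_transform_edt(obj)       # distance to the background
    noisy = skel & (dist < min_width)        # candidate "noisy" pixels
    return skel & ~noisy, noisy

# Example: a rectangle with a one-pixel protrusion on its boundary.
img = np.zeros((40, 60), dtype=bool)
img[10:30, 10:50] = True
img[30, 30] = True
pruned, noisy = prune_noisy_skeleton(img)
print(int(noisy.sum()), "skeleton pixels flagged as noisy")
```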

59 citations


Journal ArticleDOI
TL;DR: A cost function is developed, based on information-theoretic concepts, that measures the complexity of a stochastic context-free grammar, as well as the discrepancy between its language and a given stoChastic language sample.
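
The paper's exact cost function is not reproduced here; as a hedged sketch, an information-theoretic cost of this general shape combines a description-length term for the grammar with a fit term for the sample (the symbols below are assumptions for illustration):

```latex
% Sketch only: G is a stochastic context-free grammar, S the sample,
% n(x) the number of occurrences of string x in S, and P_G(x) the
% probability that G assigns to x.
\[
  C(G, S) \;=\; \underbrace{-\log P(G)}_{\text{complexity of } G}
  \;+\; \underbrace{-\sum_{x \in S} n(x)\,\log P_G(x)}_{\text{discrepancy between } G \text{ and } S}
\]
```

Minimizing a cost of this shape trades grammar simplicity against how well the grammar's distribution matches the observed sample.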

54 citations


Journal ArticleDOI
TL;DR: A large set of texture features were measured for a collection of samples of an industrial material which had been graded in quality by inspectors and features were identified, based on gray-level co-occurrence statistics, that had a high correlation with the grades.
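
For reference, features of the co-occurrence kind can be sketched directly with NumPy as follows; the displacement, gray-level quantization, and the four features chosen are illustrative assumptions rather than the paper's exact measurements.

```python
import numpy as np

# Sketch of gray-level co-occurrence (GLCM) features for one displacement;
# parameters and the particular features are illustrative assumptions.
def cooccurrence_features(img, dy=0, dx=1, levels=16):
    img = np.asarray(img, dtype=float)
    # quantize to a small number of gray levels so the matrix stays dense
    q = np.clip((img / (img.max() + 1e-9) * levels).astype(int), 0, levels - 1)
    h, w = q.shape
    a = q[:h - dy, :w - dx].ravel()
    b = q[dy:, dx:].ravel()
    glcm = np.zeros((levels, levels), dtype=float)
    np.add.at(glcm, (a, b), 1.0)               # count co-occurring pairs
    glcm /= glcm.sum()                         # joint probability P(i, j)
    i, j = np.indices(glcm.shape)
    nz = glcm > 0
    return {
        "energy":      float((glcm ** 2).sum()),
        "contrast":    float(((i - j) ** 2 * glcm).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
        "entropy":     float(-(glcm[nz] * np.log2(glcm[nz])).sum()),
    }
```

Given such features for each graded sample, one would then look for the features whose values correlate most strongly with the inspectors' grades.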

49 citations


Journal ArticleDOI
TL;DR: This note illustrates how one of Ohlander's scenes, a house, can be reasonably segmented by mapping it into a three-dimensional color space.
Abstract: Ohlander [1] has shown that a variety of scenes can be segmented into meaningful parts by histogramming the values of various point or local properties of the scene; finding a distinct peak in the histogram; extracting the region whose points gave rise to that peak; and repeating the process for the remainder of the scene. A generalization of this histogram analysis approach is to map the points of the scene into a multi-dimensional feature space, and to look for clusters in this space (a histogram is a mapping into a one-dimensional feature space, in which clusters are peaks). This note illustrates how one of Ohlander's scenes, a house, can be reasonably segmented by mapping it into a three-dimensional color space.
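
A minimal sketch of the generalization described above, with a plain k-means loop standing in for the cluster-finding step; the note itself locates clusters rather than running k-means, and the choice of k, the iteration count, and the seeding here are assumptions.

```python
import numpy as np

# Sketch: map every pixel into 3-D color space, find k clusters there,
# and label each pixel by the cluster of its color.
def segment_by_color_clustering(rgb_img, k=4, iters=20):
    img = np.asarray(rgb_img, dtype=float)
    pixels = img.reshape(-1, 3)
    # crude deterministic seeding: k pixels spread evenly through the image
    centers = pixels[::max(1, len(pixels) // k)][:k].copy()
    for _ in range(iters):
        # assign each pixel to the nearest cluster center in color space
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean color of the pixels assigned to it
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean(axis=0)
    return labels.reshape(img.shape[:2])

# Example: a synthetic scene with sky, wall, and roof colors.
img = np.zeros((60, 80, 3))
img[:20] = (120, 170, 230)     # sky
img[20:40] = (200, 180, 150)   # wall
img[40:] = (120, 60, 40)       # roof / ground
print(np.unique(segment_by_color_clustering(img, k=3)))
```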

47 citations



Journal ArticleDOI
TL;DR: The development of techniques for the computer analysis of pictures and scenes began over 20 years ago; some of the major application areas are automation (robot vision), cytology, radiology, high-energy physics, remote sensing, and document processing.
Abstract: The development of techniques for the computer analysis of pictures and scenes began over 20 years ago. Most of the work in this field has been application-oriented; some of the major application areas are automation (robot vision), cytology, radiology, high-energy physics, remote sensing, and document processing (character recognition). However, many of the techniques and algorithms, even if developed for a particular application, are also applicable in other areas.

20 citations


Journal ArticleDOI
TL;DR: This note discusses some of these inequivalence results, and points out a relationship between local picture processing operations and a special class of finite-state languages, the strictly locally testable languages.
Abstract: Finite-state two-dimensional languages are much less well-behaved than their one-dimensional counterparts. Various definitions of language classes that are equivalent for strings are inequivalent for arrays. This note discusses some of these inequivalence results, and also points out a relationship between local picture processing operations and a special class of finite-state languages, the strictly locally testable languages.
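
For reference, the one-dimensional notion alluded to at the end is easy to state operationally: membership in a strictly k-testable language depends only on a word's length-(k-1) prefix and suffix and on its set of length-k factors, i.e. on what a scanner sees through a bounded window. The sketch below is a generic membership test under that definition, not a construction from the note.

```python
# Sketch of a membership test for a strictly k-testable language, given the
# allowed prefixes and suffixes of length k-1 and the allowed factors of
# length k; the handling of words shorter than k is a convention.
def strictly_k_testable(word, k, prefixes, suffixes, factors):
    if len(word) < k:
        return False                     # convention; definitions vary here
    return (word[:k - 1] in prefixes
            and word[-(k - 1):] in suffixes
            and all(word[i:i + k] in factors
                    for i in range(len(word) - k + 1)))

# Example: a+b+ (a block of a's followed by a block of b's) is strictly
# 2-testable.
P, S, F = {"a"}, {"b"}, {"aa", "ab", "bb"}
print(strictly_k_testable("aaabb", 2, P, S, F))   # True
print(strictly_k_testable("abab", 2, P, S, F))    # False: factor "ba" forbidden
```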

Journal ArticleDOI
TL;DR: Topics covered include transforms and filtering; compression; enhancement, restoration, and reconstruction; hardware and software implementations; pictorial pattern recognition; matching and local feature detection; segmentation; shape and texture; scene analysis; and formal models.

Journal ArticleDOI
01 Feb 1976
TL;DR: Several models for line detection in noise are discussed, based on models for the "simple" and "complex" cells that have been found in the visual cortex.
Abstract: Several models for line detection in noise are discussed, based on models for the "simple" and "complex" cells that have been found in the visual cortex. The possibility of obtaining experimental evidence for deciding among the models is briefly considered.
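
A rough illustration of the contrast between the two cell types (an assumption for exposition, not the paper's specific models): a "simple" unit as a linear oriented bar mask applied at one position, and a "complex" unit as a position-tolerant pooling of such responses.

```python
import numpy as np

# Illustrative only: a "simple cell" as a 3x3 horizontal bar mask at a fixed
# position, and a "complex cell" as the maximum of simple-cell responses
# over nearby positions, making it less position-sensitive.
def simple_cell_response(img, row, col):
    mask = np.array([[-1, -1, -1],
                     [ 2,  2,  2],
                     [-1, -1, -1]], dtype=float)   # bright line, darker flanks
    patch = img[row - 1:row + 2, col - 1:col + 2]
    return float((mask * patch).sum())

def complex_cell_response(img, row, col, pool=1):
    # pool simple-cell responses over a small neighborhood of positions
    return max(simple_cell_response(img, r, c)
               for r in range(row - pool, row + pool + 1)
               for c in range(col - pool, col + pool + 1))

img = np.zeros((9, 9))
img[4, :] = 1.0                            # a horizontal line through row 4
print(simple_cell_response(img, 4, 4))     # strong response on the line
print(complex_cell_response(img, 5, 4))    # still responds one row away
```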

Journal ArticleDOI
TL;DR: Deterministic two-head automata are described that accept the languages a^n b^n (or, more generally, the strings over {a, b} having the same number of a's as b's); ww; ww^R; and a^{kn} for any fixed k.
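
As a small illustration of the flavor of such constructions (not taken from the paper), a deterministic two-head strategy for a^n b^n can run one head ahead to the first b and then advance both heads in lockstep:

```python
# Sketch: head 2 first skips the a-block; then both heads move together,
# head 1 over the a's and head 2 over the b's, and the input is accepted
# iff the two blocks are exhausted at the same time.
def two_head_anbn(w):
    h1, h2 = 0, 0
    while h2 < len(w) and w[h2] == "a":   # phase 1: head 2 skips the a-block
        h2 += 1
    first_b = h2
    while h2 < len(w):                    # phase 2: move both heads together
        if w[h1] != "a" or w[h2] != "b":
            return False
        h1 += 1
        h2 += 1
    return h1 == first_b                  # both blocks used up simultaneously

print([two_head_anbn(s) for s in ["", "ab", "aabb", "aab", "abab"]])
# -> [True, True, True, False, False]
```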


01 Jul 1976
TL;DR: This supplemental report investigates some variations on the basic method of generating skeleton representations of noisy, grayscale pictures, and shows how approximations to the picture can be constructed, given its skeleton.
Abstract: In an earlier report, a new method of generating skeleton representations of noisy, grayscale pictures was described. This supplemental report investigates some variations on the basic method, and also shows how approximations to the picture can be constructed, given its skeleton.
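
As a hedged illustration of the reconstruction step in the binary case (the report itself concerns grayscale pictures), a picture can be approximated by the union of disks centered at skeleton pixels with radii given by the distance transform; scikit-image's medial_axis is used here for the skeleton and radii.

```python
import numpy as np
from skimage.morphology import medial_axis

# Sketch: approximate a binary picture as the union of disks centered on its
# skeleton pixels, each with radius equal to the distance-transform value
# stored at that pixel. Binary stand-in for the grayscale case.
def reconstruct_from_skeleton(binary_img):
    obj = np.asarray(binary_img, dtype=bool)
    skel, dist = medial_axis(obj, return_distance=True)   # skeleton + radii
    h, w = obj.shape
    yy, xx = np.mgrid[:h, :w]
    recon = np.zeros_like(obj)
    for r, c in zip(*np.nonzero(skel)):
        recon |= (yy - r) ** 2 + (xx - c) ** 2 <= dist[r, c] ** 2
    return recon

img = np.zeros((40, 40), dtype=bool)
img[5:35, 10:30] = True
recon = reconstruct_from_skeleton(img)
print("object pixels recovered:", int((recon & img).sum()), "of", int(img.sum()))
```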