
Showing papers on "Image processing published in 1980"


Journal ArticleDOI
TL;DR: Experimental results show that in most cases the techniques developed in this paper are readily adaptable to real-time image processing.
Abstract: Computational techniques involving contrast enhancement and noise filtering on two-dimensional image arrays are developed based on their local mean and variance. These algorithms are nonrecursive and do not require the use of any kind of transform. They share the same characteristics in that each pixel is processed independently. Consequently, this approach has an obvious advantage when used in real-time digital image processing applications and where a parallel processor can be used. For both the additive and multiplicative cases, the a priori mean and variance of each pixel are derived from its local mean and variance. Then, the minimum mean-square error estimator in its simplest form is applied to obtain the noise filtering algorithms. For multiplicative noise, a statistically optimal linear approximation is made. Experimental results show that such an assumption yields a very effective filtering algorithm. Examples on images containing 256 × 256 pixels are given. Results show that in most cases the techniques developed in this paper are readily adaptable to real-time image processing.
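
As a rough illustration of the local-statistics approach for the additive-noise case, the sketch below (Python/NumPy) estimates each pixel from its local mean and variance with the simplest minimum mean-square error form; the window size and noise variance are illustrative assumptions, not values from the paper.

import numpy as np
from scipy.ndimage import uniform_filter

def local_stats_filter(noisy, window=7, noise_var=100.0):
    # Additive-noise case: estimate each pixel as m + k*(y - m), where m and
    # var are the local mean and variance of the noisy image y and
    # k = max(var - noise_var, 0) / var.  Window size and noise variance
    # here are illustrative assumptions.
    y = noisy.astype(float)
    m = uniform_filter(y, window)                               # local mean
    var = np.maximum(uniform_filter(y * y, window) - m * m, 0)  # local variance
    k = np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12)
    return m + k * (y - m)

# Example on a 256 x 256 noisy test image
restored = local_stats_filter(np.random.normal(128, 10, (256, 256)))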

2,701 citations


Journal ArticleDOI
TL;DR: Two-dimensional image moments with respect to Zernike polynomials are defined, and it is shown how to construct an arbitrarily large number of independent, algebraic combinations of Zernike moments that are invariant to image translation, orientation, and size as discussed by the authors.
Abstract: Two-dimensional image moments with respect to Zernike polynomials are defined, and it is shown how to construct an arbitrarily large number of independent, algebraic combinations of Zernike moments that are invariant to image translation, orientation, and size. This approach is contrasted with the usual method of moments. The general problem of two-dimensional pattern recognition and three-dimensional object recognition is discussed within this framework. A unique reconstruction of an image in either real space or Fourier space is given in terms of a finite number of moments. Examples of applications of the method are given. A coding scheme for image storage and retrieval is discussed.
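
A minimal numerical sketch (Python/NumPy) of how a single Zernike moment of a square image mapped onto the unit disk might be computed; the sampling and normalization choices are assumptions, and rotation invariance comes from taking the magnitude of the resulting moment.

import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    # Discrete approximation of the Zernike moment A_nm of a square image
    # mapped onto the unit disk.  A sketch, not the paper's exact recipe.
    if (n - abs(m)) % 2 or abs(m) > n:
        raise ValueError("need |m| <= n and n - |m| even")
    N = img.shape[0]                       # assumes a square N x N image
    ys, xs = np.mgrid[0:N, 0:N]
    x = (2 * xs - N + 1) / (N - 1)         # map pixel centres into [-1, 1]
    y = (2 * ys - N + 1) / (N - 1)
    rho, theta = np.hypot(x, y), np.arctan2(y, x)
    mask = rho <= 1.0

    # radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * factorial(n - s) /
             (factorial(s) * factorial((n + abs(m)) // 2 - s)
              * factorial((n - abs(m)) // 2 - s)))
        R += c * rho ** (n - 2 * s)

    V_conj = R * np.exp(-1j * m * theta)   # conjugate basis function
    area = (2.0 / (N - 1)) ** 2            # per-pixel area in mapped coordinates
    return (n + 1) / np.pi * np.sum(img[mask] * V_conj[mask]) * area

# abs(zernike_moment(img, n, m)) is unchanged by image rotation; translation
# and scale are normalized before the moments are taken.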

2,362 citations


Journal ArticleDOI
01 Oct 1980

1,565 citations


ReportDOI
01 Jan 1980
TL;DR: In this article, texture energy is measured by filtering with small masks, typically 5x5, then with a moving-window average of the absolute image values, leading to a simple class of texture energy transforms, which perform better than any of the preceding methods.
Abstract: The problem of image texture analysis is introduced, and existing approaches are surveyed. An empirical evaluation method is applied to two texture measurement systems, co-occurrence statistics and augmented correlation statistics. A spatial-statistical class of texture measures is then defined and evaluated. It leads to a simple class of texture energy transforms, which perform better than any of the preceding methods. These transforms are very fast, and can be made invariant to changes in luminance, contrast, and rotation without histogram equalization or other preprocessing. Texture energy is measured by filtering with small masks, typically 5x5, then with a moving-window average of the absolute image values. This method, similar to human visual processing, is appropriate for textures with short coherence length or correlation distance. The filter masks are integer-valued and separable, and can be implemented with one-dimensional or 3x3 convolutions. The averaging operation is also very fast, with computing time independent of window size. Texture energy planes may be linearly combined to form a smaller number of discriminant planes. These principal component planes seem to represent natural texture dimensions, and to be more reliable texture measures than the texture energy planes. Texture segmentation or classification may be accomplished using either texture energy or principal component planes as input. This study classified 15x15 blocks of eight natural textures. Accuracies of 72% were achieved with co-occurrence statistics, 65% with augmented correlation statistics, and 94% with texture energy statistics.
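
A small Python/NumPy sketch of the texture-energy idea: a 5x5 mask formed as the outer product of two 1-D vectors, convolution with the image, then a moving-window average of absolute filter outputs. The vectors and the 15x15 averaging window follow the description above, but treating them this way is an illustration rather than the report's exact implementation.

import numpy as np
from scipy.ndimage import convolve, uniform_filter

# One-dimensional vectors (level, edge, spot, ripple); 5x5 masks are formed
# as outer products, e.g. the E5L5 mask is outer(E5, L5).
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)
S5 = np.array([-1, 0, 2, 0, -1], float)
R5 = np.array([1, -4, 6, -4, 1], float)

def texture_energy(image, v, h, window=15):
    # Convolve with the separable 5x5 mask outer(v, h), then take a
    # moving-window average of the absolute filter output.
    mask = np.outer(v, h)
    filtered = convolve(image.astype(float), mask, mode="reflect")
    return uniform_filter(np.abs(filtered), size=window)

# Example: an E5L5 energy plane over 15x15 neighbourhoods
img = np.random.rand(128, 128)
energy_e5l5 = texture_energy(img, E5, L5, window=15)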

869 citations


Journal ArticleDOI
TL;DR: It is shown that, by an a priori maximization of an entropy determined a posteriori, a picture can successfully be thresholded into a two-level image.
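
The sketch below (Python) illustrates entropy-based threshold selection in the same spirit: the threshold that maximizes the summed entropies of the two classes formed from the grey-level histogram. The exact a priori/a posteriori entropy criterion of the paper differs in detail from this common variant.

import numpy as np

def entropy_threshold(image, bins=256):
    # Assumes integer grey levels in [0, bins); picks the threshold that
    # maximizes the sum of the entropies of the two resulting classes.
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
            - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# binary = image >= entropy_threshold(image)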

689 citations


Proceedings ArticleDOI
Kenneth I. Laws
23 Dec 1980
TL;DR: In this article, the texture energy approach requires only a few convolutions with small (typically 5x5) integer coefficient masks, followed by a moving-window absolute average operation.
Abstract: A method is presented for classifying each pixel of a textured image, and thus for segmenting the scene. The "texture energy" approach requires only a few convolutions with small (typically 5x5) integer coefficient masks, followed by a moving-window absolute average operation. Normalization by the local mean and standard deviation eliminates the need for histogram equalization. Rotation-invariance can also be achieved by using averages of the texture energy features. The convolution masks are separable, and can be implemented with 1-dimensional (vertical and horizontal) or multipass 3x3 convolutions. Special techniques permit rapid processing on general-purpose digital computers.
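
A brief sketch of the normalization step mentioned above: dividing out the local mean and standard deviation so that the texture energy features become insensitive to luminance and contrast, removing the need for histogram equalization. The window size is an illustrative assumption.

import numpy as np
from scipy.ndimage import uniform_filter

def local_normalize(image, window=15):
    # Subtract the local mean and divide by the local standard deviation
    # before (or after) the texture energy filtering stage.
    x = image.astype(float)
    m = uniform_filter(x, window)
    s = np.sqrt(np.maximum(uniform_filter(x * x, window) - m * m, 0.0))
    return (x - m) / (s + 1e-8)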

635 citations


Journal ArticleDOI
01 Jan 1980

478 citations


Journal ArticleDOI
01 Jan 1980

457 citations


Journal ArticleDOI
TL;DR: An iterative computer method that can be used to solve a number of problems in optics, including reconstruction of astronomical objects from stellar speckle interferometer data and spectrum shaping for computer-generated holograms to reduce quantization noise is discussed.
Abstract: This paper discusses an iterative computer method that can be used to solve a number of problems in optics. This method can be applied to two types of problems: (1) synthesis of a Fourier transform pair having desirable properties in both domains, and (2) reconstruction of an object when only partial information is available in any one domain. Illustrating the first type of problem, the method is applied to spectrum shaping for computer-generated holograms to reduce quantization noise. A problem of the second type is the reconstruction of astronomical objects from stellar speckle interferometer data. The solution of the latter problem will allow a great increase in resolution over what is ordinarily obtainable through a large telescope limited by atmospheric turbulence. Experimental results are shown. Other applications are mentioned briefly.
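
A minimal sketch (Python/NumPy) of the error-reduction form of the iterative Fourier-transform method: alternate between enforcing the measured Fourier modulus and the object-domain constraints (support and non-negativity). The support array and iteration count are assumptions; the input-output variants of the method add feedback in the object domain.

import numpy as np

def error_reduction(fourier_modulus, support, n_iter=200, seed=0):
    # fourier_modulus: measured |F| array; support: boolean array marking
    # where the object is allowed to be nonzero.
    rng = np.random.default_rng(seed)
    g = rng.random(fourier_modulus.shape) * support
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = fourier_modulus * np.exp(1j * np.angle(G))   # keep measured modulus
        g = np.fft.ifft2(G).real
        g = np.where(support & (g > 0), g, 0.0)          # object-domain constraints
    return g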

454 citations


Journal ArticleDOI
TL;DR: In this paper, the problem of determining the 3D model and movement of an object from a sequence of two-dimensional images is discussed, and a solution to this problem depends on solving a system of nonlinear equations using a modified least squared error method.
Abstract: Discusses the problem of determining the three-dimensional model and movement of an object from a sequence of two-dimensional images. A solution to this problem depends on solving a system of nonlinear equations using a modified least-squared error method. Two views of six points or three views of four points are needed to provide an overdetermined set of equations when the images are noisy. It is shown, however, that this numerical method is not very accurate unless the images of considerably more points are used.
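
To illustrate only the numerical machinery (not the paper's actual three-dimensional equations), the toy sketch below recovers a rigid 2-D motion from noisy point correspondences by driving an overdetermined set of nonlinear equations to zero with a Levenberg-Marquardt ("modified least-squares") solver.

import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
pts = rng.random((6, 2)) * 10                      # six points seen in two views
true_theta, true_t = 0.3, np.array([2.0, -1.0])
Rt = np.array([[np.cos(true_theta), -np.sin(true_theta)],
               [np.sin(true_theta),  np.cos(true_theta)]])
obs = pts @ Rt.T + true_t + rng.normal(0, 0.01, pts.shape)  # noisy second view

def residuals(p):
    # Equation errors for the current guess of (angle, tx, ty)
    theta, tx, ty = p
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return (pts @ R.T + [tx, ty] - obs).ravel()

fit = least_squares(residuals, x0=[0.0, 0.0, 0.0], method="lm")
print(fit.x)   # close to [0.3, 2.0, -1.0]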

362 citations


Journal ArticleDOI
TL;DR: It is possible to generate quantitative metabolic maps that display the distribution of actual rates of local glucose utilization throughout the entire central nervous system in regions as small as 100 μm or less.
Abstract: A computerized image processing system has been developed for quantitative analyses of autoradiographs obtained with the [14C]deoxyglucose method. The system is composed of standard, commercially available components and includes a scanning microdensitometer, computer, image memory and display system, and monochrome and color monitors. The associated computer programs are written in PASCAL. Autoradiographs are automatically scanned, and the optical density of each spot is digitized at a maximum resolution of 65,536 readings per 6.4 x 6.4 mm area and stored in memory. Images can be reconstructed from the data in memory, displayed on the monitors, and utilized for microdensitometric analyses or manipulated for image enhancement, enlargement, or weighted averaging of selected regions. The digitized data can also be utilized to solve the operational equation of the [14C]deoxyglucose method, and color-coded images of autoradiographs can be reconstructed so that each color represents a narrow range of the rate of glucose utilization. By means of this system, it is possible to generate quantitative metabolic maps that display the distribution of actual rates of local glucose utilization throughout the entire central nervous system in regions as small as 100 micrometers or less.

Proceedings ArticleDOI
01 Jul 1980
TL;DR: An interactive, dynamic map has been built using videodisc technology to engage the user in a simulated “drive” through an unfamiliar space, and to incorporate optical and electronic image processing to provide a more responsive, visually complete representation of an environment.
Abstract: An interactive, dynamic map has been built using videodisc technology to engage the user in a simulated “drive” through an unfamiliar space. The driver, or map reader, is presented with either sparsely sampled sequences of images taken by single frame cameras that replicate actual imagery from a space, or with computer synthesized replicas of those images. The reader may control the speed, route, angle of view and mode of presentation of this information and may thus tour the area. In addition, he may access ancillary data spatially stored in the buildings or in locales in the environment. This basic map is being enhanced to provide topographic views, and to incorporate optical and electronic image processing to provide a more responsive, visually complete representation of an environment.

Journal ArticleDOI
TL;DR: A recording ophthalmoscope is described which requires substantially less light and allows an inversion of the usual division of the pupil; various manipulations of the image are described, some of which are uniquely possible with this system.
Abstract: We have designed a recording ophthalmoscope which requires substantially less light than conventional ophthalmoscopes or fundus cameras. A laser beam of <100-μW total power provides the flying spot on the subject’s retina, allowing an inversion of the usual division of the pupil: only the central half-millimeter is needed for illumination, and the remaining 50 mm2 are used for light collection. No optical image of the retina is formed, but a photomultiplier tube in a pupillary conjugate plane provides video signals to a TV monitor, where an image appears. A simple analysis explains the gain in sensitivity. Various manipulations of the image are described, some of which are uniquely possible with this system.

Journal ArticleDOI
TL;DR: In this paper, a nonlinear optical processor using a photorefractive medium, Bi12SiO20, is constructed and demonstrated to be capable of convolving and correlating objects with spatial information.
Abstract: We report the application of four‐wave mixing to real‐time image processing. We constructed a nonlinear optical processor using a photorefractive medium Bi12SiO20 and demonstrated that it is capable of convolving and correlating objects with spatial information.

Patent
07 Jul 1980
TL;DR: In this paper, a high-resolution image is subdivided into contiguous sub-images, each of which is minified before being projected upon an image sensor module, which avoids bars of blindness between the fields of view of the image sensor modules.
Abstract: An image sensor, suitable for resolving a high-resolution image, comprises an arrayed plurality of image sensor modules of moderate individual resolution. The high-resolution image is subdivided into contiguous sub-images, each of which is minified before being projected upon an image sensor module. This avoids bars of blindness between the fields of view of the image sensor modules, and also facilitates production and repair of the image sensor.

Journal ArticleDOI
TL;DR: Multispectral middle IR (8-13 μm) data were acquired with an aircraft scanner over Utah to allow geologic photointerpretation based on subtle variations in spectral emittance between rock types.
Abstract: Multispectral middle IR (8-13 microns) data were acquired with an aircraft scanner over Utah. Because these digital image data were dominated by temperature, all six channels were highly correlated. Extensive processing was required to allow geologic photointerpretation based on subtle variations in spectral emittance between rock types. After preliminary processing, ratio images were produced and color ratio composites created from these. Sensor calibration and an atmospheric model allowed determination of surface brightness, temperature, emittance, and color composite emittance images. The best separation of major rock types was achieved with a principal component transformation, followed by a Gaussian stretch, followed by an inverse transformation to the original axes.
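
A compact sketch of the final processing chain described: a principal-component transform of the band stack, a per-component contrast stretch, and an inverse transform back to the original band axes. A simple linear variance-equalizing stretch stands in here for the Gaussian stretch used in the paper.

import numpy as np

def decorrelation_stretch(bands, target_std=50.0):
    # bands: (height, width, n_bands) array of co-registered channels.
    h, w, nb = bands.shape
    X = bands.reshape(-1, nb).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    pcs = Xc @ evecs                                        # principal components
    pcs *= target_std / np.sqrt(np.maximum(evals, 1e-12))   # equalize variance
    out = pcs @ evecs.T + mean                              # back to original axes
    return out.reshape(h, w, nb)

# stretched = decorrelation_stretch(six_channel_image)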


Journal ArticleDOI
TL;DR: The functions minπ and maxπ are introduced as analogues of the nearest neighbour “propagation” signals of binary images, and are used to extend some already well known binary processes into grey level algorithms.
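
Assuming the min/max functions play the role of grey-level erosion and dilation over a local neighbourhood (the grey-level analogue of binary shrink/expand propagation), a minimal sketch:

import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

# A local maximum (dilation) propagates bright regions outward and a local
# minimum (erosion) propagates dark regions, just as binary expand/shrink do.
# The 3x3 neighbourhood is an illustrative choice.
img = np.random.randint(0, 256, (64, 64))
expanded = grey_dilation(img, size=(3, 3))   # max over each neighbourhood
shrunk   = grey_erosion(img, size=(3, 3))    # min over each neighbourhood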

BookDOI
01 Jan 1980
TL;DR: In this article, the authors proposed a method for image processing of regular biological structures based on the linear theory of image formation. But their method was not suitable for the reconstruction of 3D objects.
Abstract:
1. Image Processing Based on the Linear Theory of Image Formation.- 1.1 Transfer Functions.- 1.2 Transfer Functions with Partially Coherent Illumination.- 1.3 Practical Exploitation of the Linear Relationship.- 1.3.1 Measurement of the Microscope Operating Characteristics.- 1.3.2 On-Line Processing.- 1.3.3 Filtering and Reconstruction.- References.
2. Recovery of Specimen Information for Strongly Scattering Objects.- 2.1 Image Formation and Interpretation.- 2.1.1 Recapitulation of Coherent Image Formation.- 2.1.2 Interpreting the Specimen Wave.- 2.1.3 Rendering Images Discrete.- 2.2 Methods Iterating the Linear Theory Solution.- 2.3 Methods Requiring No Special Apertures.- 2.3.1 The Data Used.- 2.3.2 The Iterative Transform Algorithm.- 2.3.3 Examples and Practical Applications.- 2.3.4 Uniqueness.- 2.3.5 Periodic Images and Complex Zeros.- 2.3.6 Other Methods of Analysis.- 2.3.7 Conclusions.- 2.4 Methods Using Half-Plane Apertures.- 2.4.1 Hilbert Transforms.- 2.4.2 Logarithmic Hilbert Transforms.- 2.4.3 Real Aperture Shapes.- 2.4.4 Dark-Field Conditions.- 2.5 Analytic Wave Functions and Complex Zeros.- 2.5.1 Zero-Distributions and Zero Flipping.- 2.5.2 An Example.- 2.5.3 Immediate Applications.- 2.5.4 Reformulation of Zero Flipping.- 2.5.5 Two-Dimensional Extensions.- 2.6 Holography.- 2.6.1 The Linear Case.- 2.6.2 The General Case.- 2.6.3 Some Particular Cases.- 2.6.4 Nonplanar Reference Waves.- 2.7 Ptychography and Related Methods.- 2.8 Bright-Field/Dark-Field Subtraction.- 2.9 Other Perspectives.- 2.9.1 Coherence.- 2.9.2 Inelastic Scattering.- 2.9.3 Recording Noise and Radiation Damage.- 2.9.4 Practical Details of Computer Processing.- 2.9.5 Other Constraints.- 2.10 Conclusions.- References.
3. Computer Reconstruction of Regular Biological Objects.- 3.1 The Biological Object.- 3.1.1 General Remarks.- 3.1.2 Regular Biological Objects.- 3.1.3 Chemical and Physical Processing of the Object.- 3.1.4 Contrast in Bright-Field Images.- 3.1.5 Radiation Damage.- 3.2 Fourier Processing of Electron Micrographs.- 3.2.1 Quantization and Preprocessing.- 3.2.2 The Whittaker-Shannon Sampling Theorem.- 3.2.3 Fourier Transforms of Regular Objects.- 3.2.4 Processing of Two-Dimensional Structures with Translational Symmetry.- 3.2.5 Rotational Filtering.- 3.2.6 Three-Dimensional Reconstruction of Objects with Helical Symmetry.- 3.2.7 Three-Dimensional Reconstruction of Particles with Icosahedral Symmetry.- 3.3 Recent Applications to Image Processing of Regular Biological Structure.- 3.3.1 One-Dimensional Filtering: Tropomyosin Paracrystal Structure.- 3.3.2 The Structure of Polyheads.- 3.3.3 The Structure of Ribosomes.- 3.3.4 The Structure of the Purple Membrane.- 3.3.5 A Correction for Distorted Images.- 3.3.6 Rotational Filtering of Base Plates.- 3.3.7 The Structure of the Contractile Sheath from Bacteriophage Mu.- 3.3.8 The Three-Dimensional Structure of an Icosahedral Virus Particle.- 3.4 Outlook.- References.
4. Three-Dimensional Structure Determination by Electron Microscopy (Nonperiodic Specimens).- 4.1 History and General Discussion of the Subject.- 4.2 The Fundamental Theoretical Background.- 4.2.1 The Use of a CTEM as a Diffractometer.- 4.2.2 The Description of Structures in Three-Dimensional Electron Microscopy.- 4.2.3 Two-Dimensional Reconstruction (Image Filtering).- 4.2.4 The Projection Theorem.- 4.3 The Problem of Reconstruction.- 4.3.1 The Whittaker-Shannon-Type Interpolation.- 4.3.2 An Alternative Way of Incorporating the Finite Body Concept.- 4.3.3 Back-Projection and Filtered Back-Projection.- a) Simple Back-Projection.- b) Filtered Back-Projection.- c) The Influence of the Reconstruction Body.- d) The Influence of Restricted Tilting Angle.- 4.3.4 Conical Tilting.- 4.3.5 Reconstruction by Series Expansion.- a) The Cormack Method.- b) Aliasing.- 4.3.6 Algebraic Reconstruction in Direct Space.- 4.3.7 Reconstruction of an "Infinite" Platelet with Restricted Tilting Angle.- a) One-Dimensional Whittaker-Shannon Treatment of Single-Axis Tilting.- b) Reconstruction by Interpolation in Projection Space.- c) The Partially Defined "Unit Cell".- 4.3.8 Determination of a Common Origin of the Projections.- 4.4 Aspects for the Future.- 4.4.1 The "Atom" Constraint.- 4.4.2 The Use of a STEM as a Diffractometer.- References.
5. The Role of Correlation Techniques in Computer Image Processing.- 5.1 Correlation Functions.- 5.1.1 The Cross-Correlation Function.- 5.1.2 The Autocorrelation Function.- 5.1.3 Correlation and Similarity.- 5.2 Computation.- 5.3 Some Important Theorems.- 5.3.1 CCFs of Images Containing Signal and Noise.- 5.3.2 The CCF of Blurred Signals.- 5.3.3 Some Thoughts on Signal, Noise, and Correlation.- 5.4 Determination of Relative Positions.- 5.4.1 Translation.- 5.4.2 Alignment of Projections.- 5.4.3 Centering of a Centrosymmetric Particle.- 5.4.4 Determination of Relative Orientation.- 5.5 Matched Filtering.- 5.6 Characterization of Instrument Conditions.- 5.7 Signal-to-Noise Ratio Measurement.- 5.7.1 Theory.- 5.7.2 Measurement.- 5.7.3 Consequence for Phase Contrast Microscopy.- 5.7.4 Generalized Signal-to-Noise Ratio Measurement.- 5.8 Conclusions.- References.
6. Holographic Methods in Electron Microscopy.- 6.1 Historical Background.- 6.2 Holographic Schemes.- 6.2.1 The Generalized Hologram.- 6.2.2 In-Line Fresnel and Fraunhofer Holograms.- 6.2.3 Sideband Fresnel Holograms.- 6.2.4 Fourier Transform Holograms.- 6.2.5 Single-Sideband Holograms.- 6.2.6 Zone Plate Interpretation.- 6.3 Experimental Electron Holography.- 6.4 Contrast Transfer and Holography.- 6.4.1 In-Line Fresnel Hologram.- 6.4.2 Fresnel Sideband Hologram.- 6.4.3 Single-Sideband Hologram.- 6.4.4 The Effect of Partial Coherence on Resolution.- 6.5 Additional Reading.- 6.6 Conclusions.- References.
7. Analog Computer Processing of Scanning Transmission Electron Microscope Images.- 7.1 Organization.- 7.2 Characteristics of Analog Processing.- 7.2.1 Grey Scale Modification.- 7.2.2 Filters.- 7.2.3 Signal Mixing.- 7.3 Types of Signals Available in the STEM.- 7.3.1 Basic Signals.- 7.3.2 Detected Signals.- 7.3.3 Extraction of Basic Signals.- 7.3.4 Normalization.- 7.4 Instrumental Characteristics.- 7.4.1 The Analog Processor.- 7.4.2 Display System.- 7.5 Applications.- 7.5.1 Basic Operations.- 7.5.2 Color Conversion Techniques.- 7.6 Conclusion.- References.
Appendix: Publication Details of International and European Congresses on Electron Microscopy.- Additional References with Titles.

Journal ArticleDOI
TL;DR: The expected images are calculated using two different four-wave mixing geometries and show good agreement with the images that are experimentally observed using a single-domain crystal of BaTiO3 as the photorefractive material.
Abstract: Edge enhancement, a type of optical image processing, is performed in a photorefractive material in real time and with low incident-light intensities (10−3 W/cm2). We calculate the expected images using two different four-wave mixing geometries, which show good agreement with the images that we experimentally observe using a single-domain crystal of BaTiO3 as the photorefractive material.

Journal ArticleDOI
TL;DR: In this article, the brightest point in each image is shifted to the centre of image space and all images are superimposed, which is suitable for imaging faint astronomical objects with large optical telescopes.
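
A minimal sketch of the procedure as stated: locate the brightest pixel of each short-exposure frame, shift it to the centre of image space, and superimpose the shifted frames.

import numpy as np

def shift_and_add(frames):
    # frames: stack of short-exposure images with shape (n, height, width)
    frames = np.asarray(frames, dtype=float)
    n, h, w = frames.shape
    out = np.zeros((h, w))
    for f in frames:
        iy, ix = np.unravel_index(np.argmax(f), f.shape)   # brightest point
        out += np.roll(np.roll(f, h // 2 - iy, axis=0), w // 2 - ix, axis=1)
    return out / n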

Journal ArticleDOI
TL;DR: This survey presents the state-of-the-art in the texture analysis field and its relation to scientific fields like artificial intelligence and visual perception, and it presents most of the relevant methods used today in texture analysis.

Journal ArticleDOI
TL;DR: A method of image enhancement by computer using the fuzzy set theoretic approach that involves extraction of fuzzy properties corresponding to pixels and then successive application of fuzzy operator 'contrast intensification' on the property plane is reported.
Abstract: A method of image enhancement by computer using the fuzzy set theoretic approach is reported. The algorithm involves extraction of fuzzy properties corresponding to pixels and then successive application of fuzzy operator 'contrast intensification' on the property plane. System performance with different indexes of fuzziness is demonstrated for an English script input.
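
A short sketch of the enhancement loop as described: map grey levels to a fuzzy property plane, apply the contrast-intensification (INT) operator repeatedly, and map back. The membership function and parameter values below follow the commonly used Pal-King formulation and are assumptions rather than the paper's exact settings.

import numpy as np

def fuzzy_enhance(img, Fe=2.0, Fd=None, passes=2):
    x = img.astype(float)
    xmax = x.max()
    if Fd is None:
        Fd = xmax / (2.0 * (2 ** (1.0 / Fe) - 1.0))   # puts mid-grey at membership 0.5
    mu = (1.0 + (xmax - x) / Fd) ** (-Fe)             # property plane in (0, 1]
    for _ in range(passes):
        # INT operator: stretch memberships away from the crossover point 0.5
        mu = np.where(mu <= 0.5, 2 * mu ** 2, 1 - 2 * (1 - mu) ** 2)
    return xmax - Fd * (mu ** (-1.0 / Fe) - 1.0)      # back to grey levels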

Journal ArticleDOI
TL;DR: An algorithm is presented for constructing a quadtree from the array representation of a binary image that examines each pixel in the image once and only once, and never requires temporary nodes.
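
For orientation only, a simple top-down recursive quadtree builder is sketched below; the paper's algorithm differs in that it visits each pixel exactly once in raster order and never creates temporary nodes.

import numpy as np

def build_quadtree(img):
    # Square binary image whose side is a power of two; a uniform block
    # becomes a leaf (its pixel value), otherwise four subtrees in the
    # order NW, NE, SW, SE.
    if img.min() == img.max():
        return int(img.flat[0])
    h = img.shape[0] // 2
    return [build_quadtree(img[:h, :h]), build_quadtree(img[:h, h:]),
            build_quadtree(img[h:, :h]), build_quadtree(img[h:, h:])]

tree = build_quadtree(np.array([[1, 1, 0, 0],
                                [1, 1, 0, 0],
                                [0, 0, 0, 1],
                                [0, 0, 1, 1]]))
# -> [1, 0, 0, [0, 1, 1, 1]]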

Journal ArticleDOI
TL;DR: In this paper, a computerized method is described for calculating an image of the refractive index distribution in a plane bounded by two underground boreholes, with rays at numerous depths and angles to effectively cover the cross section between holes.
Abstract: A computerized method is described for calculating an image of the refractive index distribution in a plane bounded by two underground boreholes. The scanning geometry is assumed to be limited to probing from borehole to borehole, with rays at numerous depths and angles to effectively cover the cross section between holes. A geometrical optics model is assumed for the transmission data. We stress situations where significant bending of electromagnetic or seismic rays occurs. Image reconstruction involves an iterated sequence of numerical ray tracing and linear system inversion. A similar approach, discussed in the literature, sometimes fails to converge. We report here our refinements of this method, including use of a smoothness constraint.
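
A minimal sketch of the linear-inversion half of the iteration: given a ray-path matrix from the current ray tracing, solve a least-squares system with an added smoothness (roughness-penalty) term. The matrix shapes, difference operator, and damping weight are illustrative assumptions.

import numpy as np

def smooth_inversion(A, b, lam=1.0):
    # Solve min ||A x - b||^2 + lam ||L x||^2 with L a first-difference
    # roughness operator, via the normal equations.
    n = A.shape[1]
    L = np.eye(n) - np.eye(n, k=1)
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)

# Toy travel-time system: 20 rays through 10 slowness cells
A = np.random.rand(20, 10)
x_true = np.linspace(1.0, 2.0, 10)
x_est = smooth_inversion(A, A @ x_true, lam=0.1)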

Journal ArticleDOI
TL;DR: The use of digital image processing techniques for electronic speckle pattern interferometry is discussed and some experimental verifications are presented in the cases of surface displacement and vibration amplitude measurements.
Abstract: The use of digital image processing techniques for electronic speckle pattern interferometry is discussed. A digital TV-image processing system with a large frame memory allows us to perform precise and flexible operations such as subtraction, summation, and level slicing. Compared with analog techniques, digital image processing makes it easy to generate high-contrast fringes. Some experimental verifications are presented in the cases of surface displacement and vibration amplitude measurements.
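
A small sketch of the frame operations mentioned: subtract two speckle frames, rectify, and level-slice the result to display correlation fringes; the number of slice levels is an arbitrary choice.

import numpy as np

def espi_fringes(frame_before, frame_after, levels=8):
    # Subtraction and rectification produce the correlation-fringe pattern;
    # level slicing quantizes it for display.
    diff = np.abs(frame_after.astype(float) - frame_before.astype(float))
    diff /= diff.max() + 1e-12                 # normalize to [0, 1]
    return np.floor(diff * levels) / levels    # level slicing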

Proceedings ArticleDOI
06 May 1980
TL;DR: This paper presents a new architecture for image processing that consists of a pipeline of identical programmable serial processing stages, referred to as a cytocomputer, which is shown generally to possess the advantages of lower complexity, high bandwidth, and greater architectural flexibility.
Abstract: This paper presents a new architecture for image processing. It consists of a pipeline of identical programmable serial processing stages, referred to as a cytocomputer. Comparisons are made between cytocomputer and parallel array systems. Cytocomputer systems are shown generally to possess the advantages of lower complexity, high bandwidth and greater architectural flexibility. A first generation system is described and examples of processing are illustrated. Finally, current development efforts are described.
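
A functional sketch of the pipeline idea: a chain of identical, individually programmable neighbourhood stages, each applying one 3x3 operation to the output of the previous stage. In the cytocomputer the stages operate concurrently on a raster-scan pixel stream; the sequential composition and the particular stage operations below are only illustrative.

import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, median_filter

stages = [
    lambda im: median_filter(im, size=3),        # stage 1: 3x3 smoothing
    lambda im: grey_erosion(im, size=(3, 3)),    # stage 2: 3x3 erosion
    lambda im: grey_dilation(im, size=(3, 3)),   # stage 3: 3x3 dilation
]

def run_pipeline(image, stages):
    for stage in stages:
        image = stage(image)
    return image

result = run_pipeline(np.random.randint(0, 256, (64, 64)), stages)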

Journal ArticleDOI
TL;DR: This work presents a method for deriving depth information from a moving image where the camera is moving through a real-world scene, by refining a simple surface model based on error measures that are derived by interimage comparisons of point values.
Abstract: Presents a method for deriving depth information from a moving image where the camera is moving through a real-world scene. The method refines a simple surface model based on error measures that are derived by interimage comparisons of point values.

Journal ArticleDOI
TL;DR: Experimental results in the three cases of slope of normal displacement, surface strain, and slope of vibration amplitude measurements are presented, and interpretation of the results leads to conditions of maximum fringe contrast and the limitations of this technique.
Abstract: An application of digital image processing techniques to speckle-shearing interferometry is described. A present system consists of an image-shearing camera using a Fresnel biprism of small angles and a digital TV-image processing facility. This interferometer makes it easy to measure in quasi-real-time spatial derivatives of surface displacement and modal vibration amplitude of objects. A statistical theory is applied to analyze the formations of these fringes due to 3-D displacement. Interpretation of the result leads to conditions of maximum fringe contrast and the limitation of this technique. Experimental results in the three cases of slope of normal displacement, surface strain, and slope of vibration amplitude measurements are presented.

Journal ArticleDOI
R.C. Agarwal
01 Oct 1980