
Showing papers on "Image quality published in 1979"


Journal ArticleDOI
TL;DR: The design and performance characteristics of ECAT-II, a second-generation whole-body positron imaging system providing high-contrast, high-resolution, quantitative images in two-dimensional and tomographic formats, are presented in this paper.
Abstract: The design and performance characteristics of ECAT-II, a second-generation whole-body positron imaging system providing high-contrast, high-resolution, quantitative images in two-dimensional and tomographic formats, are presented. Discussed are: (i) a description of ECAT-II; (ii) the design criteria and implementation for maximizing image quality by minimizing effects such as random coincidences and scattered radiation, and by additionally performing dynamic background correction; and (iii) phantom studies illustrating the implemented system's resolution, efficiency, linearity, and field uniformity.

82 citations


Journal ArticleDOI
TL;DR: A psychophysical approach to image quality evaluation is presented and its potential value is discussed.
Abstract: Technical and diagnostic image quality are distinguished and the limitations of three methods currently used for assessing diagnostic image quality are discussed. They are evaluations based on individual clinical experience, measurement of diagnostic performance, and physical measurements made on images or imaging systems. Finally, a psychophysical approach to image quality evaluation is presented and its potential value discussed.

64 citations


Journal ArticleDOI
TL;DR: Streaking artifacts caused by aliasing in undersampled projection data are demonstrated by subtracting the transforms of undersampled projections from the corresponding transforms obtained with a very large number of rays, and a theoretical upper bound is derived for the energy contained in these aliasing artifacts.
Abstract: Streaking artifacts in tomographic images reconstructed by the filtered-backprojection algorithm are caused by aliasing errors in the projection data. To show this a computer simulation study was performed in which the transforms of undersampled projections were subtracted from the corresponding transforms when the projection data were taken with a very large number of rays. This yielded the aliased spectrum for the undersampled case. An image was reconstructed from the difference transforms. Streaks present in this image exactly matched those present in the undersampled reconstruction. (The number of projections used in this study was large enough to preclude any artifacts caused by their insufficient number.) We have derived a theoretical upper bound for the energy contained in these aliasing artifacts. In this paper we have also briefly touched upon the artifacts caused by other algorithmic aspects of a tomographic system.

55 citations
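The subtraction experiment described in this abstract can be sketched in a few lines of NumPy (the test signal, sampling factor, and function name below are illustrative choices, not taken from the paper): the spectrum of an undersampled projection, minus the matching portion of the densely sampled spectrum, isolates the aliased energy.

```python
import numpy as np

def aliased_spectrum(projection, keep_every):
    """Spectrum of an undersampled projection minus the matching portion of
    the densely sampled spectrum; what remains is the aliasing error."""
    dense = np.fft.rfft(projection) / len(projection)
    coarse = projection[::keep_every]
    coarse_ft = np.fft.rfft(coarse) / len(coarse)
    # Compare only the frequencies the coarse sampling can represent.
    return coarse_ft - dense[:len(coarse_ft)]

x = np.linspace(0.0, 1.0, 512, endpoint=False)
band_limited = np.sin(2 * np.pi * 10 * x)          # below the coarse Nyquist limit
undersampled = band_limited + 0.5 * np.sin(2 * np.pi * 200 * x)  # 200 cycles alias
err_clean = aliased_spectrum(band_limited, keep_every=4)   # essentially zero
err_alias = aliased_spectrum(undersampled, keep_every=4)   # 200 folds to bin 56
```

The band-limited projection leaves no aliasing residue, while the component above the coarse Nyquist limit (64 cycles at 128 samples) folds back as a spurious spectral peak — the frequency-domain counterpart of the streaks in the reconstruction.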


Journal ArticleDOI
Robert J. Noll1
TL;DR: The influence of optical fabrication surface errors on the performance of an optical system is discussed in this paper, where a review of the basic concepts of image quality, an examination of manufacturing errors as a function of image Quality performance, a demonstration of mirror scattering effects in relation to surface errors, and some comments on the nature of the correlation functions are included.
Abstract: In many of today's telescopes the effects of surface errors on image quality and scattered light are very important. The influence of optical fabrication surface errors on the performance of an optical system is discussed. The methods developed by Hopkins (1957) for aberration tolerancing and Barakat (1972) for random wavefront errors are extended to the examination of mid- and high-spatial frequency surface errors. The discussion covers a review of the basic concepts of image quality, an examination of manufacturing errors as a function of image quality performance, a demonstration of mirror scattering effects in relation to surface errors, and some comments on the nature of the correlation functions. Illustrative examples are included.

52 citations



Journal ArticleDOI
TL;DR: The streak-like pattern in the image noise due to the anisotropic nature of the noise cross-correlation function is discussed, as is how these nonlinear phenomena affect noise filtering and tissue characterization using statistical parameters.
Abstract: The variance of the image noise in computed X-ray transmission tomography (CT) due to quantum noise is in a first approximation a nonlinear function of X-ray attenuation. Beam hardening in CT is also a nonlinear function of attenuation. We present a theoretical study of both phenomena. Computer simulations and numerical results show that both nonlinear dependencies have quite similar effects on image quality. We also show how the two-dimensional distribution of the noise variance in a CT image is a weighted superposition of images obtained by backprojecting integer powers of the noiseless projection data corresponding to the scanned object. The streak-like pattern in the image noise due to the anisotropic nature of the noise cross-correlation function is discussed. We also discuss how these nonlinear phenomena affect noise filtering and tissue characterization using statistical parameters.

35 citations


Journal ArticleDOI
TL;DR: A computer-aided approach to the design of dither signals that uses a pairwise exchange algorithm to minimize a measure of susceptibility to aliasing is described.
Abstract: The quality of binary images displayed using ordered dither is strongly dependent on the spatial arrangement of the thresholds in the dither signal. In the frequency domain, this dependence may be viewed as a consequence of aliasing. A computer-aided approach to the design of dither signals that uses a pairwise exchange algorithm to minimize a measure of susceptibility to aliasing is described. The susceptibility measure may be chosen to take into account many factors that affect perceived image quality. Experimental results are presented that illustrate the potential of the method.

34 citations
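The pairwise-exchange design idea can be illustrated with a toy optimization (the susceptibility measure below — low-spatial-frequency energy of the thresholded midtone pattern — is an assumption standing in for the paper's measure, and all sizes and iteration counts are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

def susceptibility(dither):
    """Toy aliasing-susceptibility measure: low-spatial-frequency energy of
    the pattern the dither matrix produces for a uniform 50% gray input."""
    pattern = (dither / dither.size < 0.5).astype(float)
    spec = np.abs(np.fft.fft2(pattern - pattern.mean()))
    fy, fx = np.meshgrid(np.fft.fftfreq(dither.shape[0]),
                         np.fft.fftfreq(dither.shape[1]), indexing="ij")
    return float(np.sum(spec[np.hypot(fx, fy) < 0.25]))

def pairwise_exchange(dither, iters=2000):
    """Greedy pairwise exchange: swap two thresholds and keep the swap
    whenever the susceptibility measure does not increase."""
    d = dither.copy()
    best = susceptibility(d)
    for _ in range(iters):
        i, j = rng.integers(d.size, size=2)
        d.flat[[i, j]] = d.flat[[j, i]]
        cost = susceptibility(d)
        if cost <= best:
            best = cost
        else:
            d.flat[[i, j]] = d.flat[[j, i]]   # revert the swap
    return d, best

initial = rng.permutation(64).reshape(8, 8)   # random threshold arrangement
optimized, final_cost = pairwise_exchange(initial)
```

The exchange search pushes the dither energy toward high spatial frequencies, which is what classical ordered-dither (Bayer) matrices achieve by construction.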


Patent
13 Dec 1979
TL;DR: In this article, a mapping function is obtained by maximizing a composite quality factor comprising one or more Subjective Quality Factors representing blur, smear, and exposure, evaluated at the subject distance and at a plurality of background distances.
Abstract: Automatic control apparatus for a photographic camera is responsive to a set of measured scene parameter inputs, representing at least subject distance and subject brightness, to produce signals representing the desired setting of one or more of the following camera control functions: aperture size, shutter time, flash output, and lens extension. The desired camera settings are determined by a mapping function defining the settings that optimize picture quality by minimizing the total reduction in picture quality due to the combined effects of blur, smear and exposure error. The mapping function is obtained by maximizing a Composite Quality Factor comprising one or more Subjective Quality Factors representing blur, smear, and exposure, evaluated at the subject distance and at a plurality of background distances.

30 citations
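The mapping function can be illustrated with a small grid search. Every quality-factor curve below is hypothetical — the patent abstract does not publish its functions — and the product form of the Composite Quality Factor is likewise an assumption:

```python
import numpy as np
from itertools import product

# Hypothetical Subjective Quality Factors, each mapping a degradation to a
# score in (0, 1]. The real curves are not given in the patent abstract.
def q_blur(aperture, dist, focus=3.0):
    return 1.0 / (1.0 + aperture * abs(1.0 / dist - 1.0 / focus))

def q_smear(shutter):
    return 1.0 / (1.0 + 50.0 * shutter)

def q_exposure(aperture, shutter, brightness):
    ev = brightness * shutter / aperture ** 2       # toy exposure model
    return float(np.exp(-abs(np.log(ev))))          # peaks at "correct" ev = 1

def best_settings(subject_dist, brightness, backgrounds=(1.0, 5.0, 10.0)):
    """Grid-search the mapping function: pick the aperture/shutter pair that
    maximizes the composite quality factor, evaluated at the subject
    distance and at several background distances."""
    apertures = (2.0, 2.8, 4.0, 5.6, 8.0)
    shutters = (1 / 500, 1 / 250, 1 / 125, 1 / 60, 1 / 30)
    def composite(a, s):
        q = q_smear(s) * q_exposure(a, s, brightness) * q_blur(a, subject_dist)
        for b in backgrounds:
            q *= q_blur(a, b) ** 0.25   # weaker weight on background planes
        return q
    return max(product(apertures, shutters), key=lambda p: composite(*p))

aperture, shutter = best_settings(subject_dist=3.0, brightness=100.0)
```

In the patent the search is precomputed into a mapping from measured scene parameters to settings; the brute-force maximization here just makes that optimization explicit.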


Journal ArticleDOI
TL;DR: Although the incoherent processing technique that is proposed is effective only in a 1-D operation, it also works for 2-D image addition and subtraction because of its use of white-light processing.
Abstract: A technique of incoherent image addition and subtraction is described. The basic advantage of this technique is its use of white light processing, in which case the unavoidable artifact noise in the coherent optical processor may be removed. Although the incoherent processing technique that we have proposed is effective only in a 1-D operation, it also works for 2-D image addition and subtraction.

29 citations


Journal ArticleDOI
TL;DR: The present study is devoted to the statistical analysis of edges in still monochrome TV pictures; the data collected concern orientation, edge length, edge width, runlength between edges, and edge slope probability distributions, as well as a measure of orientation continuity along an edge and the relative frequencies of edge pixels and of contrasted isolated pixels.
Abstract: The present study is devoted to the statistical analysis of edges in still monochrome TV pictures. The visual information carried by the edges is especially important both for image interpretation and for subjective image quality evaluation. Statistical knowledge on edges is helpful to improve image coding techniques significantly as well as processing techniques for scene analysis. After an introduction on nonstationary local statistical models, we describe the parameters of edges and the methods used to measure them. Statistical data collected on these parameters are then presented. The data concern orientation, edge length, edge width, runlength between edges and edge slope probability distributions as well as the measure of orientation continuity along an edge and the relative frequencies of edge pixels and contrasted isolated pixels.

26 citations
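A minimal version of such edge-statistics gathering might look like the sketch below (the gradient operator, threshold, and bin count are illustrative choices, not the paper's measurement methods):

```python
import numpy as np

def edge_statistics(img, threshold):
    """Collect simple edge statistics: the fraction of edge pixels and the
    distribution of edge orientations, from finite-difference gradients."""
    gy, gx = np.gradient(np.asarray(img, dtype=float))
    mag = np.hypot(gx, gy)
    edges = mag > threshold
    theta = np.degrees(np.arctan2(gy, gx))[edges]
    hist, _ = np.histogram(theta, bins=8, range=(-180, 180))
    return edges.mean(), hist / max(hist.sum(), 1)

# A vertical step edge: every edge pixel falls in one orientation bin.
img = np.zeros((32, 32))
img[:, 16:] = 1.0
frac, orient = edge_statistics(img, threshold=0.25)
```

Real TV-picture statistics of the kind the paper tabulates would add edge length, width, and runlength measurements on top of this per-pixel pass.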


Journal ArticleDOI
TL;DR: Discusses image compression technology, where the aim is to narrow the transmitted bandwidth as much as possible, reducing the cost of transmission and, for military use, the susceptibility to interference.
Abstract: Discusses image compression technology, where the aim is to narrow the transmitted bandwidth as much as possible. Such compression reduces the cost of transmission and, for military use, the susceptibility to interference; the problem is that the higher the compression, the greater the loss of image quality.

Journal ArticleDOI
TL;DR: A class of image representations that are appropriate for display of continuous-tone imagery with a wide range of digital devices is analyzed and Quantization and Fourier domain aliasing are shown to be important factors in the degradations introduced by the nonlinear display process.
Abstract: A class of image representations that are appropriate for display of continuous-tone imagery with a wide range of digital devices is analyzed. Special cases include pulse-amplitude modulation, ordered dither, and pulse-surface-area modulation. Quantization and Fourier domain aliasing are shown to be important factors in the degradations introduced by the nonlinear display process. The severity of these degradations may be minimized by proper design of the dot profile. Using a simple measure of image quality, the required information density and display resolution are calculated as a function of the number of display luminance levels for a well-known dot profile. Some experimental results are also presented.

Journal ArticleDOI
TL;DR: An anthropomorphic chest phantom with realistic disease simulation is described with a review of previously available chest phantoms and the need for subjective image analysis is discussed and compared with the existing physics parameters as a means of evaluating image quality in chest radiology.
Abstract: An anthropomorphic chest phantom with realistic disease simulation is described, with a review of previously available chest phantoms. The need for subjective image analysis is discussed and compared with the existing physics parameters as a means of evaluating image quality in chest radiology.

01 Jan 1979
TL;DR: Application of BARC image data compression to the Galileo orbiter mission of Jupiter is considered and it is noted that the compressor can also be operated as a floating rate noiseless coder by simply not altering the input data quantization.
Abstract: A block adaptive rate controlled (BARC) image data compression algorithm is described. It is noted that in the algorithm's principal rate controlled mode, image lines can be coded at selected rates by combining practical universal noiseless coding techniques with block adaptive adjustments in linear quantization. Compression of any source data at chosen rates of 3.0 bits/sample and above can be expected to yield visual image quality with imperceptible degradation. Exact reconstruction will be obtained if the one-dimensional difference entropy is below the selected compression rate. It is noted that the compressor can also be operated as a floating rate noiseless coder by simply not altering the input data quantization. Here, the universal noiseless coder ensures that the code rate is always close to the entropy. Application of BARC image data compression to the Galileo orbiter mission of Jupiter is considered.
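The losslessness condition quoted above — exact reconstruction when the one-dimensional difference entropy is below the selected rate — can be checked per scan line with a few lines of NumPy (a sketch of the criterion, not the BARC coder itself):

```python
import numpy as np

def difference_entropy(line):
    """First-order entropy, in bits/sample, of the 1-D differences of a
    scan line -- the quantity compared against the selected BARC rate."""
    diffs = np.diff(np.asarray(line, dtype=np.int64))
    _, counts = np.unique(diffs, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

smooth_line = np.arange(0, 256, 2)   # constant difference: entropy is 0 bits
noisy_line = np.random.default_rng(1).integers(0, 256, 128)
```

A line like `smooth_line` codes losslessly at any positive rate, while the difference entropy of `noisy_line` exceeds a 3.0 bits/sample budget, so the rate-controlled mode would coarsen its quantization.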

Journal ArticleDOI
TL;DR: The centroid, variance, eccentricity, and angle of the principal axes are suggested as pertinent parameters, and an image chopping technique for fast measurement of these parameters is described.
Abstract: In several applications it is necessary to get a fast evaluation of the shape and quality of an image of a point source. Since a complete description by means of the point spread function involves time-consuming data processing, it is necessary to limit the evaluation to a set of parameters which can be measured at high rates. Such a set can be defined in terms of the first and second moments of the distribution. In this article the centroid, variance, eccentricity, and angle of the principal axes are suggested as pertinent parameters, and an image chopping technique for fast measurement of these parameters is described.
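The suggested parameters all follow from the first and second moments of the intensity distribution; a direct (non-chopped) computation might look like this sketch, with the chopping technique serving as the fast hardware analogue:

```python
import numpy as np

def spot_parameters(img):
    """Centroid, variances, principal-axis angle, and eccentricity of an
    intensity distribution from its first and second moments."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    cx, cy = (x * img).sum() / total, (y * img).sum() / total
    mxx = ((x - cx) ** 2 * img).sum() / total
    myy = ((y - cy) ** 2 * img).sum() / total
    mxy = ((x - cx) * (y - cy) * img).sum() / total
    # Principal axes: eigen-directions of the 2x2 second-moment matrix.
    angle = 0.5 * np.arctan2(2 * mxy, mxx - myy)
    spread = np.hypot(mxx - myy, 2 * mxy)
    l1 = (mxx + myy + spread) / 2        # major-axis variance
    l2 = (mxx + myy - spread) / 2        # minor-axis variance
    ecc = np.sqrt(1 - l2 / l1) if l1 > 0 else 0.0
    return (cx, cy), (mxx, myy), angle, ecc

# A symmetric Gaussian spot: centroid at the peak, eccentricity near zero.
yy, xx = np.mgrid[:65, :65]
spot = np.exp(-((xx - 40.0) ** 2 + (yy - 20.0) ** 2) / 18.0)
(cx, cy), _, _, ecc = spot_parameters(spot)
```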

Patent
11 Sep 1979
TL;DR: In this paper, a TV camera obtains a moving image, which is processed by computer, for activities such as quality control of metals and analysis of radiological images; the speed and image quality are increased by a multiprocessor system.
Abstract: In a number of activities, such as quality control of metals and analysis of radiological images, a TV camera obtains a moving image which is processed by computer. The speed and image quality are increased by a multiprocessor system. The output from a TV camera (1) is applied to a monitor (3) and tape recorder (4), and also to an A/D converter (2). The digitised signal is routed via a megabus (18) to a D/A converter(s) for application to a screen (6) for inspection. The digital signal is also applied to microprocessor units (7-14) controlled by a microprocessor control unit (15) with a mass memory system (16). This ensures control and processing tasks are performed and also allows interactive operation via a teletypewriter (17).

Proceedings ArticleDOI
02 Nov 1979
TL;DR: The development of a quality measure is illustrated with a series of studies using the optical power spectrum and relationships with information extraction performance are used to evaluate the effectiveness of the resulting quality measures.
Abstract: The emphasis in this paper is on frequency domain measures of image quality. The development of a quality measure is illustrated with a series of studies using the optical power spectrum. Relationships with information extraction performance are used to evaluate the effectiveness of the resulting quality measures. The studies cover several image types: conventional photography, CRT display and digitized imagery. The status of physical quality measures, as tools for display development and evaluation, is assessed.

Journal ArticleDOI
01 Jun 1979
TL;DR: A new technique for image compression and/or enhancement is presented that comprises dividing the two-dimensional spectrum into low- and high-frequency components and digitizing the latter with a tapered, randomized quantizer.
Abstract: A new technique for image compression and/or enhancement is presented. The method comprises dividing the two-dimensional spectrum into low- and high-frequency components and digitizing the latter with a tapered, randomized quantizer. A version of the system in which the highs component is adaptively adjusted gives improved picture quality.
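The synthetic-highs idea can be sketched with an FFT split and a dithered uniform quantizer standing in for the paper's tapered, randomized quantizer (the cutoff and step values are arbitrary, and the quantizer form is an assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

def split_and_quantize(img, cutoff=0.1, step=0.2):
    """Split the 2-D spectrum into lows and highs, then quantize only the
    highs with a dithered ("randomized") uniform quantizer."""
    spec = np.fft.fft2(img)
    fy, fx = np.meshgrid(np.fft.fftfreq(img.shape[0]),
                         np.fft.fftfreq(img.shape[1]), indexing="ij")
    lows = np.fft.ifft2(spec * (np.hypot(fx, fy) < cutoff)).real
    highs = img - lows
    dither = rng.uniform(-0.5, 0.5, highs.shape)      # subtractive dither
    q_highs = (np.round(highs / step + dither) - dither) * step
    return lows + q_highs

img = rng.normal(size=(64, 64))
out = split_and_quantize(img)
worst = np.max(np.abs(out - img))   # bounded by half the quantizer step
```

The subtractive dither randomizes the quantization error so it appears as structureless noise rather than contouring, which is the perceptual motivation for a randomized quantizer on the highs.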

Patent
04 Sep 1979
TL;DR: In this paper, an improved electrophotographic copying apparatus is described, which permits, during automatic copying of a desired number of copies from the same original document, the setting of the number of repeated uses of the same latent electrostatic image, thus obviating the lowering and unevenness of image quality of the copies.
Abstract: An improved electrophotographic copying apparatus is described, which permits, during automatic copying of a desired number of copies from the same original document, the setting of the number of repeated uses of the same latent electrostatic image, thus obviating the lowering and unevenness of image quality of the copies. In another embodiment of the improved electrophotographic copying apparatus, the timing of the formation of successive latent images of the same original document is controlled in accordance with the total number of copies to be made of the same original document so that the number of repeated uses of each respective latent image is made approximately equal. Also, in the latter embodiment, a signal indicating the timing for the exchange of successive original documents is generated at the time of formation of the last electrostatic image from the prior original document.

Proceedings ArticleDOI
28 Dec 1979
TL;DR: A new median filter implementation suitable for use on the video-rate "pipeline processors" provided by several commercially available image display systems is described; it is faster than the best software implementations by up to an order of magnitude, depending on the median filter window size.
Abstract: The Tukey median filter is widely used in image processing for applications ranging from noise reduction to dropped line replacement. However, implementation of the median filter on a general-purpose computer tends to be computationally very time-consuming. This paper describes a new median filter implementation suitable for use on the video-rate "pipeline processors" provided by several commercially-available image display systems. The execution speed of the new implementation is faster than the best software implementations, depending on the median filter window size, by up to an order of magnitude. It is also independent of the image dimensions up to a 512x512 pixel size.
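Pipeline processors of the kind described supply pointwise min/max operations at video rate, and an exact 3x3 median can be built from those alone. The NumPy transcription below uses the well-known min/max network for the 3x3 window (a sketch of the general approach, not the paper's implementation; interior pixels only):

```python
import numpy as np

def sort3(a, b, c):
    """Three-element sorting network using only pointwise min/max."""
    lo, hi = np.minimum(a, b), np.maximum(a, b)
    mid = np.minimum(hi, c)
    hi = np.maximum(hi, c)
    lo, mid = np.minimum(lo, mid), np.maximum(lo, mid)
    return lo, mid, hi

def median3x3(img):
    """Exact 3x3 median built entirely from pointwise min/max passes."""
    rows = [img[i:img.shape[0] - 2 + i, :] for i in range(3)]
    lo, mid, hi = sort3(*rows)                 # sort each 3-pixel column
    lo = [lo[:, i:lo.shape[1] - 2 + i] for i in range(3)]
    mid = [mid[:, i:mid.shape[1] - 2 + i] for i in range(3)]
    hi = [hi[:, i:hi.shape[1] - 2 + i] for i in range(3)]
    m1 = np.maximum(np.maximum(lo[0], lo[1]), lo[2])   # max of column minima
    m2 = sort3(mid[0], mid[1], mid[2])[1]              # median of column medians
    m3 = np.minimum(np.minimum(hi[0], hi[1]), hi[2])   # min of column maxima
    return sort3(m1, m2, m3)[1]
```

Each `np.minimum`/`np.maximum` call corresponds to one whole-frame pipeline pass, so the running time depends on the number of passes rather than on the image dimensions — consistent with the abstract's observation.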

Journal ArticleDOI
TL;DR: It is shown that, in xeromammography with its inherently low-contrast structures, an optimal exposure exists which optimizes simultaneously all low-contrast edges; this last finding, coupled with experimental results, immediately suggests the possibility of automatic exposure termination in xeromammography.
Abstract: This work undertakes a detailed system-based analysis of the xeromammographic process starting from basic considerations. Both the edge enhancement and wide-recording latitude, the two principal characteristics of xeroradiography, are shown to bear an intimate relationship to the electric-field distribution. Criteria and methods are formulated for optimizing xeromammographic image quality and a procedure is developed for calculating the white gap. Densitometric curves are derived for both positive- and negative-mode xeroradiography and found to be in excellent agreement with experiment. The question of image linearity is examined carefully and a threshold value of the electrostatic contrast is established, which sets a natural criterion for the application of Fourier analysis. Furthermore, it is shown that, in xeromammography with its inherently low-contrast structures, an optimal exposure exists which optimizes simultaneously all low-contrast edges. This last finding, coupled with experimental results, suggests immediately the possibility of an automatic exposure termination in xeromammography. Beam hardening is also investigated and it is shown that increased filtration combined with a lower bias potential leads to substantial dose reduction without significant loss of image quality. The paper concludes with a discussion of scattered radiation and how it affects xeromammographic image quality.

Patent
08 Mar 1979
TL;DR: In this article, the authors improve the picture quality by securing a scanning image conversion for the polar coordinate image into the right-angled coordinate image through the digital signal processing technique as well as ensuring generation of the picture elements between scanning lines via the interpolation.
Abstract: PURPOSE: To improve picture quality by converting the scanned polar-coordinate image into a rectangular-coordinate image through digital signal processing, and by generating picture elements between scanning lines via interpolation.

Proceedings ArticleDOI
06 Jul 1979
TL;DR: A resolution of 2.5 to 3 line pairs is now available on the Siemens Videomed H television monitor, which approximates that available on 100 mm film.
Abstract: The image quality available on photofluorographic spot films (70, 100 or 105 mm) has been gradually improving as high-resolution image intensification has evolved. This paper describes that a resolution of 2.5 to 3 line pairs is now available on the Siemens Videomed H television monitor. Such resolution now approximates that available on 100 mm film. The authors have been using a 70 mm camera to record the television image and have also modified a multiformat camera to record images in fluoroscopic examinations. This development is described and illustrated using clinical examples. The reduction in radiation and film cost is emphasized. © (1979) SPIE--The International Society for Optical Engineering.

Journal ArticleDOI
TL;DR: The reconstructed images of incoherent objects show the usefulness of the device.
Abstract: The rotation and data extraction of a linear array image sensor is controlled by a minicomputer, so that the desired information about the optical field at the face of the sensor can be extracted effectively. To obtain operations for the data, three kinds of averagings are adopted, that is, (1) simple averaging of intensity fluctuation at each sampling point, (2) second-order cross-correlation of fluctuations of intensities at two different sampling points, and (3) triple correlation of intensity fluctuations at three different sampling points. They are used (a) for the derivation of images averaged over any desired time interval, (b) for the measurement of the modulus of coherence function, and (c) for the measurement of a complex coherence function by an intensity triple correlator aiming at the imaging of incoherent objects with general shapes and measuring the statistical characteristics of the turbulence. Typically the characteristics of the system are: (1) observation area: over a disk of 8-mm diam; (2) spatial sampling interval: 50 μm; (3) temporal sampling interval: 500 μsec; (4) coherence function over the observation area is derived within 5 min and stored in a minicomputer, (5) by applying the Fourier transform to the observed coherence function the image or coherence function of the object is obtained in 2 min and is also displayed as the output of the minicomputer. The reconstructed images of incoherent objects show the usefulness of the device.

Journal ArticleDOI
TL;DR: It was found that x-ray machine useful beams could be reliably measured using nonscreen films, and the main utility of the phantom was to identify cases of poor image quality.
Abstract: The design and implementation of a program to investigate remote quality assurance testing for film mammography is described. The measurements included tube output, x-ray machine and processor stability, and film image quality. Mammography phantoms and film sensitometric strips were distributed monthly to 24 regional hospitals. Most of the hospital processors and x-ray machines performed in a stable manner during the 12-month test period. It was found that x-ray machine useful beams could be reliably measured using nonscreen films. The main utility of the phantom was to identify cases of poor image quality. The measurements performed on the phantom image could not be used to diagnose specific causes of poor images.

Patent
22 Mar 1979
TL;DR: In this paper, the development bias voltage is changed so that the same electric field acts during development of the electrostatic latent image, compensating for its deterioration and yielding copies of uniform image quality over the whole run.
Abstract: PURPOSE: To compensate for deterioration of the electrostatic latent image and to obtain copies of uniform image quality throughout the whole run, by changing the development bias voltage so that the same electric field acts during development.

Journal ArticleDOI
TL;DR: In this paper, a general theory of imaging a randomly aberrated wave is presented and a special emphasis is placed upon identifying parameters and characteristics necessary to simulate imaging through the turbulent atmosphere.
Abstract: A general theory of imaging a randomly aberrated wave is presented. Special emphasis is placed upon identifying parameters and characteristics necessary to simulate imaging through the turbulent atmosphere. Two components are found in the average image and optical transfer function. Energy and intensity ratios are calculated. Special cases studied include wave amplitude and phase fluctuations alone and in combination, image contrast, predetection compensation, and stellar speckle interferometry. Observations of cores in star images and laboratory simulations give support to our calculation of an additional term in the long-exposure image and optical transfer function.

Journal ArticleDOI
TL;DR: The change in the depiction of objects with a very high attenuation difference relative to their surroundings appears not to be linear, but proportional to the square root of the exposure.
Abstract: When the speed of a recording medium is doubled, the background quantum mottle is increased by a factor of √2. However, the signal/noise ratio changes not in proportion to the square root of the exposure, but in a linear fashion, i.e. by a factor of 2. The change in the depiction of objects with a very high attenuation difference relative to their surroundings appears not to be linear, but proportional to the square root of the exposure. Such objects (metal wire meshes, lead bar grids) should thus be avoided in routine evaluation of image quality, since they give incomplete information as to image impairment when high-speed recording media are used.
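The mottle relationship quoted above is Poisson counting statistics and is easy to verify numerically (the quantum counts and sample size below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

def relative_mottle(mean_quanta, n=200_000):
    """Relative fluctuation (std/mean) of Poisson quantum noise at a
    given exposure level."""
    counts = rng.poisson(mean_quanta, n)
    return counts.std() / counts.mean()

slow = relative_mottle(10_000)   # full exposure
fast = relative_mottle(5_000)    # doubled speed -> half the quanta
ratio = fast / slow              # expected near sqrt(2)
```

Halving the number of detected quanta raises the relative mottle by √2, since std/mean for a Poisson process scales as 1/√N.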


Journal Article
TL;DR: Pulmonary perfusion and liver-spleen images of excellent quality were obtained in the rabbit using the pinhole collimator and Ta-178-labeled agents, although the image quality is superior with Tc-99m under comparable conditions.
Abstract: Tantalum-178 is a short-lived radionuclide (T1/2 = 9.3 min) and emits primarily 56- to 64-keV characteristic x-rays. We have determined the imaging characteristics with this radionuclide and a large-field-of-view Anger camera. With a pinhole collimator, good spatial resolution is possible with Ta-178, although the image quality is superior with Tc-99m under comparable conditions. Spatial resolution with parallel-hole or converging collimators was much less satisfactory with Ta-178 because of septal penetration by high-energy photons. Pulmonary perfusion and liver-spleen images of excellent quality were obtained in the rabbit using the pinhole collimator and Ta-178-labeled agents.