
Showing papers on "Image quality published in 1970"


Journal ArticleDOI
Dorian Kermisch
TL;DR: A quantitative analysis of the effect on image reconstruction of discarding the amplitude information contained in a wavefront reflected by a diffusely reflecting, coherently illuminated surface is given.
Abstract: A quantitative analysis of the effect on image reconstruction of discarding the amplitude information contained in a wavefront reflected by a diffusely reflecting, coherently illuminated surface is given. The image reconstruction from a phase record alone is analyzed for the perfect and imperfect phase-matching cases.
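As a rough numerical illustration of the effect the paper analyzes, the sketch below simulates a diffusely reflecting object under coherent illumination, discards the amplitude of the wavefront in the recording plane, and reconstructs from the phase record alone. The object, grid size, and Fourier-transform propagation model are illustrative assumptions, not the paper's analysis.

```python
# Minimal sketch (not the paper's analysis): reconstruct an image from the
# phase of a diffusely scattered wavefront, discarding its amplitude.
import numpy as np

rng = np.random.default_rng(0)

# Binary object with a random "diffuser" phase, standing in for a diffusely
# reflecting, coherently illuminated surface.
obj = np.zeros((128, 128))
obj[40:90, 50:80] = 1.0
field = obj * np.exp(2j * np.pi * rng.random(obj.shape))

# Model propagation to the recording plane as a Fourier transform.
wavefront = np.fft.fft2(field)

# Phase-only record: set the amplitude to a constant (perfect phase matching).
phase_only = np.exp(1j * np.angle(wavefront))

# Reconstruct and compare intensities with the original object.
recon_intensity = np.abs(np.fft.ifft2(phase_only)) ** 2
corr = np.corrcoef(obj.ravel(), recon_intensity.ravel())[0, 1]
print(f"object / phase-only reconstruction intensity correlation: {corr:.3f}")
```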

78 citations


Journal ArticleDOI
TL;DR: This work examines degradations for the binary Fourier transform hologram and presents a method by which the plotting procedure may be designed so as to yield a most faithful reconstructed image.
Abstract: Generation of holograms by computer allows the possibility of better controlling the hologram formation process and of displaying a synthesized image in the case where the object does not exist physically. However, limitations of equipment used to plot the hologram can cause degradation of the reconstructed image. We examine these degradations for the binary Fourier transform hologram and present a method by which the plotting procedure may be designed so as to yield a most faithful reconstructed image. Experimental results which support the analysis are included.
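The kind of hologram studied can be sketched numerically: the interference of the object's Fourier spectrum with an off-axis reference is hard-clipped to two levels, and the binary transmittance is Fourier-transformed to reconstruct the image. The carrier frequency, clipping threshold, and object below are illustrative assumptions; the paper's optimized plotting procedure is not reproduced.

```python
# Simplified sketch of a binary Fourier-transform hologram: hard-clip the
# interference of the object spectrum with an off-axis reference, then
# reconstruct numerically.  Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
N = 256

# Centered object transparency; a random phase spreads its spectrum.
obj = np.zeros((N, N), complex)
obj[112:144, 112:144] = np.exp(2j * np.pi * rng.random((32, 32)))

# Field in the hologram (Fourier) plane plus an off-axis plane-wave reference
# with a carrier of 0.25 cycles/sample along x.
F = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(obj)))
x = np.arange(N)
ref = np.abs(F).max() * np.exp(2j * np.pi * 0.25 * x)[None, :]

# Two-level hologram: threshold the interference intensity at its median.
intensity = np.abs(F + ref) ** 2
hologram = (intensity > np.median(intensity)).astype(float)
print("binary hologram fill factor:", hologram.mean())

# Numerical reconstruction: Fourier-transform the binary transmittance; the
# twin images appear roughly N/4 samples either side of the zero-order term.
recon = np.fft.fftshift(np.fft.fft2(hologram))
```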

55 citations


Journal ArticleDOI
TL;DR: Methods based on optical communication theory are being developed to study the transfer of diagnostically important information by the radiological process; images of the highest diagnostic quality best facilitate, and least inhibit, a one-to-one relationship between the diagnosis given and the actual status of the part examined.
Abstract: The central problem in the study of radiographic image quality is to gain knowledge of the effect of physical image quality on diagnosis. Methods are being developed to study transfer of diagnostically important information by the radiological process. Parts of this process are being analyzed via optical communication theory methods. Radiographic images can be assessed according to physical or diagnostic criteria. An image that reproduces the primary input most faithfully from a physical standpoint may not contain the most useful diagnostic information. Images of highest diagnostic quality best facilitate and least inhibit achievement of a one-to-one relationship between diagnosis given (output) and actual status of the part examined (input).

45 citations



Book ChapterDOI
TL;DR: In this paper, the spatial and time coherence requirements for holograms made with a thermal source are investigated, and a general theory of image formation for such holograms is derived.
Abstract: When a light source other than a laser is employed for making a hologram, a considerable reduction of image quality results, owing to the small spatial and time coherence of the source. In 1967, Leith and Upatnieks made a hologram of excellent quality with their achromatic fringe system (ref. 1). They placed a photographic plate at the image plane of a grating and recorded a hologram between two beams of different diffraction orders, in either of which could be inserted the transparent object to be recorded. They dealt with the time coherence requirement, which made possible the use of a high-pressure mercury arc, but not with the spatial coherence. Kato and Suzuki later extended their method to the spatial coherence problem and obtained some qualitative results relating the source size to the resolution limit of the reconstructed image (ref. 2). These studies, however, were concerned mainly with their particular arrangements, and no attempt was made to derive a general theory of image formation for a hologram made with a thermal source.

8 citations


Journal ArticleDOI
TL;DR: An expression relating the edge density trace to the line spread function, independent of film gamma, is derived for low-contrast edges, and the errors in resolution and passband incurred when it is applied to higher-contrast edges and to truncated edge traces are evaluated.
Abstract: The evaluation of image quality parameters directly from the image is often required when the imaging system point spread is not known. The derivation of the parameter values from microdensitometer scans across the images of inferred edges in the scene format is a popular technique. This paper presents the results of a study to determine the errors incurred and the simplification that results from making certain approximations in deriving the values of two particular parameters. The measures of image quality sought are "resolution", which is defined as the area under the transfer function, and "passband", which is the area under the square of the transfer function. An expression is derived for low contrast edges which provides a simple relation between the edge density trace and the line spread function which is independent of film gamma, and the error in resolution and passband resulting from its use as an approximation for edges of higher contrast is given. The second approximation considered is due to the practical requirement that the density trace of the edge is truncated. Closed-form edge-density functions derived from transfer functions consisting of weighted superpositions of exponential and gaussian terms were terminated by "visual" inspection and used to evaluate the errors introduced in the computations of the resolution and passband.
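The two figures of merit used here can be illustrated numerically: differentiate an edge trace to obtain the line spread function, Fourier-transform it to obtain the transfer function, then integrate. The Gaussian blur width and sample spacing below are illustrative assumptions, not values from the paper.

```python
# Sketch of the two measures discussed: "resolution" = area under the
# transfer function, "passband" = area under its square, both derived from
# a (simulated) microdensitometer edge trace.
import numpy as np

dx = 0.01                               # sample spacing along the scan, mm
x = np.arange(-2.0, 2.0, dx)
sigma = 0.05                            # assumed Gaussian spread width, mm

# Simulated edge trace: integral of a Gaussian spread function.
lsf_true = np.exp(-x**2 / (2 * sigma**2))
edge = np.cumsum(lsf_true)
edge /= edge[-1]

# Line spread function = derivative of the edge trace, normalized to unit area.
lsf = np.gradient(edge, dx)
lsf /= np.trapz(lsf, dx=dx)

# Transfer function = modulus of the Fourier transform of the LSF.
freqs = np.fft.rfftfreq(len(lsf), d=dx)          # cycles/mm
tf = np.abs(np.fft.rfft(lsf))
tf /= tf[0]

resolution = np.trapz(tf, freqs)                 # area under the transfer function
passband = np.trapz(tf**2, freqs)                # area under its square
print(f"resolution = {resolution:.2f} cycles/mm, passband = {passband:.2f} cycles/mm")
```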

3 citations


21 Sep 1970
TL;DR: In this paper, student interpreters viewed nine vertical aerial photographs to judge their interpretability and to answer questions concerning their contents, and a single index called modulation transfer function area (MTFA) was used as the measure of image quality, and this measure was compared with image quality judgments and with photointerpreter performance.
Abstract: Student military interpreters viewed nine vertical aerial photographs to judge their interpretability and to answer questions concerning their contents. Image quality was manipulated to produce 32 levels of degradation for each of the nine scenes. A single index called modulation transfer function area (MTFA) was used as the measure of image quality, and this measure was compared with image quality judgments and with photointerpreter performance in answering questions about image contents. Product-moment correlations between MTFA and interpreter performance averaged +.72 for the nine scenes; between MTFA and image quality judgments, they averaged +.90; between quality judgments and performance, the average was +.70. As a criterion for imaging system evaluation and specification, MTFA appears worthy of further exploration. (Author)
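MTFA is commonly illustrated as the area between a system MTF and an observer threshold-detectability curve up to their crossover frequency; the sketch below computes it alongside a product-moment correlation of the kind reported. All curves and scores here are invented illustrations, not the report's data.

```python
# Sketch of the MTFA figure of merit and of a product-moment correlation
# between an image-quality index and interpreter performance.
import numpy as np

f = np.linspace(0.0, 10.0, 1000)            # spatial frequency, cycles/mm
mtf = np.exp(-0.4 * f)                      # assumed system MTF
threshold = 0.02 * np.exp(0.25 * f)         # assumed threshold-detectability curve

# MTFA: area between the two curves below their crossover frequency.
below_crossover = mtf > threshold
mtfa = np.trapz((mtf - threshold)[below_crossover], f[below_crossover])
print(f"MTFA = {mtfa:.3f}")

# Product-moment (Pearson) correlation between quality index and performance
# scores (values invented for illustration).
quality = np.array([1.2, 0.9, 1.6, 0.7, 1.4])
performance = np.array([0.62, 0.55, 0.80, 0.40, 0.71])
r = np.corrcoef(quality, performance)[0, 1]
print(f"product-moment correlation r = {r:+.2f}")
```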

2 citations


Proceedings ArticleDOI
01 Jun 1970
TL;DR: The one-dimensional line-scan analysis proposed by Beall is extended to encompass a two-dimensional sampling process for evaluating the image transfer in a fiber bundle and it is indicated that there is a significant difference in resolution between the static case and the dynamically-scanned case.
Abstract: A statistical analysis of degradation in a scanned image has been carried out by Beall (Ref. 1). In that analysis, the mean-squared value of the error function between the object and image was used as a measure of the degree of fidelity in the image, and numerical data have been published for several particular cases. A similar approach to image transfer in optical fiber bundles is presented. In this paper, the one-dimensional line-scan analysis proposed by Beall is extended to encompass a two-dimensional sampling process for evaluating the image transfer in a fiber bundle. Interesting results occur when the evaluation technique is extended to the comparison of the imaging properties of a static bundle and a bundle which is scanned dynamically (Ref. 2). This analysis indicates that there is a significant difference in resolution between the static case and the dynamically scanned case. The difference is found to be largely determined by the fiber configuration within the bundle. Experimental results for the case of the static bundle will be presented.
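A rough one-dimensional illustration of the fidelity measure used (mean-squared error between object and image) for a static versus a dynamically scanned bundle is sketched below. The fiber size, test object, and scan model are assumptions for illustration; neither Beall's analysis nor the authors' two-dimensional extension is reproduced.

```python
# Rough illustration: mean-squared error between a 1-D object and its
# fiber-bundle image, where each fiber outputs the average intensity over
# its core, for a static bundle and a dynamically scanned one.
import numpy as np

n, core = 600, 10                                 # samples per line, samples per fiber
x = np.arange(n)
obj = 0.5 + 0.5 * np.sin(2 * np.pi * x / 75.0)    # smooth test object

def bundle_image(obj, core, shift=0):
    """Image through a bundle whose fiber boundaries are offset by `shift` samples."""
    rolled = np.roll(obj, -shift)
    cores = rolled.reshape(-1, core)              # group samples fiber by fiber
    out = np.repeat(cores.mean(axis=1), core)     # each fiber reports its mean
    return np.roll(out, shift)

static = bundle_image(obj, core)
mse_static = np.mean((obj - static) ** 2)

# Dynamic scan: average the images obtained as the bundle moves across the
# object through every sub-fiber offset.
scanned = np.mean([bundle_image(obj, core, s) for s in range(core)], axis=0)
mse_scanned = np.mean((obj - scanned) ** 2)

print(f"MSE static = {mse_static:.4e}, MSE scanned = {mse_scanned:.4e}")
```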

2 citations


Journal ArticleDOI
01 Jan 1970
TL;DR: The proposed encryption methods are effective for strong-edge images, making them suitable for lesion-marked fundus images, whereas random sign-based JPEG 2000, DFT AOIs, and DCT AOIs encrypt the images imperfectly.
Abstract: This paper proposes a copyright- and privacy-protected diabetic retinopathy (DR) diagnosis network. In the network, DR lesions are automatically detected from a fundus image by first estimating the non-uniform illumination of the image and then detecting the lesions from the balanced image using level-set evolution without re-initialization. The lesions are subsequently marked using contours. The lesion-marked fundus image is then shared for intra- or inter-hospital network diagnosis with copyright and privacy protection. A watermarking technique is used for image copyright protection, and visual encryption is used for privacy protection. Sign scrambling of the two-dimensional (2D) discrete cosine transform (DCT) and the one-dimensional (1D) DCT is proposed for lesion-marked fundus image encryption. The proposed encryption methods are compared with other transform-based encryption methods, i.e., discrete Fourier transform (DFT) amplitude-only images (AOIs), DCT AOIs, and JPEG 2000-based discrete wavelet transform (DWT) sign scrambling, which were proposed for image trading systems. Since the encryption is performed after DR diagnosis, the contours used for DR marking must also be visually encrypted. The proposed encryption methods are effective for strong-edge images, making them suitable for lesion-marked fundus images, while random sign-based JPEG 2000, DFT AOIs, and DCT AOIs encrypt the images imperfectly. Moreover, the proposed methods are better in terms of image quality. In addition, watermarking performance and compression performance are confirmed by experiments.
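A minimal sketch of sign scrambling in the 2D DCT domain as a visual-encryption step is given below, assuming a key-seeded pseudorandom sign mask. The block structure, the paper's 1D DCT variant, and the watermarking stage are not reproduced.

```python
# Minimal sketch of 2-D DCT sign scrambling for visual encryption, assuming
# a key-seeded pseudorandom +/-1 mask over the full-frame DCT coefficients.
import numpy as np
from scipy.fftpack import dct, idct

def dct2(a):
    return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(a):
    return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def sign_mask(shape, key):
    """Key-seeded pseudorandom sign mask; the mask is its own inverse."""
    return np.random.default_rng(key).choice([-1.0, 1.0], size=shape)

def scramble(img, key):
    return idct2(dct2(img.astype(float)) * sign_mask(img.shape, key))

def unscramble(enc, key):
    return idct2(dct2(enc) * sign_mask(enc.shape, key))

# Round-trip check on a toy image standing in for a lesion-marked fundus image.
img = np.tile(np.linspace(0, 255, 64), (64, 1))
enc = scramble(img, key=1234)
dec = unscramble(enc, key=1234)
print("max round-trip error:", np.abs(img - dec).max())
```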

1 citation


Journal ArticleDOI
01 Jan 1970
TL;DR: A new 2D median filter design is presented that uses a simple conditional filtering technique and executes fewer computations than related designs while achieving superior image quality.
Abstract: Impulse noise removal is a very important preprocessing operation in many computer vision applications. It is usually accomplished with a median filter, which requires extensive sorting and therefore consumes considerable power. This paper presents a new design of a 2D median filter that utilizes a simple conditional filtering technique and executes fewer computations than related designs while achieving superior image quality. An experimental FPGA implementation of the proposed filtering scheme is compact, fast, and low in power consumption.
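The conditional idea can be sketched in software: filter a pixel only when an impulse test flags it, leaving clean pixels untouched. The extreme-value impulse test and 3x3 window below are illustrative assumptions; the paper's FPGA architecture is not reproduced.

```python
# Software analogue of a conditional 2-D median filter for impulse
# (salt-and-pepper) noise: only pixels flagged as impulses are filtered.
import numpy as np

def conditional_median(img, low=0, high=255):
    """Replace a pixel by its 3x3 median only if it looks like an impulse."""
    padded = np.pad(img, 1, mode='edge')
    out = img.copy()
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            if img[r, c] <= low or img[r, c] >= high:   # impulse test: extreme value
                out[r, c] = np.median(padded[r:r+3, c:c+3])
    return out

# Usage: corrupt a ramp image with ~10% salt-and-pepper noise and clean it.
rng = np.random.default_rng(0)
clean = np.tile(np.arange(64, dtype=float), (64, 1)) * 4
noisy = clean.copy()
noise = rng.random(clean.shape)
noisy[noise < 0.05] = 0
noisy[noise > 0.95] = 255
restored = conditional_median(noisy)
print("MSE noisy   :", np.mean((noisy - clean) ** 2))
print("MSE restored:", np.mean((restored - clean) ** 2))
```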

1 citation


Book ChapterDOI
01 Jan 1970
TL;DR: Optical instability of the earth’s atmosphere impairs the quality of photographs of astronomical objects, and one method of improving image quality is to photograph during periods of relative atmospheric quiet.
Abstract: Optical instability of the earth’s atmosphere impairs the quality of photographs of astronomical objects. One method of improving image quality is to photograph during periods of relative atmospheric quiet.
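A computational analogue of photographing during moments of atmospheric quiet is to record many short exposures and keep only the sharpest before combining them. The sharpness metric (variance of a discrete Laplacian), synthetic frames, and selection fraction below are illustrative assumptions, not a method from this chapter.

```python
# Sketch of selecting the sharpest short exposures from a sequence and
# stacking them, as an illustration of exploiting moments of good seeing.
import numpy as np

def sharpness(frame):
    """Variance of a discrete Laplacian: higher when fine detail survives."""
    lap = (-4 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

def select_and_stack(frames, keep_fraction=0.1):
    """Keep the sharpest fraction of frames and average them."""
    scores = np.array([sharpness(f) for f in frames])
    k = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-k:]
    return np.mean([frames[i] for i in best], axis=0)

# Usage with synthetic frames: a star image blurred by a varying amount.
rng = np.random.default_rng(0)
def blurred_star(sigma, n=64):
    y, x = np.mgrid[0:n, 0:n] - n // 2
    return np.exp(-(x**2 + y**2) / (2 * sigma**2))

frames = [blurred_star(sigma) for sigma in rng.uniform(1.0, 5.0, 50)]
stacked = select_and_stack(frames)
print("peak of stacked frame:", stacked.max())
```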