
Showing papers on "Standard test image published in 1991"


Journal ArticleDOI
TL;DR: A modified Bayes decision rule is used to classify a given test image into one of C possible texture classes and the classification power of the method is demonstrated through experimental results on natural textures from the Brodatz album.
Abstract: Consideration is given to the problem of classifying a test textured image that is obtained from one of C possible parent texture classes, after possibly applying unknown rotation and scale changes to the parent texture. The training texture images (parent classes) are modeled by Gaussian Markov random fields (GMRFs). To classify a rotated and scaled test texture, the rotation and scale changes are incorporated in the texture model through an appropriate transformation of the power spectral density of the GMRF. For the rotated and scaled image, a bona fide likelihood function that shows the explicit dependence of the likelihood function on the GMRF parameters, as well as on the rotation and scale parameters, is derived. Although, in general, the scaled and/or rotated texture does not correspond to a finite-order GMRF, it is possible nonetheless to write down a likelihood function for the image data. The likelihood function of the discrete Fourier transform of the image data corresponds to that of a white nonstationary Gaussian random field, with the variance at each pixel (i,j) being a known function of the rotation, the scale, the GMRF model parameters, and (i,j). The variance is an explicit function of the appropriately sampled power spectral density of the GMRF. The estimation of the rotation and scale parameters is performed in the frequency domain by maximizing the likelihood function associated with the discrete Fourier transform of the image data. Cramer-Rao error bounds on the scale and rotation estimates are easily computed. A modified Bayes decision rule is used to classify a given test image into one of C possible texture classes. The classification power of the method is demonstrated through experimental results on natural textures from the Brodatz album.
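The frequency-domain likelihood and decision rule described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the flat example PSDs, the equal-prior decision rule, and all function names are our assumptions.

```python
import numpy as np

def dft_log_likelihood(image, psd):
    # Treat the DFT coefficients as independent Gaussians whose
    # per-frequency variance is the sampled power spectral density `psd`.
    F = np.fft.fft2(image)
    power = np.abs(F) ** 2 / image.size
    return float(-0.5 * np.sum(np.log(psd) + power / psd))

def classify(image, class_psds):
    # Equal-prior Bayes rule: pick the class whose PSD maximizes the likelihood.
    return int(np.argmax([dft_log_likelihood(image, p) for p in class_psds]))
```

For white noise the PSD is flat, so two flat PSDs with different variances already separate textures of different pixel variance; the paper's rotated/scaled GMRF PSDs slot into `class_psds` the same way.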

323 citations


Patent
12 Sep 1991
TL;DR: In this paper, a mechanism for controlling the manner in which digitized image data files are stored on a digital data storage medium, such as a compact disc, in an opto-electronic image digitizing system was presented.
Abstract: A mechanism for controlling the manner in which digitized image data files are stored on a digital data storage medium, such as a compact disc, in an opto-electronic image digitizing system in which a plurality of images that have been captured on an image storage medium, such as 35 mm film, are converted into a digital image representation and stored as respective high resolution image-representative data files. For each high resolution image-representative data file, an associated low resolution digitized image is stored within a low resolution index image data file. Selected ones or all of the low resolution image-representative data files within the index file may be read out and displayed as corresponding low resolution portions of a montage image to facilitate rapid viewing of the images.

130 citations


Patent
21 Mar 1991
TL;DR: In this article, a negative feedback chain is designed to correct all the faults of the projected image, said faults resulting as much from the projection tubes and from their optic systems (geometrical or focusing faults) as from the differences between these tubes (faults in convergence, colorimetry or uniformity of brilliance).
Abstract: A projection display device is provided with a negative feedback chain designed to correct all the faults of the projected image, said faults resulting as much from the projection tubes and from their optic systems (geometrical or focusing faults) as from the differences between these tubes (faults in convergence, colorimetry or uniformity of brilliance). The negative feedback chain comprises a test chart generating means, a means to retake the entire projected image and means for the comparison of the signals coming from the test image and from the retaken image. Correction signals are prepared on the basis of this comparison and are applied to the different control circuits defining the projection characteristics. The disclosed device can also be used for the real-time follow-up of the adjustments obtained in a phase prior to the projection of the sequence of useful video images. The invention can be applied notably to video projectors, retro-projection display devices or beam-index tubes.

82 citations


Proceedings ArticleDOI
01 Mar 1991
TL;DR: In this paper, a least squares fit of a paraboloid function to the surface generated by correlating a reference image feature against a test image search area is used to locate unique image features to subpixel accuracies.
Abstract: A digital image processing inspection system is under development at Oak Ridge National Laboratory that will locate image features on printed material and measure distances between them to accuracies of 0.001 in. An algorithm has been developed for this system that can locate unique image features to subpixel accuracies. It is based on a least-squares fit of a paraboloid function to the surface generated by correlating a reference image feature against a test image search area. Normalizing the correlation surface makes the algorithm robust in the presence of illumination variations and local flaws. Subpixel accuracies of better than 1/16 of a pixel have been achieved using a variety of different reference image features. 1. INTRODUCTION An algorithm has been developed at Oak Ridge National Laboratory (ORNL) that can be used to locate a variety of features or objects in a digitized image to subpixel accuracies. The algorithm uses a set of normalized correlations generated by correlating a stored reference template against a test image search area that is believed to contain the feature.
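A minimal sketch of this kind of subpixel peak localization (a separable parabola fit through the 3x3 neighborhood of the integer correlation peak; the function names are ours and this is not the ORNL implementation):

```python
import numpy as np

def subpixel_peak(corr):
    """Return (row, col) of the correlation peak to subpixel accuracy.

    Assumes the integer peak is not on the border of `corr`.
    """
    i, j = np.unravel_index(np.argmax(corr), corr.shape)

    # 1-D quadratic fits along each axis through the peak; together they
    # give the vertex of a separable paraboloid.
    def offset(m1, c, p1):
        denom = m1 - 2.0 * c + p1
        return 0.0 if denom == 0 else 0.5 * (m1 - p1) / denom

    di = offset(corr[i - 1, j], corr[i, j], corr[i + 1, j])
    dj = offset(corr[i, j - 1], corr[i, j], corr[i, j + 1])
    return i + di, j + dj
```

On an exactly quadratic surface this recovers the vertex exactly, which is why accuracies well below a pixel are attainable when the true correlation peak is locally paraboloid-like.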

47 citations


Proceedings ArticleDOI
01 Aug 1991
TL;DR: A space-filling Hilbert curve is used for mapping the two-dimensional image into a suitable one-dimensional representation and this topological mapping is spatially non-disruptive and tends to preserve local pixel correlations in the original two- dimensional image.
Abstract: This paper outlines the use of space-filling curves in transform image compression. Specifically, a space-filling Hilbert curve is used for mapping the two-dimensional image into a suitable one-dimensional representation. Compared to simple raster-scans, this topological mapping is spatially non-disruptive and tends to preserve local pixel correlations in the original two-dimensional image. Standard transform coefficient reduction and coding techniques can then be applied to the one-dimensional representation for the purposes of data compression. The advantages of the one-dimensional coding, in terms of computational cost and subjective image quality, are also discussed.
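The Hilbert mapping itself can be sketched with the standard iterative distance-to-coordinate construction (an illustration, not the paper's code; it assumes the image side length is a power of two):

```python
def hilbert_d2xy(n, d):
    """Map distance d along an order-n Hilbert curve (n a power of two)
    to (x, y) coordinates on the n x n grid."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:          # rotate the quadrant as the curve recurses
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_scan(img):
    """Flatten a 2^k x 2^k image along the Hilbert curve."""
    n = len(img)
    return [img[y][x] for x, y in (hilbert_d2xy(n, d) for d in range(n * n))]
```

Consecutive samples of the scan are always 4-adjacent pixels, which is the locality property that a raster scan lacks.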

42 citations


Patent
22 Mar 1991
TL;DR: In this paper, an integrated image recorder (100) receives first pictorial image data from an input scanner (300) and stores the same in an image disk (150); the pixels of the picture are skipped at a predetermined rate to obtain second pictorial image data having a lower resolution.
Abstract: An integrated image recorder (100) receives first pictorial image data from an input scanner (300) and stores the same in an image disk (150). The pixels of the picture are skipped at a predetermined rate to obtain second pictorial image data having a lower resolution. The second pictorial image data is transmitted to a front end processor (201, 202) and employed in editing/designing of a page as an integrated image. The integrated image is represented in a page description language and is delivered to the integrated image recorder. The integrated image recorder is operable to read out the first pictorial image data from the image disk and to convert the integrated image into a bit map using the first pictorial image. The integrated image is then delivered to an output scanner (190) and is recorded on a recording medium.
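The pixel-skipping step is straightforward to sketch (illustrative only; the patent does not specify this exact form):

```python
def skip_subsample(image, rate):
    """Derive a lower-resolution image by keeping every `rate`-th pixel
    in both directions -- the 'skipping at a predetermined rate'
    described above. `image` is a list of rows."""
    return [row[::rate] for row in image[::rate]]
```

A rate of 2 quarters the pixel count, which is what makes the second, low-resolution copy cheap enough for interactive page editing.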

25 citations


Book ChapterDOI
30 Sep 1991
TL;DR: A data redistribution algorithm which aims at dynamically balancing the workload of image processing algorithms on distributed memory processors is introduced, and the usefulness of the redistribution strategy is demonstrated by comparing the efficiency obtained with and without the elastic algorithm for a thinning algorithm which aims at extracting the skeleton of a binary image.
Abstract: In this paper, we introduce a data redistribution algorithm which aims at dynamically balancing the workload of image processing algorithms on distributed memory processors. First we briefly review state-of-the-art techniques for load balancing application-specific algorithms. Then we describe the data redistribution technique, which we term “elastic load balancing” in a general framework. We demonstrate the usefulness of our redistribution strategy by comparing the efficiency obtained with and without the elastic algorithm for a thinning algorithm which aims at extracting the skeleton of a binary image. We report experimental results obtained with a Supernode machine, based upon reconfigurable networks of 32 Transputers [Nic]. We obtain a speedup of up to 28 over the sequential algorithm, using a Mandelbrot set as a test image. Note that the speedup with a static allocation of the picture was limited to 17 with the same test image, due to the load imbalance among the processors.
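A static greedy partitioning can sketch the load-balancing idea; the paper's "elastic" scheme redistributes dynamically at run time, so this only illustrates the cost-aware splitting, with names of our choosing:

```python
def balanced_partition(costs, p):
    """Split a list of per-row workloads into p contiguous chunks whose
    totals are roughly even (greedy threshold; a static approximation of
    dynamic 'elastic' redistribution). Returns lists of row indices."""
    target = sum(costs) / p
    chunks, current, acc = [], [], 0.0
    for i, c in enumerate(costs):
        current.append(i)
        acc += c
        if acc >= target and len(chunks) < p - 1:
            chunks.append(current)
            current, acc = [], 0.0
    chunks.append(current)  # remaining rows (may be small for skewed costs)
    return chunks
```

For a Mandelbrot test image, per-row costs are highly non-uniform, which is exactly why the static equal-row allocation in the paper saturated at a speedup of 17 while rebalancing reached 28.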

19 citations


Patent
21 Jun 1991
TL;DR: In this paper, an image is tested to determine whether it is substantially like a plurality of sample images by computing statistical information about the sample images and using that statistical information to analyze the image being tested.
Abstract: An image is tested to determine whether it is substantially like a plurality of sample images by computing statistical information about the sample images and using that statistical information to analyze the image being tested. The analysis is such that the statistical parameters are standardized so that available tables of the central chi-square distribution function can be used, thereby simplifying the necessary calculations. If desired, images which have been determined by the method of this invention to be good can be used to refine the statistical information used in analyzing subsequent images. When an image is identified as unacceptable by the method of the invention, the data for that image can be decomposed in order to identify the parts of the image which are causing the image to be unacceptable.
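The standardized chi-square test can be sketched as follows (a hedged reconstruction: per-pixel mean/variance images and an externally supplied table value are assumed, and images are flattened to 1-D lists):

```python
def chi_square_statistic(test_img, mean_img, var_img):
    """Sum of squared, standardized per-pixel deviations; under the null
    hypothesis this follows a central chi-square with n degrees of
    freedom, so standard tables apply directly."""
    return sum((t - m) ** 2 / v
               for t, m, v in zip(test_img, mean_img, var_img))

def is_acceptable(test_img, mean_img, var_img, critical_value):
    """Accept the image if the statistic is below the chi-square
    critical value looked up from a standard table."""
    return chi_square_statistic(test_img, mean_img, var_img) <= critical_value
```

Decomposing the sum term by term also recovers the patent's fault-localization step: the pixels contributing the largest standardized deviations are the parts making the image unacceptable.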

17 citations


Journal ArticleDOI
TL;DR: The authors propose a computational technique for the directional analysis of piecewise linear patterns in images based on the notion of zero crossings in gradient images that has significant applications in quantitative analysis of ligament healing and in comparison of treatment methods for ligament injuries.
Abstract: The authors propose a computational technique for the directional analysis of piecewise linear patterns in images based on the notion of zero crossings in gradient images. A given image is preprocessed by a sequence of filters that are second derivatives of 2-D Gaussian functions with different scales. This gives a set of zero-crossing maps (the scale space) from which a stability map is generated. Significant linear patterns are detected from measurements on the stability map. Information regarding orientation of the linear patterns in the image and the area covered by the patterns in specific directions is then computed. The performance of the method is illustrated through applications to a simple test image made up of straight bar patterns as well as to scanning electron microscope images of collagen fibrils in rabbit ligaments. The method has significant applications in quantitative analysis of ligament healing and in comparison of treatment methods for ligament injuries.
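A sketch of the basic machinery (a sampled Laplacian-of-Gaussian kernel plus a zero-crossing map) might look like this; the kernel sizing and the neighbor-sign test are our choices, not the authors':

```python
import numpy as np

def log_kernel(sigma, size=None):
    """Sampled Laplacian of Gaussian (second derivative of a 2-D
    Gaussian); one such kernel per scale builds the scale space."""
    if size is None:
        size = int(6 * sigma) | 1  # odd width covering roughly +/- 3 sigma
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1].astype(float)
    r2 = x * x + y * y
    g = np.exp(-r2 / (2 * sigma ** 2))
    return (r2 - 2 * sigma ** 2) / sigma ** 4 * g

def zero_crossings(field):
    """Boolean map marking sign changes against the right/lower neighbor
    of each pixel in a filtered image."""
    zc = np.zeros(field.shape, dtype=bool)
    zc[:, :-1] |= np.sign(field[:, :-1]) != np.sign(field[:, 1:])
    zc[:-1, :] |= np.sign(field[:-1, :]) != np.sign(field[1:, :])
    return zc
```

Applying `zero_crossings` to the image filtered at several `sigma` values yields the set of maps from which the paper's stability map is built.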

15 citations


Patent
16 Jul 1991
TL;DR: In this article, a triangulation-based approach is used for the inspection of soldered joints in circuit board modules in electronics, where three-dimensional elevation images and two-dimensional grey-scale images of an object under test are simultaneously recorded in digitised form and plausibility checks are carried out between the two different types of image information items (in each case at corresponding locations).
Abstract: In most cases, optical test methods are used for image analysis, especially for grading circuit board modules in electronics. Laser triangulation can be used for recording three-dimensional elevation images in digitised form. During the processing of the grey-scale image, three-dimensional data are derived from two-dimensional grey-scale images which reproduce the brightness distribution. Since three-dimensional image analysis by means of triangulation has significant advantages, the inspection of soldered joints is also to be carried out by means of this method. For this purpose, three-dimensional elevation images and two-dimensional grey-scale images of an object under test are simultaneously recorded in digitised form and plausibility checks are carried out between the two different types of image information items (in each case at corresponding locations). Thus, an information item which cannot be identified due to measuring errors in the three-dimensional elevation image can be replaced by the corresponding information items of the two-dimensional grey-scale image.

14 citations


Patent
22 Mar 1991
TL;DR: In this paper, the authors proposed a feedback mechanism for correcting all the defects exhibited by projected image, these defects resulting both from the projection tubes and from their optics (geometric or focusing defects) and from differences between these tubes (defects of convergence, colorimetry, uniformity of brightness).
Abstract: The invention relates to a projection-type display apparatus provided with a feedback chain intended to correct all the defects exhibited by the projected image, these defects resulting both from the projection tubes and from their optics (geometric or focusing defects) and from differences between these tubes (defects of convergence, colorimetry, uniformity of brightness). The feedback chain includes a generator means (6) for a test-card image, a means (3) of recapturing the entire projected image, and means (4) of comparing the signals output by the test image and by the recaptured image; correction signals are formulated from this comparison and are applied to the various control circuits (5) defining the projection characteristics. The device according to the invention is used also in the real-time tracking of adjustments obtained in a phase prior to the projecting of the sequence of useful video images. The invention applies in particular to video projectors, to overhead projector display devices or to beam-index tubes.

Patent
Warren Paul
14 Nov 1991
TL;DR: In this paper, a first memory is used to store a plurality of data words corresponding to different pixels of a test image, each data word being set to a common test data value.
Abstract: An image processing apparatus comprises a first memory for storing a plurality of data words corresponding to different pixels of a test image, each data word being set to a common test data value. The data words are converted into one or more analog video signals using a digital-to-analog converter. The one or more video signals are converted, using an analog-to-digital converter, into captured data values with each captured data value corresponding to a different one of the data words stored in the first memory. The captured data values are stored in a second memory and averaged to generate a mean captured value. Any difference between the mean captured value and the test data value is determined to identify any amplification error in the captured data values. First and second tolerance values are set to be respectively greater than and less than the mean captured value. The captured data values between the first and second tolerance limits are then counted to identify any quantization errors in the captured data values.
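The averaging and tolerance-counting scheme can be sketched as follows (an interpretation of the patent text; the strict tolerance band and the return convention are our assumptions):

```python
def analyze_capture(captured, test_value, tol):
    """Given captured values of a flat-field test image, return
    (amplification_error, count_within_tolerance).

    The mean-vs-test-value difference exposes gain (amplification)
    error; the count of values inside the tolerance band around the
    mean gauges quantization spread (strict bounds chosen here)."""
    mean = sum(captured) / len(captured)
    amp_error = mean - test_value
    lo, hi = mean - tol, mean + tol
    within = sum(1 for v in captured if lo < v < hi)
    return amp_error, within
```

A perfect channel returns an amplification error of zero and a within-band count equal to the number of samples.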

Journal ArticleDOI
TL;DR: The fundamental result obtained was that regularization can improve the quality of reconstructed images with respect to the traditional least squares method for particularly small data sets and low signal-to-noise ratios.

Patent
25 Jul 1991
TL;DR: In this article, the patient is placed between an X-ray source and a fluorescent screen, which is excited by a scanning laser beam, and the resulting signal from a photomultiplier is fed via a voltage transformer and an amplifier to an analogue-digital converter.
Abstract: The patient (M) is placed between an X-ray source (1) and a fluorescent screen (4), which is excited by a scanning laser beam. The resulting signal from a photomultiplier (10) is fed via a voltage transformer (11) and an amplifier (12) to an analogue-digital converter (13). The digital image data is read into a line buffer in the image processing unit (14). The image can be analysed for high-spatial-frequency contents before processing algorithms are selected. Processing may include contrast enhancement, unsharp masking, and selected interpolation processes for image scaling. ADVANTAGE - Produces good image in selected region in short processing time.


15 Feb 1991
TL;DR: A set of compressed, then reconstructed, test images submitted to the Comet Rendezvous Asteroid Flyby (CRAF)/Cassini project is presented as part of its evaluation of near lossless high compression algorithms for representing image data.
Abstract: A set of compressed, then reconstructed, test images submitted to the Comet Rendezvous Asteroid Flyby (CRAF)/Cassini project is presented as part of its evaluation of near lossless high compression algorithms for representing image data. A total of seven test image files were provided by the project. The seven test images were compressed, then reconstructed with high quality (root mean square error of approximately one or two gray levels on an 8 bit gray scale), using discrete cosine transforms or Hadamard transforms and efficient entropy coders. The resulting compression ratios varied from about 2:1 to about 10:1, depending on the activity or randomness in the source image. This was accomplished without any special effort to optimize the quantizer or to introduce special postprocessing to filter the reconstruction errors. A more complete set of measurements, showing the relative performance of the compression algorithms over a wide range of compression ratios and reconstruction errors, shows that additional compression is possible at a small sacrifice in fidelity.
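A minimal model of the transform/quantize/reconstruct loop, using an orthonormal DCT-II and a uniform quantizer (the project's actual entropy coders and quantizer tuning are not reproduced here):

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis: row k, column i.
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0] *= np.sqrt(1.0 / n)
    m[1:] *= np.sqrt(2.0 / n)
    return m

def quantize_roundtrip(block, step):
    """Transform a block, uniformly quantize the coefficients,
    reconstruct, and report the RMS reconstruction error. Because the
    transform is orthonormal, Parseval bounds the RMS error by step/2."""
    block = np.asarray(block, dtype=float)
    d = dct_matrix(len(block))
    coeff = d @ block @ d.T
    recon = d.T @ (np.round(coeff / step) * step) @ d
    rms = float(np.sqrt(np.mean((recon - block) ** 2)))
    return recon, rms
```

Coarser quantization steps discard more coefficient precision, trading RMS error (the one-to-two gray-level figure quoted above) for higher compression after entropy coding.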

Proceedings ArticleDOI
03 Jun 1991
TL;DR: A method which was developed for finding common points from digitized video images is semiautomatic, guided by the user and is based on the least squares image matching method.
Abstract: This paper reports a method which was developed for finding common points from digitized video images. The process of finding common points is semiautomatic, guided by the user and is based on the least squares image matching method. To find a pair of control points from two images, the location of the first point has to be pointed out manually from the first image and then the corresponding point is automatically located from the second image. As an application a large image mosaic and a smaller stereo image mosaic were produced using consecutive images.
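Least squares image matching in its simplest discrete form reduces to minimizing the sum of squared differences over integer offsets; the paper's method refines such a match sub-pixel under user guidance, which this sketch omits:

```python
import numpy as np

def find_corresponding(template, search):
    """Locate the (row, col) offset in `search` minimizing the sum of
    squared differences to `template` -- the discrete core of least
    squares matching. Exhaustive search over all valid offsets."""
    th, tw = template.shape
    sh, sw = search.shape
    best, best_ssd = (0, 0), float("inf")
    for i in range(sh - th + 1):
        for j in range(sw - tw + 1):
            ssd = float(((search[i:i + th, j:j + tw] - template) ** 2).sum())
            if ssd < best_ssd:
                best, best_ssd = (i, j), ssd
    return best
```

In the semiautomatic workflow above, the user picks the point in the first image (defining `template`) and this search supplies the corresponding point in the second.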

Proceedings ArticleDOI
01 Apr 1991
TL;DR: This paper examines the effects of a modification of the mean squared error (MSE) based on the subjective importance of edges and presents computer simulations on images of a standard data set with various noise densities and investigates the application of Ll-filters with the perception-constrained cost function.
Abstract: A basic problem in digital image restoration is the smoothing or blurring of edges due to the averaging effects of most techniques. Filtering results in images that are less noisy but less sharp as well. In order to be able to effectively reduce noise while maintaining a degree of sharpness we define a cost function constrained to reflect a perception-related criterion. In this paper we examine the effects of a modification of the mean squared error (MSE) based on the subjective importance of edges. We have studied both the space-invariant and space-varying filters with standard edge detection operators. The static approach offers a simple and parallel implementation while the adaptive one gives better performance regarding sharpness and subjective quality. We present computer simulations on images of a standard data set with various noise densities and investigate the application of Ll-filters with the perception-constrained cost function. Also an analysis of the robustness of filters is included for cases when a test image is not available.
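One simple way to realize an edge-weighted MSE of the kind discussed (the gradient-based edge detector, threshold, and weight value here are illustrative assumptions, not the paper's exact cost function):

```python
import numpy as np

def edge_weighted_mse(orig, restored, thresh=0.5, edge_weight=4.0):
    """MSE modified to weight errors more heavily on edge pixels,
    reflecting the subjective importance of edges."""
    gy, gx = np.gradient(np.asarray(orig, dtype=float))
    # Simple gradient-magnitude edge map stands in for a standard
    # edge detection operator.
    w = np.where(np.hypot(gx, gy) > thresh, edge_weight, 1.0)
    err = (np.asarray(orig, float) - np.asarray(restored, float)) ** 2
    return float((w * err).sum() / w.sum())
```

Two restorations with identical plain MSE can then score very differently: the one that concentrates its error on edges is penalized more, matching the perceptual criterion.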

Patent
25 Oct 1991
TL;DR: In this paper, the authors propose to facilitate test generation and to shorten the generation time by storing test questions in a data base, combining them, and generating a test.
Abstract: PURPOSE: To facilitate test generation and to shorten the generation time by storing test questions in a data base, and combining them to generate a test. CONSTITUTION: A display input device 1 is used to input values required for retrieval, display the retrieval result, input the test constitution, display a test image, and indicate test output; a data input/output part 2 controls input data and display data; and a data base retrieval part 3 retrieves question attributes and test attributes. Further, the constitution of the test, the screen display of the test, and printer output are controlled by a test output part 4, and a test form is outputted by an output device 5. Then the data base 60 is stored with attribute information on the questions in a file 6, attribute information on the test in a file 7, layout information on the test in a file 8, and image information on the questions in a file 9. The contents of the respective files are combined to generate the test.

Journal ArticleDOI
TL;DR: The new objective picture quality scale obtained herein can estimate the mean opinion scores (MOS) of two kinds of previously untaught images whose characteristics differ greatly.
Abstract: It is important to improve the picture quality of an image reconstructed by computer simulation. It is also important that the performance of coders be evaluated. To this end, an objective picture quality scale is essential for image processing. This paper analyzes human perceptual properties in detail, formulates approximately the fundamental image disturbance factors and proposes an objective picture quality scale (PQS). As a test image, a coding error image was fed into a neural network and experiments were conducted with a method to evaluate the picture quality. Finally, the new objective picture quality scale obtained herein can estimate the mean opinion scores (MOS) of two kinds of previously untaught images whose characteristics differ greatly.

Proceedings ArticleDOI
01 May 1991
TL;DR: Compression methods based on Gabor functions are implemented for simulated nuclear medicine liver images with and without simulated lesions, and better than 2:1 compression is obtained without significant degradation in image quality.
Abstract: Compression methods based on Gabor functions are implemented for simulated nuclear medicine liver images with and without simulated lesions. The quality of the compression schemes is assessed objectively by comparing the original images to the compressed images through calculation of the Hotelling trace, an index shown to track with human observers for images from this modality. For compression based on thresholding the complex Gabor coefficients, better than 2:1 compression is obtained without significant degradation in image quality.

Patent
14 Aug 1991
TL;DR: In this article, a test image generator for a colored T.V. set is presented, which consists of a main pulse oscillator, a scanning controller, a circle controller, an image element memory, a demoding circuit, a D/A converter, and an NTSC/PAL system encoder.
Abstract: The utility model discloses a test image generator for a colored T.V. set. The utility model comprises a main pulse oscillator, a scanning controller, a circle controller, a controller for areas of image elements, an image element memory, a demoding circuit, a D/A converter, and an NTSC/PAL system encoder. The utility model can generate test images for a colored T.V. set and output video composite T.V. set signals. The utility model which uses digital and analog integrated circuits has the advantages of simple structure, reliable performance, and convenient adjustment.

Patent
14 Jun 1991
TL;DR: In this paper, the authors proposed a method to improve a large image by analyzing and calculating continuous defects from a displayed image, and operating correction necessary for each device used for forming the image.
Abstract: PURPOSE: To improve a large image by analyzing and calculating continuous defects from a displayed image, and operating correction necessary for each device used for forming the image. CONSTITUTION: An image capturing device which analyzes a screen through a spatial filter for automatic defect correction is constituted of a lens O1, a second lens O2 which applies a hold mask image to the whole input area of a photodetector PM, and a hold mask (test image) MT arranged at the same distance from the second lens O2. The capturing device detects the maximum signal when the spatial filter is exactly suited to this test image, and when one pixel of the displayed test image is at the exact position. Then, an image to be displayed is analyzed through the spatial filter; scanning, convergence, and amplitude correction is updated as necessary, and a stored value is changed according to the analyzed result. Thus, the image quality of a large picture is improved.

Patent
18 Jun 1991
TL;DR: In this paper, the eye fixation control method involves using a video camera to determine the position and/or orientation of the patient's eye, with the camera image being displayed on a monitor.
Abstract: The eye fixation control method involves using a video camera to determine the position and/or orientation of the patient's eye, with the camera image being displayed on a monitor. The image is digitised and stored in memory with a test image for the eye. The video camera image is reproduced at the same time as the test image to allow ophthalmological measurements to determine the deviation of the eye from desired values. Measurements are rejected when the eye is covered by the eye-lid. ADVANTAGE - Simple and exact

Patent
18 Dec 1991
TL;DR: In this article, the authors propose to minimize the time needed for transmission of the useless data obtained after occurrence of a fault by carrying out an interval check using the test image information received, based on the result of decision of a reception deciding means.
Abstract: PURPOSE: To minimize the time needed for transmission of the useless data obtained after occurrence of a fault by carrying out the interval check with use of the test image information received, based on the result of decision of a reception deciding means. CONSTITUTION: A 1st device 10 on the transmission side detects the medical image data every time the transmission time of the image data exceeds a fixed time set previously, and transmits the interval checking test image information A together with additional information including identification information. Meanwhile, a 2nd device 20 on the reception side identifies the received information A based on the identification information in its additional information and performs the interval check with use of the information A. Thus, when the interval check is carried out at the fixed interval set previously, the time wasted upon occurrence of a fault never exceeds that interval; the checking interval, and hence the time to be wasted, is thereby minimized.

Proceedings ArticleDOI
01 Feb 1991
TL;DR: The use of image enhancement is shown to increase significantly the cost benefit of using ICR equipment, substantiating the relationship of dollars to image quality of scanned documents.
Abstract: Image quality of scanned documents is examined in the context of a data entry department reading data from the images with intelligent character recognition (ICR) equipment. First, the equipment and its use are reviewed. An economic model of the cost benefit of the ICR equipment is developed, and the cost benefit of a specific application is examined. To complete the analysis, a test deck of documents is used to measure the specific parameters that relate to image quality and the performance of the equipment. These results are analyzed to show the impact of image quality on operating costs. A novel image processing algorithm which dramatically improves the quality of one bit per pixel images is shown to work with very low contrast documents. It also works on documents with noisy and/or dark backgrounds. A comparison of the enhanced images to those produced by off the shelf scanners is then made with several test documents. A test is made of the ICR system with the new algorithm in place to produce high quality images. The impact of the use of image enhancement is then shown to increase significantly the cost benefit of using ICR equipment. This test substantiates the relationship of dollars to image quality of scanned documents.© (1991) COPYRIGHT SPIE--The International Society for Optical Engineering. Downloading of the abstract is permitted for personal use only.

Proceedings ArticleDOI
01 Aug 1991
TL;DR: This paper describes a semi-automatic aircraft engine component motion registration system in which manual inspection of aircraft engine x-ray data was replaced by several interactive programs running on a personal computer; attempts were also made to fully automate the process, replacing the human expert with equivalent image understanding routines.
Abstract: Inspection of industrial images can be a laborious task. Automating the inspection using image processing techniques works effectively only with an appropriate human interface. This paper describes a semi-automatic aircraft engine component motion registration system. Manual inspection of aircraft engine x-ray data was replaced by the use of several interactive programs running on a personal computer. This system allowed the inspector to digitize, process, tabulate, and document test image sequences without requiring image processing experience. The new environment also provided a digital replacement for the analog densitometer previously used, as well as enabling the extraction of digital templates of arbitrary size. Once two masks were selected, measurements could be performed by correlating the pair with a sequence of images, in a batch process. Calibrated measurement results were sent automatically to file, printer, or screen; hardcopy output of found templates, superimposed on individual test images was used for visual verification. Several image processing techniques for performing correlation were surveyed and three of them were implemented. Complexity, speed, and accuracy of each are presented. The methods implemented were direct normalized cross-correlation, hierarchical normalized spatial cross-correlation, and Fourier transform based cross-correlation (using an array processor). Extensions for scale and rotational invariance are also discussed. Attempts were made to fully automate the process, replacing the human expert with equivalent image understanding routines. The methods used by the expert to select templates were criteria such as edge detail, contrast, and local histograms. These strategies were applied to automatically selected templates containing desired measurement points. Results and limitations are discussed.© (1991) COPYRIGHT SPIE--The International Society for Optical Engineering. 
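Of the three correlation methods surveyed, direct normalized cross-correlation is the easiest to sketch (equal-size patches; the function name is ours):

```python
import numpy as np

def normalized_cross_correlation(template, window):
    """Direct normalized cross-correlation of two equal-size patches:
    subtract each mean, then divide the inner product by the product of
    the patch norms, yielding a score in [-1, 1]."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom)
```

Sliding the window over a search area and taking the maximum of this score gives the batch measurement the system performs once the two masks are selected; the hierarchical and Fourier-based variants compute the same quantity faster.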

Proceedings ArticleDOI
31 Oct 1991
TL;DR: This work presents a video-based computerized method imposing minimal experimental constraints and exhibiting encouraging performance while utilizing reasonable off-the-shelf hardware and software resources.
Abstract: Eye movements have been studied for over 40 years. Various image processing methods have been applied. These range from low sampling manual measurements on 35 mm photographs [1] to sophisticated infrared digitized images processed by dedicated array processors driven by powerful computers [2]. Each method introduces constraints on the experiment conditions (lighting, head fixation, eye discomfort, iris landmarks) as well as on the processing time. We present a video-based computerized method imposing minimal experimental constraints and exhibiting encouraging performance while utilizing reasonable off-the-shelf hardware and software resources.

Proceedings ArticleDOI
01 Aug 1991
TL;DR: This paper concerns software tools developed to design the custom filters used in the initial processing stage of a digital image analysis system; subject terms include image processing, convolution, Laplacian of Gaussian, median, and morphology.
Abstract: Early stages of digital image processing require suitable application of the preprocessing operators. This paper concerns software tools developed to design the custom filters used in the initial processing stage of a digital image analysis system. Subject terms: image processing, convolution, Laplacian of Gaussian, median, morphology.

Proceedings ArticleDOI
01 May 1991
TL;DR: The use of DCT file compression yields acceptable performance for skin lesion images since differences in morphology recognition performance do not correlate significantly with the use of original photos versus compressed versions, and image quality evaluation does not correlate significantly with level of compression.
Abstract: The research reported in this paper concerns an evaluation of the impact of compression on the quality of digitized color dermatologic images. 35 mm slides of four morphologic types of skin lesions were captured at 1000 pixels per inch (ppi) in 24 bit RGB color, to give an approximate 1K X 1K image. The discrete cosine transform (DCT) algorithm was applied to the resulting image files to achieve compression ratios of about 7:1, 28:1, and 70:1. The original scans and the decompressed files were written to a 35 mm film recorder. Together with the original photo slides, the slides resulting from digital images were evaluated in a study of morphology recognition and image quality assessment. A panel of dermatologists was asked to identify the morphology depicted and to rate the image quality of each slide. The images were shown in a progression from highest level of compression to original photo slides. We conclude that the use of DCT file compression yields acceptable performance for skin lesion images since differences in morphology recognition performance do not correlate significantly with the use of original photos versus compressed versions. Additionally, image quality evaluation does not correlate significantly with level of compression.