scispace - formally typeset

Showing papers on "Histogram equalization published in 1995"


Proceedings ArticleDOI
TL;DR: Two new color indexing techniques are described, one of which is a more robust version of the commonly used color histogram indexing and the other which is an example of a new approach tocolor indexing that contains only their dominant features.
Abstract: We describe two new color indexing techniques. The first is a more robust version of the commonly used color histogram indexing. In the index we store the cumulative color histograms. The L1-, L2-, or L∞-distance between two cumulative color histograms can be used to define a similarity measure of these two color distributions. We show that this method produces slightly better results than color histogram methods and is significantly more robust with respect to the quantization parameter of the histograms. The second technique is an example of a new approach to color indexing. Instead of storing the complete color distributions, the index contains only their dominant features. We implement this approach by storing the first three moments of each color channel of an image in the index, i.e., for an HSV image we store only 9 floating-point numbers per image. The similarity function used for retrieval is a weighted sum of the absolute differences between corresponding moments. Our tests clearly demonstrate that retrieval based on this technique produces better results and runs faster than the histogram-based methods.
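The moment-based index described above is simple to reproduce. Below is a minimal sketch (not the authors' code): it stores the first three moments of each channel and compares images by a weighted sum of absolute moment differences. Taking the cube root of the third central moment, so that all three features share the channel's units, is an assumption of this sketch.

```python
import numpy as np

def color_moments(image):
    """First three moments (mean, std dev, cube root of third central
    moment) per channel -- 9 numbers for a 3-channel image."""
    feats = []
    for c in range(image.shape[-1]):
        chan = image[..., c].astype(float).ravel()
        mu = chan.mean()
        sigma = chan.std()
        skew = np.cbrt(((chan - mu) ** 3).mean())
        feats.extend([mu, sigma, skew])
    return np.array(feats)

def moment_distance(f1, f2, weights=None):
    """Weighted sum of absolute differences between moment vectors."""
    if weights is None:
        weights = np.ones_like(f1)
    return float(np.sum(weights * np.abs(f1 - f2)))
```

An image database then stores only one 9-vector per image, and retrieval ranks images by `moment_distance` to the query's vector.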

1,952 citations


Journal ArticleDOI
James Lee Hafner1, Harpreet Sawhney1, W. Equitz1, Myron D. Flickner1, W. Niblack1 
TL;DR: In this paper, the authors proposed the use of low-dimensional, simple to compute distance measures between the color distributions, and showed that these are lower bounds on the histogram distance measure.
Abstract: In image retrieval based on color, the weighted distance between color histograms of two images, represented as a quadratic form, may be defined as a match measure. However, this distance measure is computationally expensive and operates on high-dimensional features (O(N)). We propose the use of low-dimensional, simple-to-compute distance measures between the color distributions, and show that these are lower bounds on the histogram distance measure. Results on color histogram matching in large image databases show that prefiltering with the simpler distance measures leads to significantly lower time complexity because the quadratic histogram distance is then computed on a smaller set of images. The low-dimensional distance measure can also be used for indexing into the database.
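The two-stage idea can be sketched as follows. This is not the authors' implementation: here the cheap filter is simply the Euclidean distance between low-dimensional feature vectors (e.g., average colors), used to prune candidates before the expensive quadratic-form distance is evaluated. The paper's lower-bound guarantee requires a specific relationship between the cheap measure and the matrix A, which this sketch does not enforce.

```python
import numpy as np

def quadratic_distance(h1, h2, A):
    """Full quadratic-form histogram distance d = (h1-h2)^T A (h1-h2)."""
    d = h1 - h2
    return float(d @ A @ d)

def prefilter_search(query_hist, db_hists, A, query_feat, db_feats, threshold):
    """Two-stage search: prune with a cheap low-dimensional distance,
    then rank the survivors with the expensive quadratic distance."""
    survivors = [i for i, f in enumerate(db_feats)
                 if np.linalg.norm(query_feat - f) <= threshold]
    scored = [(i, quadratic_distance(query_hist, db_hists[i], A))
              for i in survivors]
    return sorted(scored, key=lambda t: t[1])
```

The payoff is that the O(N^2) quadratic form is evaluated only on the small survivor set, while the prefilter touches every database entry at low-dimensional cost.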

822 citations


Journal ArticleDOI
TL;DR: In this paper, two color matching methods, the Reference Color Table Method (CTM) and the Distance Method (DM), were proposed for image retrieval. The results show that both new methods perform better than the existing method, and that the reference color table method gives the best results.

195 citations


Journal ArticleDOI
TL;DR: A normalized complement transform has been proposed to simplify the analysis and the implementation of the LIP model-based algorithms and this new implementation has been compared with histogram equalization and Lee's original algorithm.
Abstract: Describes a new implementation of Lee's (1980) image enhancement algorithm. This approach, based on the logarithmic image processing (LIP) model, can simultaneously enhance the overall contrast and the sharpness of an image. A normalized complement transform has been proposed to simplify the analysis and the implementation of the LIP model-based algorithms. This new implementation has been compared with histogram equalization and Lee's original algorithm.

129 citations


Patent
24 Nov 1995
TL;DR: In this paper, a system and method for diagnosis of living tissue diseases is described, which includes a computer device for controlling its operation, coupled with a viewing screen for displaying digitized images of the living tissue.
Abstract: A system and method for diagnosis of living tissue diseases is described. The system includes a computer device for controlling its operation. An operator control device is coupled to the computer device. A viewing screen is coupled to the computer device for displaying digitized images of the living tissue. The operator, using the control device, selects desired portions of the digitized image for further image enhancement according to a desired image enhancement feature selectable from a plurality of image enhancement features. The image enhancement features include any combination of grey scale stretching, contrast enhancement based on logarithmic histogram equalization, spot enhancement and magnification. The system further includes means for visualization and quantification of micro-calcifications, and means for visualization and quantification of mass spiculations.

108 citations


Patent
30 Oct 1995
TL;DR: In this article, the histogram is divided into clusters using a pattern matching technique and then histogram equalization or stretching is performed on each cluster to produce a modified histogram.
Abstract: A method of operating a computer to produce contrast-enhanced digital images commences with the step of producing a histogram having a first axis corresponding to a measurable property (e.g., luminance) and a second axis corresponding to a count of pixels having a particular value of the measurable property. This histogram is divided into clusters, and histogram equalization or stretching is performed on each cluster, thereby producing a modified histogram. Said modified histogram is then used to adjust the values of said measurable property in said digital form, thereby producing a contrast-enhanced image. The histogram is divided into clusters using a pattern matching technique. For example, patterns in the histogram that resemble Gaussian distributions and patterns that resemble uniform distributions are separated into individual clusters.
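A minimal sketch of the per-cluster equalization step (assuming the cluster boundaries have already been found by some pattern-matching procedure, which this sketch takes as given) might look like this:

```python
import numpy as np

def equalize_clusters(values, boundaries, levels=256):
    """Histogram-equalize each cluster [lo, hi) of gray levels
    independently, remapping its pixels within the cluster's own range
    so clusters do not bleed into one another."""
    out = values.copy().astype(float)
    edges = [0] + list(boundaries) + [levels]
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (values >= lo) & (values < hi)
        if not mask.any():
            continue
        hist, _ = np.histogram(values[mask], bins=hi - lo, range=(lo, hi))
        cdf = hist.cumsum() / hist.sum()
        # map each pixel through the cluster-local CDF
        out[mask] = lo + cdf[values[mask] - lo] * (hi - lo - 1)
    return out.astype(values.dtype)
```

Because each cluster is stretched only over its own range, a pixel's cluster membership is preserved, which is the point of clustering the histogram before equalizing.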

83 citations


Patent
23 Jan 1995
TL;DR: In this article, a method and apparatus are presented for the analysis and correction of the image gradation of an image original to be reproduced, by evaluating image values acquired by point-by-point and line-by-line optoelectronic scanning with an input device in apparatus and systems for image processing.
Abstract: A method and apparatus for the analysis and correction of the image gradation of an image original to be reproduced by evaluating image values acquired by point-by-point and line-by-line, optoelectronic scanning with an input device in apparatus and systems for image processing. The image original is geometrically subdivided into a plurality of sub-images. The frequency distribution of the image values or, respectively, of the luminance components of the color values in a corresponding sub-image is separately identified as a sub-image histogram. The sub-image histograms of the individual sub-images are evaluated and the sub-images relevant for the image gradation are identified by means of the evaluation. An aggregate histogram that corresponds to the frequency distribution of the image values or, respectively, of the luminance component of the color values in the relevant sub-images is calculated from the sub-image histograms of the relevant sub-images. Correction values for the correction of the image gradation characteristic of the image original are subsequently calculated from the aggregate histogram according to the method of histogram modification.

59 citations


Patent
William A. Fuss1, Reiner Eschbach1
05 Jun 1995
TL;DR: In this article, a histogram of an image is derived from a selected subset of local histograms representing regions of the image, which are used for controlling the TRC mapping in a device at which the image is to be printed.
Abstract: A method of improving the contrast in a natural scene image. A relevant histogram of the image is derived from a selected subset of local histograms representing regions of the image. The signal describing the histogram is operated on with a filter having the characteristic of weakening strong peaks and valleys in the function, but not affecting flat portions of the signal. The filtered histogram signal is used for controlling the TRC mapping in a device at which the image is to be printed. To assure optimum selection of local histograms, regions including the black point and white point of an image are determined and added to the subset of local histograms representing regions of the image.

59 citations


Journal ArticleDOI
TL;DR: This work proposes a method for designing an optimized universal color palette for use with halftoning methods such as error diffusion, and employs a new vector quantization method known as sequential scalar quantization (SSQ) to allocate the colors in a visually uniform color space.
Abstract: Currently, many low-cost computers can only simultaneously display a palette of 256 colors. However, this palette is usually selectable from a very large gamut of available colors. For many applications, this limited palette size imposes a significant constraint on the achievable image quality. We propose a method for designing an optimized universal color palette for use with halftoning methods such as error diffusion. The advantage of a universal color palette is that it is fixed and therefore allows multiple images to be displayed simultaneously. To design the palette, we employ a new vector quantization method known as sequential scalar quantization (SSQ) to allocate the colors in a visually uniform color space. The SSQ method achieves near-optimal allocation, but may be efficiently implemented using a series of lookup tables. When used with error diffusion, SSQ adds little computational overhead and may be used to minimize the visual error in an opponent color coordinate system. We compare the performance of the optimized algorithm to standard error diffusion by evaluating a visually weighted mean-squared-error measure. Our metric is based on the color difference in CIE L*a*b*, but also accounts for the lowpass characteristic of human contrast sensitivity.

54 citations


Journal ArticleDOI
TL;DR: A new multivariate enhancement technique the authors have named "histogram explosion" is able to exploit nearly the full RGB extent without clipping, and can preserve original hue values when parameters are chosen properly.
Abstract: Multispectral and true-color images are often enhanced using histogram-based methods, usually by adjustment of color components after transformation to a selected secondary color system. Enhancement aimed toward the preservation of certain important perceptual qualities generally calls for the secondary coordinate system to be perceptually based. However, independent modification of the secondary components seldom uses the full extent of the RGB gamut unless some color values are clipped at the RGB boundaries. Preserving perceptual attributes is sometimes less important than obtaining the greatest possible color contrast improvement. This is especially true for color composites derived from multispectral images, which have no significant basis in human perception. A new multivariate enhancement technique the authors have named "histogram explosion" is able to exploit nearly the full RGB extent without clipping. While not generally based upon a perceptual model, the method can preserve original hue values when parameters are chosen properly. Experimental results of histogram explosion are presented, along with an analysis of its computational complexity.

53 citations


Proceedings ArticleDOI
22 Oct 1995
TL;DR: A theoretical analysis of luminance modification in commonly used coordinate systems, such as HSI, LHS, and YIQ, and results using histogram equalization support the theoretical analysis.
Abstract: In many applications of color image processing, only modification of the luminance component is desired. However, the commonly used coordinate systems, such as HSI, LHS, and YIQ, are not perceptually orthogonal; that is, luminance modification can cause perceptual shifts in the hue and saturation. In this paper, the authors present a theoretical analysis of this phenomenon. Efficient techniques are developed for bypassing the costly coordinate transformations when only the luminance or only the saturation is to be modified. Experimental results using histogram equalization support the theoretical analysis.
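One common way to modify only the luminance without a full forward/inverse color-space round trip is to scale each pixel's RGB triple by the ratio of new to old luminance; whether this matches the authors' exact technique is an assumption, but it illustrates the idea of bypassing the costly coordinate transformation:

```python
import numpy as np

def equalize_luminance(rgb):
    """Equalize only the luminance of an RGB image by scaling each
    pixel's RGB triple by Y'/Y, avoiding an explicit transform to and
    from a luminance-chrominance space. Channel ratios (and hence hue)
    are preserved up to gamut clipping."""
    rgb = rgb.astype(float)
    # NTSC luminance weights
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    hist, _ = np.histogram(y, bins=256, range=(0, 256))
    cdf = hist.cumsum() / hist.sum()
    y_eq = cdf[np.clip(y.astype(int), 0, 255)] * 255.0
    scale = np.where(y > 0, y_eq / np.maximum(y, 1e-6), 0.0)
    out = rgb * scale[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```

Note the clipping step: as the abstract observes, naive luminance modification can push colors out of gamut, and clipping is where perceptual shifts in hue and saturation can creep in.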

Proceedings ArticleDOI
TL;DR: A hierarchical indexing scheme is proposed in which computationally efficient features are used to subset the image database before more sophisticated techniques are applied for precise retrieval.
Abstract: We present two new approaches based on color histogram indexing for content-based retrieval of image databases. Since high computational complexity has been one of the main barriers to the use of similarity measures such as histogram intersection in large databases, we propose a hierarchical indexing scheme in which computationally efficient features are used to subset the image database before more sophisticated techniques are applied for precise retrieval. The use of histograms at different color resolutions as filtering and matching features in a hierarchical scheme is studied. In the second approach, a multiresolution representation of the histogram using the indices and signs of its largest wavelet coefficients is examined. Excellent results have been observed using the latter method.

Proceedings ArticleDOI
Moon-Soo Chang1, Sun-Mee Kang1, Woo-Sik Rho1, Heok-Gu Kim1, Duck Jin Kim1 
14 Aug 1995
TL;DR: An integrated binarization scheme is developed, exploiting synergistic use of an adaptive thresholding technique and variable histogram equalization to counter the stroke connectivity problems of characters arising from mid-level-quality binary image scanning systems.
Abstract: A binarization method is presented to counter the stroke connectivity problems of characters arising from mid-level-quality binary image scanning systems. In the output of a binary image scanning system, separate strokes may look connected if the point size is small and the character strokes are complex while strokes may lose connectivity if they are generated at low intensity. Also, erroneous recognition may result if a blemished document surface distorts the image. To counter these problems and to further enhance the quality of character recognition, the authors have developed an integrated binarization scheme, exploiting synergistic use of an adaptive thresholding technique and variable histogram equalization. This algorithm is composed of two components. The first removes background noise via gray level histogram equalization while the second enhances the gray level of characters over and above the surrounding background via an edge image composition technique.

Patent
03 May 1995
TL;DR: Dynamic histogram warping is performed on histograms extracted from an image pair of a scene as mentioned in this paper. And the warped histograms are remapped to the image pair and the resulting remapped image pair is subsequently subjected to image processing.
Abstract: Dynamic histogram warping is performed on histograms extracted from an image pair of a scene. The warped histograms are remapped to the image pair and the resulting remapped image pair is subsequently subjected to image processing.
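The core of a histogram-warping approach is a dynamic-programming alignment between two histograms. The sketch below is a generic DTW-style alignment, not the patented algorithm: it recovers a monotonic bin-to-bin correspondence that could then drive a remapping of one image's gray levels onto the other's.

```python
import numpy as np

def histogram_warp_path(h1, h2):
    """Dynamic-programming alignment of two histograms: finds a
    monotonic bin correspondence minimizing the cumulative mismatch
    of bin counts (a DTW-style warp)."""
    n, m = len(h1), len(h2)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(h1[i - 1] - h2[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # backtrack to recover the warp path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]
```

Each pair (i, j) in the path says bin i of the first histogram corresponds to bin j of the second; remapping the image pair through this correspondence is the "warping" step the abstract refers to.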

Patent
Jin H. Kim1, Dong-Chyuan Liu1
29 Sep 1995
TL;DR: In this article, a color map look-up table is defined from the processed gray-scale image data to define a color palette of K colors (e.g., K=4-16). Histogram analysis, nonlinear mapping, or segmentation processing is used to map the gray-scale levels to the color palette.
Abstract: Ultrasound gray-scale image data and corresponding processed gray-scale image data are color-enhanced to generate an enhanced multi-color image. At one step, a color map look-up table is defined from the processed gray-scale image data to define a color palette of K colors (e.g., K=4-16). Histogram analysis, nonlinear mapping, or segmentation processing is used to map the gray-scale levels to the color palette. The processed gray-scale image data are then transposed into a transposed color image data set. Data interpolation is then performed in which each pixel of the original gray-scale image data is interpolated with the corresponding pixel in the transposed color image data set. Interpolation weighting is adjustable by the user to determine how much detail from the original gray-scale image is preserved.

Proceedings ArticleDOI
14 Aug 1995
TL;DR: A circular-histogram thresholding method for color image segmentation is proposed, in which a circular hue histogram built from a UCS (I,H,S) color space is recursively thresholded according to the maximum-variance principle.
Abstract: A circular histogram thresholding for color image segmentation is proposed. A circular hue histogram is first constructed based on a UCS (I,H,S) color space. The histogram is automatically smoothed by a scale-space filter, then transformed into traditional histogram form, and finally recursively thresholded based on the maximum principle of variance. Three comparisons of performance are reported: (i) the proposed thresholding on the circular histogram with that on a traditional histogram; (ii) the proposed thresholding with clustering; and (iii) thresholding based on a UCS hue attribute with that based on a non-UCS hue attribute. Benefits of the proposed approach are confirmed in experiments.
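The "maximum principle of variance" invoked above is commonly realized as Otsu's between-class-variance criterion. Below is a sketch of that criterion, under the assumption that the circular hue histogram has already been cut at a valley and unrolled into an ordinary 1-D histogram as the abstract describes:

```python
import numpy as np

def otsu_threshold(hist):
    """Maximum between-class-variance (Otsu) threshold on a 1-D
    histogram. Returns the bin index ending class 0."""
    hist = np.asarray(hist, dtype=float)
    p = hist / hist.sum()
    bins = np.arange(len(hist))
    w0 = np.cumsum(p)            # class-0 probability up to each bin
    mu = np.cumsum(p * bins)     # cumulative mean
    mu_t = mu[-1]                # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    sigma_b = np.nan_to_num(sigma_b)  # empty classes contribute nothing
    return int(np.argmax(sigma_b))
```

Recursive thresholding, as in the paper, would apply this to each resulting sub-histogram until some stopping criterion is met.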

Journal ArticleDOI
TL;DR: The computationally efficient adaptive neighborhood extended contrast enhancement procedure is proposed, which achieves significant computational speedup without much loss of image quality and can be well applied to other contrast enhancement algorithms for improvement of the quality of the enhanced image.

Journal ArticleDOI
TL;DR: An adaptive neighborhood-clustering algorithm, which searches the neighboring color indices of input pixels iteratively, is proposed to further improve the performance of the dependent scalar quantization algorithm.
Abstract: Many image display devices allow only a limited number of colors, called a color palette, to be displayed simultaneously. In order to have a faithful color reproduction of an image, the associated color palette must be suitably designed. This paper presents a dependent scalar quantization algorithm to design the color palette effectively. The dependent scalar quantization algorithm consists of two procedures, bit allocation and recursive binary moment-preserving thresholding. The experimental results show that dependent scalar quantization reduces the computational complexity while the quality of its output images remains acceptable to the human eye. A rule for the quantization order is also deduced under the MSE criterion to obtain a dependent scalar quantizer whose performance is as good as that of several other algorithms. In addition, an adaptive neighborhood-clustering algorithm, which searches the neighboring color indices of input pixels iteratively, is proposed to further improve the performance of the dependent scalar quantization algorithm. Finally, we introduce a color mapping method to reduce the contouring effect when the color palette size generated by the dependent scalar quantizer is small.

Journal ArticleDOI
TL;DR: The 3-D histogram visualization offers a clear and intuitive representation of the color distribution of an image and is applied to derive a clusterization technique for color classification and visualization, to display comparatively the gamut of different color devices, and to detect the misalignment of the RGB planes of a color image.
Abstract: A visualization procedure for the 3-D histogram of color images is presented. The procedure assumes that the histogram is available as a table that associates to each pixel color the number of its appearances in the image. The procedure runs for the RGB, YMC, HSV, HSL, L*a*b*, and L*u*v* color spaces and is easily extendable to other color spaces if the analytical form of the color transformations is available. Each histogram value is represented in the color space as a colored ball, in a position corresponding to the place of the color in the space. A simple drawing procedure is used instead of more complicated 3-D rendering techniques. The 3-D histogram visualization offers a clear and intuitive representation of the color distribution of an image. The procedure is applied to derive a clusterization technique for color classification and to visualize its results, to display comparatively the gamut of different color devices, and to detect the misalignment of the RGB planes of a color image. Diagrams illustrating the visualization procedure are presented for each application.

Journal ArticleDOI
TL;DR: Experimental results with histogram equalization demonstrate that the use of a higher resolution histogram leads to reduced distortion as well as a "flatter" output histogram.
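The effect of histogram resolution on equalization is easy to demonstrate: with B bins over L gray levels, the equalization mapping can take at most B distinct output values, so a coarse histogram quantizes the transfer function and adds distortion. The sketch below is an assumed illustration of that point, not code from the paper:

```python
import numpy as np

def equalize(values, bins, levels=256):
    """Histogram equalization driven by a histogram with 'bins' bins:
    the mapping can take at most 'bins' distinct output values, so a
    coarser histogram quantizes the transfer curve."""
    hist, _ = np.histogram(values, bins=bins, range=(0, levels))
    cdf = hist.cumsum() / hist.sum()
    # bin index of each input gray level
    idx = np.minimum(values * bins // levels, bins - 1)
    return np.round(cdf[idx] * (levels - 1)).astype(int)
```

Comparing the output at, say, 16 versus 256 bins shows the coarse version collapsing many input levels onto few outputs, which is the distortion the TL;DR refers to.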

Patent
Naoki Manabe1
27 Jan 1995
TL;DR: An image reading apparatus forms a histogram of image density levels of image signals obtained by reading an image, and processes the image signals in accordance with the histogram as mentioned in this paper, which indicates the improperness.
Abstract: An image reading apparatus forms a histogram of image density levels of image signals obtained by reading an image, and processes the image signals in accordance with the histogram. When the histogram is improper for image processing, the apparatus indicates the improperness.

Journal ArticleDOI
TL;DR: The proposed palette restoration algorithm is based on stochastic regularization using a non-Gaussian Markov random field model for the image data and results in a constrained optimization algorithm which is solved using an iterative constrained gradient descent computational algorithm.

Proceedings ArticleDOI
23 Oct 1995
TL;DR: An application of fuzzy set theory to color image analysis is presented, developed to infer the weather state from a color image.
Abstract: An application of fuzzy set theory to color image analysis is presented in this paper. It is developed to infer the weather state from a color image. For a color image, the RGB components are first converted into HSI components. For each component, a 3-layer quadtree is created for image analysis. Then the histogram of each block image from the top layer to the bottom layer is computed. Each histogram is further converted into a membership function using the scheme of a 6-parameter fuzzy number. Fuzzy inference for the given image is thus achieved using the generated membership functions.

Journal ArticleDOI
TL;DR: In the present discussion a system is proposed for the equivalent processing of general resolution elements (resels), instead of the homogeneous picture elements (pixels) which are found in image processing systems.
Abstract: There now exist several microcomputer processing systems which incorporate algorithms and display techniques appropriate for the class of objects known as images. Zooming, histogram equalization, contrast stretching, ratioing, edge enhancement, and filtering are common options used in these systems. In the present discussion a system is proposed for the equivalent processing of general resolution elements (resels), instead of the homogeneous picture elements (pixels) which are found in image processing systems. The resel processing system requires a new, generalized repertoire of processing algorithms and a high-resolution display. A summary is provided of the facilities and procedures required, a computational metaphor using computer spreadsheets is described, and the applicability to censels (census data elements) and to medical data is suggested.

Proceedings ArticleDOI
20 Jun 1995
TL;DR: A new method for preprocessing and eventually fusing a set of multispectral images using histogram equalization, which is found to be ideally suited for this exercise.
Abstract: Texture detection using a multispectral approach is naturally superior to a unispectral one because the multispectral process takes more information into account. Details not obvious in one image may be more prominent in others, hence improving the chances of recognition and detection. In this paper we present a new method for preprocessing and eventually fusing a set of multispectral images. Images are preprocessed using histogram equalization, which is found to be ideally suited for this exercise. A wavelet transform technique is used to fuse data from the different multispectral images.

Patent
26 Jul 1995
TL;DR: In this article, an analog luminance signal is converted into a digital one by an A/D converter and a histogram generator and a coefficient memory is used to detect a uniform area.
Abstract: PROBLEM TO BE SOLVED: To correctly eliminate noise by correctly detecting a uniform area. SOLUTION: An analog luminance signal Ya is converted into a digital luminance signal Yd by an A/D converter 10 and supplied to a differentiator 11, a histogram generator 12 and a coefficient memory 15. The absolute value (absolute differentiated value) adv of the second-order differentiated value of the digital luminance signal Yd is calculated by the differentiator 11 and supplied to the histogram generator 12. The histogram generator 12 counts the luminance levels of the digital luminance signals Yd (i, j) whose absolute differentiated value is smaller than a prescribed threshold value, and generates a histogram H of the luminance levels for one picture. A histogram smoothing unit 13 smooths the histogram H and generates a smoothed histogram H'. Based on the smoothed histogram H', a coefficient calculator 14 calculates uniform-area coefficients and supplies a coefficient table C to the coefficient memory 15. The coefficient memory 15 associates a uniform-area coefficient with each picture element position in one picture.

Proceedings ArticleDOI
22 Aug 1995
TL;DR: 3D reconstructions and 3D calculations for medical images are presented, performed with a medical image processing system (NAI200) developed by the authors that also handles the preprocessing the reconstructions and calculations require.
Abstract: Digital image processing in medical applications is an important area. In this paper we present 3D reconstructions and 3D calculations for medical images. We have developed a medical image processing system (NAI200) that handles the preprocessing required before 3D reconstruction or calculation; for example, zoom, negative, exponential, logarithmic, histogram equalization, scaling, filtering, and other transforms are included. The 3D reconstructions and calculations require the outlines of organs or disease foci, and we introduce several ways of drawing these outlines automatically. Different organs call for different methods; entropy is an important one, but in the great majority of cases manual operation may be the simplest and most precise. Stereo images of organs and foci are obtained not only from slice images but also by other means, which we describe in this paper. For 3D calculation, we first compute the areas of the slice images and then use these areas to calculate the volumes of organs or foci; how to calculate a patient's ventricular volumes is also explained. All of this is achieved with our NAI200 medical image processing system, whose other functions are also mentioned in this paper.

Journal ArticleDOI
TL;DR: It is shown that the histogram of all the post-filtering images is heavily dependent on the statistics of the detected photopulses, and when this dependence is suppressed, the image quality is improved.
Abstract: Low-pass filtering is a well-known technique used to smooth the noise in digital images. It has the advantage of simplicity, but it produces poor results when applied to photon-limited images. In this paper we show that the histogram of the post-filtering image depends heavily on the statistics of the detected photopulses; hence, when this dependence is suppressed, the image quality is improved. The solution we propose is a histogram specification based on a fitting procedure, which provides the recovered image with a histogram similar to that of the original image. The technique has been applied to very low light level simulated images (about 500 photopulses in a 359*175 pixel frame) and fairly good results have been obtained.
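Histogram specification itself (without the paper's fitting procedure, which estimates the target histogram) can be sketched as matching cumulative distribution functions:

```python
import numpy as np

def histogram_specification(source, reference, levels=256):
    """Remap 'source' gray levels so its histogram approximates that
    of 'reference': for each source level, pick the reference level
    whose CDF value is closest from above (matched CDFs)."""
    s_hist, _ = np.histogram(source, bins=levels, range=(0, levels))
    r_hist, _ = np.histogram(reference, bins=levels, range=(0, levels))
    s_cdf = s_hist.cumsum() / s_hist.sum()
    r_cdf = r_hist.cumsum() / r_hist.sum()
    # for each source level, find the reference level matching its CDF
    mapping = np.searchsorted(r_cdf, s_cdf)
    mapping = np.clip(mapping, 0, levels - 1)
    return mapping[source]
```

In the paper's setting, `reference` would be replaced by the fitted target histogram that removes the photopulse-statistics dependence; here a reference image stands in for it.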

Journal ArticleDOI
TL;DR: This package was implemented in Visual Basic, which is a visual programming language to create applications in a Windows environment and includes manipulation in the space and frequency domains, such as 2‐D convolution, gray‐level transformations, linear filtering, histogram equalization, and morphologic filtering.
Abstract: We present the educational software package ProDIM, created with the idea of supporting courses on digital image processing techniques. This package was implemented in Visual Basic, which is a visual programming language to create applications in a Windows environment. Operations available in ProDIM include manipulation in the space and frequency domains, such as 2-D convolution, gray-level transformations, linear filtering, histogram equalization, and morphologic filtering. Examples show how the package can be useful as a lecture aid and as a lab assistance tool. The image to be processed can be generated using the editor included, or can be imported using images in a BMP format. A graphical interface allows the user to activate available operations through a menu selection.