
Showing papers on "Histogram equalization published in 2006"


Journal ArticleDOI
TL;DR: Experimental results and statistical models of the induced ordering are presented and several applications are discussed: image enhancement, normalization, watermarking, etc.
Abstract: While in the continuous case, statistical models of histogram equalization/specification would yield exact results, their discrete counterparts fail. This is due to the fact that the cumulative distribution functions one deals with are not exactly invertible. Otherwise stated, exact histogram specification for discrete images is an ill-posed problem. Invertible cumulative distribution functions are obtained by translating the problem in a K-dimensional space and further inducing a strict ordering among image pixels. The proposed ordering refines the natural one. Experimental results and statistical models of the induced ordering are presented and several applications are discussed: image enhancement, normalization, watermarking, etc.

339 citations
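The exact-specification idea above can be sketched in a few lines: break ties between equal gray values with an auxiliary feature so that the pixel ordering becomes strict, then hand out gray levels in that order. The sketch below uses a 3x3 local mean as the tie-breaker, a simplified stand-in for the paper's K-dimensional ordering, not the authors' implementation.

```python
import numpy as np

def exact_hist_spec(img, target_hist):
    # Exact histogram specification via a strict pixel ordering: ties in
    # gray value are broken by a 3x3 local-mean auxiliary feature.
    # target_hist must sum to the number of pixels in img.
    h, w = img.shape
    flat = img.astype(float).ravel()
    p = np.pad(img.astype(float), 1, mode='edge')
    local_mean = sum(p[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)).ravel() / 9.0
    order = np.lexsort((local_mean, flat))  # primary key: gray value
    # hand out gray levels so the output histogram is exactly target_hist
    levels = np.repeat(np.arange(len(target_hist)), target_hist)
    out = np.empty(flat.shape, dtype=int)
    out[order] = levels
    return out.reshape(h, w)
```

By construction the output histogram equals the target exactly, which plain discrete histogram specification cannot guarantee.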


Patent
30 Aug 2006
TL;DR: In this article, a system and method for reliable content access using a cellular/wireless device with imaging capabilities was proposed, using part of a printed or displayed medium for identifying and using a reference to access information, services, or content related to the reference.
Abstract: A system and method for reliable content access using a cellular/wireless device with imaging capabilities, to use part of a printed or displayed medium for identifying and using a reference to access information, services, or content related to the reference, including capturing an image of the reference with an imaging device, sending the image via a communications network to a processing center, pre-processing to identify relevant frames within the image and to perform general purpose enhancement operations, detecting the most relevant frame within the image, and frame properties, applying geometric, illumination, and focus correction on the relevant frame, using color, aspect ration, and frame color, to perform a coarse recognition and thereby limit the number of possible identifications of a reference within the relevant frame, and using specific techniques of resealing, histogram equalization, block labeling, edge operation, and normalized cross correlation, to identify the reference within the image.

184 citations


Journal ArticleDOI
TL;DR: This paper will give a detailed review of quantile equalization applied to the Mel scaled filter bank, including considerations about the application in online systems and improvements through a second transformation step that combines neighboring filter channels.
Abstract: The noise robustness of automatic speech recognition systems can be improved by reducing an eventual mismatch between the training and test data distributions during feature extraction. Based on the quantiles of these distributions the parameters of transformation functions can be reliably estimated with small amounts of data. This paper will give a detailed review of quantile equalization applied to the Mel scaled filter bank, including considerations about the application in online systems and improvements through a second transformation step that combines neighboring filter channels. The recognition tests have shown that previous experimental observations on small vocabulary recognition tasks can be confirmed on the larger vocabulary Aurora 4 noisy Wall Street Journal database. The word error rate could be reduced from 45.7% to 25.5% (clean training) and from 19.5% to 17.0% (multicondition training).

129 citations
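The core transform can be illustrated compactly: estimate a small number of quantiles from training and test data and map one onto the other piecewise-linearly. This is a simplified sketch of quantile equalization on a single feature channel (the paper applies such transforms per Mel filter channel).

```python
import numpy as np

def quantile_equalize(test_feats, train_feats, n_quantiles=4):
    # Piecewise-linear transform that maps the test quantiles onto the
    # training quantiles; only a handful of quantiles is needed, which is
    # why the parameters can be estimated reliably from little data.
    qs = np.linspace(0.0, 1.0, n_quantiles + 1)
    q_test = np.quantile(test_feats, qs)
    q_train = np.quantile(train_feats, qs)
    return np.interp(test_feats, q_test, q_train)
```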


Journal ArticleDOI
TL;DR: This paper employs the Fisher criterion and mutual information to measure the discriminability and correlation of spatial histogram features, and trains a hierarchical classifier by combining cascade histogram matching and a support vector machine.

118 citations


Journal ArticleDOI
TL;DR: A simple and effective implementation of the proposed self-adaptive contrast enhancement algorithm based on plateau histogram equalization for infrared images, including its threshold value calculation, is described by using pipeline and parallel computation architecture.

101 citations
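The plateau idea is easy to state in code: clip every histogram bin at a plateau threshold before building the cumulative mapping, so a dominant background gray level cannot swallow the whole output range. A minimal sketch (the paper computes the threshold adaptively; here it is supplied by the caller):

```python
import numpy as np

def plateau_equalize(img, plateau):
    # Plateau histogram equalization for 8-bit images: clip each histogram
    # bin at `plateau`, then equalize with the clipped cumulative histogram.
    hist = np.bincount(img.ravel(), minlength=256)
    clipped = np.minimum(hist, plateau)
    cdf = np.cumsum(clipped).astype(float)
    lut = np.round(cdf / cdf[-1] * 255).astype(np.uint8)
    return lut[img]
```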


Proceedings ArticleDOI
11 Sep 2006
TL;DR: The automatic synthesis of aesthetically pleasing images is investigated and the use of the bell curve model often resulted in images that were harmonious and easy-on-the-eyes, and this approach does increase the likelihood that generated textures are visually interesting.
Abstract: The automatic synthesis of aesthetically pleasing images is investigated. Genetic programming with multi-objective fitness evaluation is used to evolve procedural texture formulae. With multi-objective fitness testing, candidate textures are evaluated according to multiple criteria. Each criterion designates a dimension of a multi-dimensional fitness space. The main feature test uses Ralph's model of aesthetics. This aesthetic model is based on empirical analyses of fine art, in which the analyzed art work exhibits bell curve distributions of color gradients. Subjectively speaking, this bell-curve gradient measurement tends to favor images that have harmonious and balanced visual characteristics. Another feature test is color histogram scoring. This test permits some control of the color composition, by matching a candidate texture's color composition with the color histogram of a target image. This target image may be a digital image of another artwork. We found that the use of the bell curve model often resulted in images that were harmonious and easy on the eyes. Without the use of the model, generated images were often too chaotic or boring. Although our approach does not guarantee aesthetically pleasing results, it does increase the likelihood that generated textures are visually interesting.

90 citations


Journal ArticleDOI
TL;DR: Two new descriptors, color distribution entropy (CDE) and improved CDE (I-CDE), which introduce entropy to describe the spatial information of colors, are presented, and results show that CDE and I-CDE give better performance than SCH and geostat.

89 citations


Journal ArticleDOI
TL;DR: A generalization of POSHE approach exploiting cascaded multistep binomial filtering HE in order to get the same LPF mask is proposed, which makes CMBFHE a suitable solution in all those consumer electronics related environments where on-the-fly local-like contrast enhancement is required.
Abstract: Global and local histogram equalization (HE) proved to be effective techniques for contrast enhancement. Local HE achieves high contrast enhancement rates, but its high computational complexity limits its applicability in many resource-constrained scenarios. On the contrary, global HE complexity is relatively low, but its enhancement rate is often unsatisfactory. Recently, an algorithm exploiting a low-pass filter (LPF) mask to obtain a partially overlapped sub-block HE effect (POSHE) has been presented. POSHE produces the high contrast of local HE with the simplicity of global HE. In this paper a generalization of the POSHE approach, exploiting cascaded multistep binomial filtering HE (CMBFHE) to obtain the same LPF mask, is proposed. By relying on the same filter mask, the contrast enhancement capability of CMBFHE is exactly the same as that of POSHE. Additionally, results show that a significant speedup with respect to the previous fastest method is achieved thanks to the efficient implementation of the filtering approach; moreover, hardware implementations can be effectively designed. This makes CMBFHE a suitable solution in all those consumer-electronics environments, including camcorders, digital cameras, and video surveillance, where on-the-fly local-like contrast enhancement is required.

65 citations
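The "cascaded multistep binomial filtering" part rests on a standard fact: repeatedly convolving the trivial kernel with a two-tap average yields a binomial low-pass kernel. A small sketch of that construction (illustrative only, not the CMBFHE pipeline itself):

```python
import numpy as np

def binomial_kernel(n_taps):
    # Cascade (n_taps - 1) two-tap [0.5, 0.5] averaging filters; the result
    # is a normalized binomial kernel, the kind of LPF mask that can be
    # built multiplication-free in hardware by shifts and adds.
    k = np.array([1.0])
    for _ in range(n_taps - 1):
        k = np.convolve(k, [0.5, 0.5])
    return k
```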


Patent
08 Feb 2006
TL;DR: A method for generating a block-based image histogram from data compressed by JPEG, MPEG-1, and MPEG-2, or from uncompressed image data, employing block-based linear quantization to generate histograms that include color, brightness, and edge components, is presented in this paper.

Abstract: A method for generating a block-based image histogram from data compressed by JPEG, MPEG-1, and MPEG-2, or from uncompressed image data, employing block-based linear quantization to generate histograms that include color, brightness, and edge components. The edge histogram, in particular, includes global edge features, semi-global edge features, and local edge features. The global edge histogram is based on image blocks of the entire image space. The local edge histogram is based on a group of edge blocks. The semi-global edge histogram is based on the horizontally and vertically grouped image blocks. A method for generating a block-based image histogram with color information and brightness information of image data in accordance with an embodiment of the present invention extracts feature information of an image per block and updates global histogram bins on the basis of the feature information. The method minimizes quantization error by employing linear weights when updating the values of histogram bins; the weight applied at a boundary between histogram bins depends on the distance to the histogram bin centers.

43 citations
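The linear-weight update mentioned above can be sketched directly: instead of voting into a single bin, each sample splits its vote between the two nearest bin centers. This is an illustrative reading of the patent's linear-weight idea, not its exact procedure.

```python
import numpy as np

def soft_histogram(values, n_bins, vmax):
    # Linear-weight histogram update: each sample in [0, vmax) splits its
    # vote between the two nearest bin centers, which reduces the
    # quantization error at bin boundaries.
    hist = np.zeros(n_bins)
    for v in np.asarray(values, dtype=float):
        pos = np.clip(v / vmax * n_bins - 0.5, 0, n_bins - 1)
        lo = int(np.floor(pos))
        hi = min(lo + 1, n_bins - 1)
        w = pos - lo
        hist[lo] += 1.0 - w
        hist[hi] += w
    return hist
```

A value sitting exactly on a bin boundary contributes half a vote to each neighbor instead of flipping between bins under small perturbations.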


Proceedings ArticleDOI
Taemin Kim1, Hyun S. Yang1
08 Oct 2006
TL;DR: A novel method to extend the grayscale histogram equalization (GHE) for color images in a multi-dimension that can generate a uniform histogram, thus minimizing the disparity between the histogram and uniform distribution.
Abstract: In this paper, a novel method to extend the grayscale histogram equalization (GHE) for color images in a multi-dimension is proposed. Unlike most current techniques, the proposed method can generate a uniform histogram, thus minimizing the disparity between the histogram and uniform distribution. A histogram of any dimension is regarded as a mixture of isotropic Gaussians. This method is a natural extension of the GHE to a multi-dimension. An efficient algorithm for the histogram equalization is provided. The results show that this approach is valid, and a psycho-visual study on a target distribution will improve the practical use of the proposed method.

37 citations


Journal ArticleDOI
TL;DR: The problem of analyzing polarization-encoded images is addressed and the potential of this information for classification issues is explored and ad hoc color displays are proposed as an aid to the interpretation of physical properties content.
Abstract: In the framework of Stokes-parameter imaging, polarization-encoded images have four channels, which makes the physical interpretation of such multidimensional structures hard to grasp at once. Furthermore, the information content is intricately combined across the parameter channels, which calls for a proper tool for analyzing and understanding this kind of image. In this paper we address the problem of analyzing polarization-encoded images, explore the potential of this information for classification issues, and propose ad hoc color displays as an aid to the interpretation of the physical-property content. The color representation schemes introduced hereafter employ a novel mapping from the Poincaré sphere to color spaces, coupled with a segmentation map as a priori information, in order to best distribute the information in the appropriate color space. The segmentation process relies on the fuzzy C-means family of clustering algorithms, where the distances used were redefined in relation to the specificities of our images. Local histogram equalization is applied to each class in order to bring out the smooth intra-class variations of the information. The proposed methods are applied and validated with Stokes images of biological tissues.

Book ChapterDOI
01 Jan 2006
TL;DR: An automated algorithm for global contrast enhancement of images with multimodal histograms is presented, performed by kernel density estimation, a robust nonparametric statistical method to locate modes and valleys.
Abstract: We present an automated algorithm for global contrast enhancement of images with multimodal histograms. To locate modes and valleys, histogram analysis is performed by kernel density estimation, a robust nonparametric statistical method. Histogram warping by monotonic splines pushes the modes apart, spreading them out more evenly across the dynamic range. This technique can assist in the contrast correction of images taken facing the light source.
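The histogram-analysis step can be sketched with a plain Gaussian kernel density estimate: evaluate the density on a grid and pick local maxima as modes. This shows only the mode-location stage, under an arbitrary bandwidth; the monotonic-spline warping is not shown.

```python
import numpy as np

def kde_modes(samples, grid, bandwidth):
    # Gaussian kernel density estimate evaluated on a grid; interior grid
    # points that exceed both neighbors are reported as modes.
    d = (grid[:, None] - samples[None, :]) / bandwidth
    dens = np.exp(-0.5 * d ** 2).sum(axis=1)
    is_mode = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
    return grid[1:-1][is_mode]
```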

Proceedings ArticleDOI
20 Aug 2006
TL;DR: Experiments using real scenes show the practical usefulness of the proposed method for detecting changes between two images of the same scene taken at different times using their joint intensity histogram.
Abstract: In the present paper, a method for detecting changes between two images of the same scene taken at different times using their joint intensity histogram is proposed. First, the joint histogram, which is a two-dimensional (2D) histogram of combinatorial intensity levels (I1(x), I2(x)), is calculated. By checking the characteristics of the ridges of clusters on the joint histogram, clusters that are expected to correspond to background are selected. The combinations of (I1, I2) covered by these clusters are determined to be insignificant changes. Pixels having a combinatorial intensity (I1(x), I2(x)) different from these combinations are extracted as candidates for significant changes. Based on the gradient correlation between the images for each region consisting of these pixels, only regions with significant changes are distinguished. Experiments using real scenes show the practical usefulness of the method.
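The first stage above can be sketched in a few lines: bin the (I1(x), I2(x)) pairs into a 2D histogram and flag pixels whose pair lands in a sparsely populated cell. The count threshold stands in for the paper's ridge-based cluster selection, which is more elaborate.

```python
import numpy as np

def change_candidates(img1, img2, n_levels, min_count):
    # Joint intensity histogram of (I1(x), I2(x)) pairs: background pixels
    # concentrate in dense cells, so low-count cells mark candidate changes.
    joint = np.zeros((n_levels, n_levels), dtype=int)
    np.add.at(joint, (img1.ravel(), img2.ravel()), 1)
    return joint[img1, img2] < min_count
```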

Proceedings ArticleDOI
14 May 2006
TL;DR: An automatic method is presented to enhance the visibility of dark areas of an image while preserving its natural look with good overall enhancement with local adaptability without excessive complexity.
Abstract: An automatic method is presented to enhance the visibility of dark areas of an image while preserving its natural look. This method consists of three major parts: classification, global adjustment and local adjustment. Firstly, an image is classified into one of several types for different global/local processing or no processing. The method adaptively maps a luminance value into a value based on a piecewise linear mapping curve in order to increase contrast in dark areas. The global and local mapping curves are based on histogram equalization with modifications. The local enhancement is non-overlapped block-based and is applied only when necessary. To avoid blocking artifacts generated by block-based local adjustment, the luminance value of each pixel is adjusted according to an interpolated mapping curve derived from the block mapping curves of nearby blocks. The combination of global and local enhancements achieves good overall enhancement with local adaptability without excessive complexity.
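The anti-blocking step admits a compact 1D sketch: instead of applying one block's mapping curve to every pixel in the block, each pixel blends the curves of its two nearest block centers. This is an illustrative reduction of the interpolation idea to one row, with hypothetical LUT inputs.

```python
import numpy as np

def blend_block_luts(luts, block_w, lum_row):
    # Per-pixel linear interpolation between the mapping curves (LUTs) of
    # the two nearest block centers along one image row.
    first_center = 0.5 * block_w
    out = np.empty(len(lum_row), dtype=float)
    for x, v in enumerate(lum_row):
        # position of x in units of block centers, clamped to the valid span
        t = np.clip((x - first_center) / block_w, 0.0, len(luts) - 1.0)
        i = int(np.floor(t))
        j = min(i + 1, len(luts) - 1)
        a = t - i
        out[x] = (1.0 - a) * luts[i][v] + a * luts[j][v]
    return out
```

At a block center the output equals that block's own curve; between centers it transitions smoothly, so no seam appears at block boundaries.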

Patent
Joonki Paik1
22 Nov 2006
Abstract: An image processing method and system using gain-controllable clipped histogram equalization, in which the image processing method includes the steps of as obtaining a brightness histogram, computing a mean brightness of an image signal, determining a clipping rate based on the mean brightness, determining a clipping threshold based on the clipping rate, obtaining a clipped brightness histogram by clipping frequencies exceeding the clipping threshold in the brightness histogram, obtaining a corrected brightness histogram by correcting the clipped brightness as histogram using the clipping rate as a total gain, obtaining a cumulative histogram from the corrected brightness histogram, and correcting an input image using the cumulative histogram as a transformation function Accordingly, the clipping rate is adaptively controlled so that image contrast is enhanced

Proceedings Article
01 Jan 2006
TL;DR: A way to keep the advantages of histograms while avoiding their inherent drawbacks using local kernel histograms is presented, and this approach is tested for background subtraction using indoor and outdoor sequences.

Abstract: In addition to being invariant to image rotation and translation, histograms have the advantage of being easy to compute. These advantages make histograms very popular in computer vision. However, without data quantization to reduce their size, histograms are generally not suitable for real-time applications. Moreover, they are sensitive to quantization errors and lack any spatial information. This paper presents a way to keep the advantages of histograms while avoiding their inherent drawbacks by using local kernel histograms. This approach is tested for background subtraction using indoor and outdoor sequences.
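One plausible reading of "local kernel histogram" is a patch histogram whose votes are weighted by a spatial kernel, so pixels near the patch center count more. The sketch below illustrates that reading with an arbitrary Gaussian kernel width; it is an assumption, not the paper's exact construction.

```python
import numpy as np

def kernel_histogram(patch, n_bins, vmax):
    # Spatially weighted histogram of one patch: pixels near the patch
    # center vote with larger Gaussian weights, so the descriptor retains
    # some spatial information. Kernel width is an arbitrary choice here.
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    sigma = max(h, w) / 3.0
    weights = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    bins = np.clip((patch / vmax * n_bins).astype(int), 0, n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), weights.ravel())
    return hist / hist.sum()
```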

Proceedings ArticleDOI
01 Jun 2006
TL;DR: It is shown that the contrast enhancement method based on the contourlet transform is superior to both histogram equalization and the method based on the wavelet transform, in both enhancement of fine structures in the image and denoising.

Abstract: We propose a new method for image contrast enhancement, based on a recently introduced 2D directional non-separable transform known as the contourlet transform. The conventional 2D wavelet transform is separable and thus cannot sparsely represent non-separable structures of the image, such as directional curves. The directionality of the contourlet transform makes it a good choice for representing curves and edges in the image. In this paper, a new enhancement function is proposed to enhance the edges by the contourlet transform. The method is also compared with two common contrast enhancement methods, histogram equalization and wavelet-based contrast enhancement. It is shown that the contourlet-based method is superior to both, in both enhancement of fine structures in the image and denoising.

Patent
17 Oct 2006
TL;DR: In this article, a method and apparatus for transforming a first color distribution based on a second color distribution is presented, which includes the steps of determining, for each of the first and second color distributions, a one-dimensional histogram along a direction in a color space.
Abstract: A method and apparatus for transforming a first color distribution based on a second color distribution. The method includes the steps of determining, for each of the first and second color distributions, a one-dimensional histogram along a direction in a color space; matching the one-dimensional histogram determined for the first color distribution and the one-dimensional histogram determined for the second color distribution so as to generate a transform mapping; transforming the first color distribution based on the generated transform mapping; and repeating the determining, matching, and transforming steps for other directions in the color space until the generated transform mapping converges.
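A single iteration of the scheme above can be sketched as: project both color sets onto one direction, match the resulting 1D distributions by sorted-rank assignment, and push the correction back along that direction. The sketch assumes equally sized color sets; repeating over several directions until convergence gives the full transform.

```python
import numpy as np

def transfer_along_direction(src, tgt, direction):
    # One step of the iterative 1D-matching scheme (sketch): project, match
    # the 1D distributions rank-wise, and re-embed the correction.
    # Assumes src and tgt contain the same number of color samples.
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    p_src = src @ d
    p_tgt = tgt @ d
    matched = np.empty_like(p_src)
    matched[np.argsort(p_src)] = np.sort(p_tgt)  # rank-wise 1D matching
    return src + np.outer(matched - p_src, d)
```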

Proceedings ArticleDOI
01 Sep 2006
TL;DR: The results corroborate the hypothesis that the nonlinear memoryless time-interval transform proposed here, despite its simplicity, can be a useful and almost costless building block in keystroke-based biometric systems.

Abstract: The effect of parametric equalization of time-interval histograms (key down-down intervals) on the performance of keystroke-based user verification algorithms is analyzed. Four algorithms are used throughout this analysis: a classic one for static (structured) texts; a second one, also proposed in the literature, for both static and arbitrary (free) text; a new one for arbitrary-text-based verification; and a recently proposed algorithm where keystroke timing is indirectly addressed in order to compare user dynamics. The algorithms' performances are presented before and after time-interval histogram equalization, and the results corroborate the hypothesis that the nonlinear memoryless time-interval transform proposed here, despite its simplicity, can be a useful and almost costless building block in keystroke-based biometric systems.

Proceedings ArticleDOI
22 Nov 2006
TL;DR: Three newly proposed histogram-based methods are compared with three other popular methods, including the conventional histogram intersection method, Wong and Cheung's merged palette histogram matching method, and Gevers' colour ratio gradient (CRG) method, and overall, the CECH method produces the best performance when both speed and classification performance are concerned.

Abstract: Using the colour histogram as a representation stable over changes in view has been widely used for object recognition. In this paper, three newly proposed histogram-based methods are compared with three other popular methods, including the conventional histogram intersection (HI) method, Wong and Cheung's merged palette histogram matching (MPHM) method, and Gevers' colour ratio gradient (CRG) method. These methods are tested on vehicle number plate images for number plate classification. Experimental results disclose that the CRG method is the best choice in terms of speed, and the GWHI method can give the best classification results. Overall, the CECH method produces the best performance when both speed and classification performance are concerned.

Journal ArticleDOI
TL;DR: Modified Segmental Histogram Equalization is proposed to improve the robustness of a speaker verification system operating in telephone environments by transforming the features extracted from short adjacent segments of speech within an utterance such that their statistics conform to that of a Gaussian distribution with zero mean and unity variance across all recording conditions.
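A first-order stand-in for the segmental transform described above is per-segment mean/variance normalization: each short segment is shifted and scaled to zero mean and unit variance. The full method equalizes segment histograms toward a Gaussian; this sketch matches only the first two moments.

```python
import numpy as np

def segmental_normalize(feats, seg_len):
    # Per-segment mean/variance normalization of a 1D feature stream:
    # each segment is mapped to zero mean and (near) unit variance, so the
    # segment statistics agree across recording conditions.
    out = np.empty(len(feats), dtype=float)
    for s in range(0, len(feats), seg_len):
        seg = np.asarray(feats[s:s + seg_len], dtype=float)
        out[s:s + seg_len] = (seg - seg.mean()) / (seg.std() + 1e-8)
    return out
```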

Patent
29 Dec 2006
TL;DR: In this article, a method for contrast enhancement of images based on adaptive histogram equalization is presented, aimed at preventing the fading artifacts and object extension artifacts that adaptive histogram equalization can cause.
Abstract: The present invention relates to a method, apparatus and computer program product for contrast enhancement of images based on adaptive histogram equalization. In particular it relates to preventing adaptive histogram equalization from causing fading artifacts and object extension artifacts. An adaptive histogram equalization method is provided comprising the steps of dividing an image into regions of pixels, determining structures of local pixel value differences of a predefined strength in the image, building for every region a histogram of the pixel values based on the determined structures of local pixel value differences, and mapping pixel values of each region based on the histogram corresponding to the region.

Journal ArticleDOI
TL;DR: A new strategy of watermarking which extends the principle of histogram specification to the color histogram and shows that the detection ability, the invisibility, as well as the robustness to some common image processing operations are improved.

Abstract: In this paper we propose a new watermarking strategy which extends the principle of histogram specification to the color histogram. The proposed scheme embeds into a color image a color watermark from either the xy chromatic plane or the xyY color space. The scheme resists geometric attacks (e.g., rotation, scaling, etc.) and, within some limits, JPEG compression. The scheme uses a secret binary pattern, or combines some patterns generated by a secret key, in order to modify the chromatic distribution of an image. By using the inverse pattern, the watermark is detected without knowing the original image. Examples of images and attacks are given to illustrate the relevance of the proposed approach, i.e., its invisibility and its robustness. In the second part of this paper we investigate the usefulness of our watermarking approach for color image authentication. Several experiments are presented to show that our scheme ensures image authentication, detects tampered regions in case of malicious attacks, and ensures a certain degree of robustness to common image manipulations like JPEG compression. Compared with other blind authentication schemes, the experiments show that the detection ability, the invisibility, as well as the robustness to some common image processing operations are improved.

Proceedings ArticleDOI
20 Aug 2006
TL;DR: A Gaussian weighted histogram intersection (GWHI) algorithm is proposed to facilitate the histogram matching via taking into account matching of both identical and similar colors.
Abstract: The conventional histogram intersection (HI) algorithm computes the intersected section of the corresponding color histograms in order to measure the matching rate between two color images. Since this algorithm is strictly based on the matching between bins of identical colors, the final matching rate can be easily affected by color variation caused by various environment changes. In this paper, a Gaussian weighted histogram intersection (GWHI) algorithm is proposed to facilitate the histogram matching via taking into account matching of both identical and similar colors. The weight is determined by the distance between two colors. The algorithm is applied to license plate classification. Experimental results show that the proposed algorithm produces a much lower intra-class distance and a much higher inter-class distance than previous HI algorithms for tested images which are captured under various illumination conditions.
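The contrast with plain histogram intersection is easy to show in code: every pair of bins contributes min(h1[i], h2[j]) weighted by a Gaussian of the distance between their bin-center colors, so similar, not only identical, colors can match. A sketch on a 1D color axis; normalization choices vary in practice and none is applied here.

```python
import numpy as np

def gwhi(h1, h2, centers, sigma):
    # Gaussian weighted histogram intersection (sketch): cross-bin matches
    # are allowed but down-weighted by color distance. Plain HI is the
    # sigma -> 0 limit, where only identical bins contribute.
    score = 0.0
    for i, ci in enumerate(centers):
        for j, cj in enumerate(centers):
            w = np.exp(-(ci - cj) ** 2 / (2.0 * sigma ** 2))
            score += w * min(h1[i], h2[j])
    return score
```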

Book ChapterDOI
10 Dec 2006
TL;DR: The experimental results indicate that the proposed adaptively modified histogram equalization method not only enhances contrast effectively, but also keeps the tone of the original image.
Abstract: A new contrast enhancement method called adaptively modified histogram equalization (AMHE) is proposed as an extension of typical histogram equalization. To prevent any significant change of gray levels between the original image and the histogram equalized image, the AMHE scales the magnitudes of the probability density function of the original image before equalization. The scale factor is determined adaptively based on the mean brightness of the original image. The experimental results indicate that the proposed method not only enhances contrast effectively, but also keeps the tone of the original image.
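One way to realize "scaling the magnitudes of the probability density function before equalization" is to blend the histogram toward a flat one before building the cumulative mapping. Both that blending form and the externally supplied scale factor are assumptions of this sketch; the paper derives its factor from the image's mean brightness.

```python
import numpy as np

def amhe(img, alpha):
    # Sketch of the AMHE idea for 8-bit images: damp the histogram by
    # blending it toward a flat histogram (alpha=1 is plain HE, alpha=0 is
    # a near-identity mapping), then equalize with the damped histogram.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    flat = np.full(256, hist.sum() / 256.0)
    damped = alpha * hist + (1.0 - alpha) * flat
    cdf = np.cumsum(damped)
    lut = np.round(cdf / cdf[-1] * 255.0).astype(np.uint8)
    return lut[img]
```

Intermediate alpha values temper the gray-level shifts of plain HE, which is what lets the method keep the tone of the original image.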

Proceedings ArticleDOI
07 Jun 2006
TL;DR: This paper presents a novel approach to boundary detection of regions-of-interest (ROI) in ultrasound images, more specifically applied to ultrasound breast images, and compares the performance of the algorithm with two well known methods.
Abstract: This paper presents a novel approach to boundary detection of regions-of-interest (ROI) in ultrasound images, more specifically applied to ultrasound breast images. In the proposed method, histogram equalization is used to preprocess the ultrasound images, followed by a hybrid filtering stage that consists of a combination of a nonlinear diffusion filter and a linear filter. Subsequently, the multifractal dimension is used to analyse the visually distinct areas of the ultrasound image. Finally, using different threshold values, region-growing segmentation is used to partition the image. The partition with the highest Radial Gradient Index (RGI) is selected as the lesion. A total of 200 images have been used in the analysis of the presented results. We compare the performance of our algorithm with two well-known methods proposed by Kupinski et al. and Joo et al. We show that the proposed method performs better in solving the boundary detection problem in ultrasound images.

Journal ArticleDOI
TL;DR: The proposed approach to tone reproduction by histogram equalization of macro edges has been integrated with a robust image processing pipeline, and the experimental results show that the proposed algorithm is very stable for accommodating diverse illuminants and scenes.
Abstract: Tone reproduction attempts to scale or map high dynamic range image data such that the resulting image preserves the visual brightness and gives a better contrast impression of the original scene. In this paper, we propose a systematic approach to tone reproduction by histogram equalization of macro edges. The proposed approach has been integrated with a robust image processing pipeline, and the experimental results show that the proposed algorithm is very stable in accommodating diverse illuminants and scenes.

Proceedings ArticleDOI
01 Jan 2006
TL;DR: The use of a data fitting scheme is explored to efficiently approximate the inverse of the cumulative density function of training speech for HEQ, in contrast to the conventional table-lookup or quantile-based approaches.

Abstract: The performance of current automatic speech recognition (ASR) systems radically deteriorates when the input speech is corrupted by various kinds of noise sources. Quite a few techniques have been proposed to improve ASR robustness in the past several years. Histogram equalization (HEQ) is one of the most efficient techniques used to compensate for nonlinear distortion. In this paper, we explored the use of a data fitting scheme to efficiently approximate the inverse of the cumulative density function of training speech for HEQ, in contrast to the conventional table-lookup or quantile-based approaches. Moreover, a temporal average operation was also performed on the feature vector components to alleviate the influence of sharp peaks and valleys caused by non-stationary noises. Finally, we also investigated the possibility of combining our approaches with other feature discrimination and decorrelation methods. All experiments were carried out on the Aurora-2 database and task. Encouraging results were initially demonstrated. Index Terms: histogram equalization, data fitting, temporal average, robustness
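The data-fitting alternative to table lookup can be sketched as: fit a low-degree polynomial to the empirical inverse CDF of training features, then map each test value to the fitted curve evaluated at its empirical rank. The polynomial degree is an assumed parameter of this sketch.

```python
import numpy as np

def fit_inverse_cdf(train, degree=3):
    # Fit a polynomial to the empirical inverse CDF of training data:
    # sorted values against their plotting-position probabilities.
    s = np.sort(np.asarray(train, dtype=float))
    probs = (np.arange(len(s)) + 0.5) / len(s)
    return np.polyfit(probs, s, degree)

def heq_map(test, coeffs):
    # Replace each test value with the fitted training inverse CDF
    # evaluated at the value's empirical rank within the test data.
    ranks = (np.argsort(np.argsort(test)) + 0.5) / len(test)
    return np.polyval(coeffs, ranks)
```

Storing a handful of polynomial coefficients replaces the lookup table, which is the efficiency argument made in the paper.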


Proceedings ArticleDOI
01 Aug 2006
TL;DR: This paper proposes a new color-space quantization algorithm and a new histogram similarity function to address these shortcomings, and builds a prototype system for image indexing.

Abstract: Color features are important to pictures and are easy to calculate. Therefore, these features are widely used in content-based image retrieval (CBIR). A color histogram describes the frequency of colors in an image. It does not change with variations in a picture's geometry, so it is a widely used feature for image indexing, although it has some shortcomings. In this paper, we propose a new color-space quantization algorithm and a new histogram similarity function to address these shortcomings, and build a prototype system for image indexing. The experiments show that the new color image retrieval algorithm is effective and robust.