Topic

Histogram equalization

About: Histogram equalization is an image-processing technique that redistributes pixel intensities so that the output histogram is approximately uniform, thereby enhancing global contrast. Over the lifetime of this research topic, 5755 publications have been published, receiving 89313 citations.
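Since the papers listed below all build on the same core operation, a minimal sketch of classical histogram equalization for an 8-bit grayscale image may be useful (NumPy-based; function name and test image are illustrative):

```python
import numpy as np

def equalize_histogram(img):
    """Classical histogram equalization for an 8-bit grayscale image.

    Each intensity is remapped through the image's normalized cumulative
    distribution function (CDF), spreading the occupied intensity range
    over the full [0, 255] interval.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf_min = cdf[cdf > 0][0]              # first nonzero CDF value
    denom = cdf[-1] - cdf_min
    if denom == 0:                         # constant image: nothing to equalize
        return img.copy()
    lut = np.clip(np.round((cdf - cdf_min) / denom * 255), 0, 255).astype(np.uint8)
    return lut[img]

# Illustrative example: a low-contrast ramp confined to [100, 120]
img = np.repeat(np.arange(100, 121, dtype=np.uint8), 16).reshape(21, 16)
out = equalize_histogram(img)              # output spans the full 0-255 range
```

After remapping, the previously narrow intensity range occupies the whole 8-bit scale, which is the contrast-enhancement effect the papers below build on or defend against.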


Papers
Journal ArticleDOI
TL;DR: The proposed method determines a threshold in the gradient histogram through rigorous analysis, which yields consistently strong edge-detection performance.

68 citations

Journal ArticleDOI
TL;DR: The hybrid method is robust to attacks on the hidden watermark at acceptable watermarked-image quality, making it suitable for preventing patient identity theft, alteration, or modification and for secure dissemination of medical documents over open channels.
Abstract: This paper presents a robust and secure watermarking method for medical images based on region-of-interest and non-region-of-interest embedding. The proposed method applies a combination of the discrete wavelet transform and the discrete cosine transform to the cover medical image to embed an image watermark and an electronic patient record (EPR) watermark simultaneously. Embedding multiple watermarks at the same time provides an extra level of security and supports patient identity verification. Further, the security of the image and EPR watermarks is enhanced by applying the MD5 message-digest hash algorithm and Rivest-Shamir-Adleman (RSA) encryption, respectively, before embedding into the medical cover image. In addition, a Hamming error-correction code is applied to the encrypted EPR watermark to improve robustness and reduce bit-error rates, which could otherwise lead to wrong diagnoses in medical environments. The robustness of the method is extensively examined against known attacks such as salt-and-pepper noise, Gaussian noise, speckle noise, JPEG compression, filtering, and histogram equalization. The method is found to be robust for the hidden watermark at acceptable watermarked-image quality. Therefore, the hybrid method is suitable for preventing patient identity theft, alteration, or modification and for secure dissemination of medical documents over open channels.

68 citations
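The Hamming coding step in the abstract above can be made concrete. The summary does not state the code parameters, so the classic Hamming(7,4) code is assumed here purely for illustration; it corrects any single flipped bit in each 7-bit block of the encrypted EPR bitstream:

```python
import numpy as np

# Hamming(7,4) generator and parity-check matrices in systematic form.
# Assumed parameters: the paper only says "Hamming error correction code".
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def hamming_encode(data4):
    """Encode 4 data bits into a 7-bit codeword."""
    return data4 @ G % 2

def hamming_decode(recv7):
    """Correct up to one flipped bit, then return the 4 data bits."""
    syndrome = H @ recv7 % 2
    if syndrome.any():
        # A nonzero syndrome equals the column of H at the error position.
        pos = int(np.where((H.T == syndrome).all(axis=1))[0][0])
        recv7 = recv7.copy()
        recv7[pos] ^= 1
    return recv7[:4]
```

In a scheme like the one described, the watermark bitstream would be split into 4-bit blocks and each block encoded before embedding; single-bit errors introduced by attacks such as JPEG compression could then be corrected on extraction.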

Journal ArticleDOI
TL;DR: The novelty of AIEBHE lies in its flexible choice of clipping limit, which automatically selects the smallest among the histogram bin, mean, and median values, thereby conserving a greater amount of information in the image.

68 citations
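The clipping idea in the TL;DR above can be sketched as clipped histogram equalization. The exact selection rule is defined in the paper; this sketch assumes one plausible reading, clipping each bin at the minimum of the histogram's mean and median bin counts, to show how a clipping limit restrains over-enhancement:

```python
import numpy as np

def clipped_equalize(img):
    """Hedged sketch of clipped histogram equalization.

    The clipping limit is assumed here to be min(mean, median) of the
    histogram bin counts; the bins are clipped before the equalizing
    CDF mapping is built, which limits over-enhancement of dominant
    intensity levels.
    """
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    clip = min(hist.mean(), np.median(hist))   # assumed clipping limit
    clipped = np.minimum(hist, clip)           # excess bin mass is discarded
    cdf = np.cumsum(clipped)
    if cdf[-1] == 0:                           # degenerate histogram: no-op
        return img.copy()
    lut = np.round(cdf / cdf[-1] * 255).astype(np.uint8)
    return lut[img]
```

Because clipping caps the contribution of heavily populated bins, the resulting mapping stays closer to the identity than plain equalization, which is what "conserving a greater amount of information" refers to.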

Proceedings ArticleDOI
01 Nov 2017
TL;DR: In this article, the authors compared two approaches to skin lesion segmentation: U-Nets with a histogram-equalization-based preprocessing step, and a C-means clustering approach that is simpler to implement and faster to execute.
Abstract: Many automatic skin lesion diagnosis systems use segmentation as a preprocessing step to diagnose skin conditions because skin lesion shape, border irregularity, and size can influence the likelihood of malignancy. This paper presents, examines and compares two different approaches to skin lesion segmentation. The first approach uses U-Nets and introduces a histogram equalization based preprocessing step. The second approach is a C-Means clustering based approach that is much simpler to implement and faster to execute. The Jaccard Index between the algorithm output and hand segmented images by dermatologists is used to evaluate the proposed algorithms. While many recently proposed deep neural networks to segment skin lesions require a significant amount of computational power for training (i.e., computer with GPUs), the main objective of this paper is to present methods that can be used with only a CPU. This severely limits, for example, the number of training instances that can be presented to the U-Net. Comparing the two proposed algorithms, U-Nets achieved a significantly higher Jaccard Index compared to the clustering approach. Moreover, using the histogram equalization for preprocessing step significantly improved the U-Net segmentation results.

68 citations
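The Jaccard index used for evaluation in the paper above is straightforward to compute on binary segmentation masks; a small sketch (function name illustrative):

```python
import numpy as np

def jaccard_index(pred, truth):
    """Jaccard index (intersection over union) of two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:          # both masks empty: define similarity as 1
        return 1.0
    return np.logical_and(pred, truth).sum() / union
```

A value of 1 means the algorithm's mask exactly matches the dermatologist's hand segmentation; values near 0 mean almost no overlap.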

Journal ArticleDOI
TL;DR: An alternative formulation of histogram equalization is investigated that involves the application of a filter to a set of image fields, and the potential for filtering these fields differently is illustrated through the restoration of a deliberately degraded image.

68 citations


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations, 87% related
Feature (computer vision): 128.2K papers, 1.7M citations, 87% related
Image segmentation: 79.6K papers, 1.8M citations, 87% related
Image processing: 229.9K papers, 3.5M citations, 86% related
Convolutional neural network: 74.7K papers, 2M citations, 84% related
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2023  115
2022  280
2021  186
2020  248
2019  267
2018  267