
Showing papers on "Median filter published in 2016"


Journal Article
TL;DR: In this paper, the first stage of many stereo algorithms, matching cost computation, is addressed by learning a similarity measure on small image patches using a convolutional neural network, and then a series of post-processing steps follow: cross-based cost aggregation, semiglobal matching, left-right consistency check, subpixel enhancement, a median filter, and a bilateral filter.
Abstract: We present a method for extracting depth information from a rectified image pair. Our approach focuses on the first stage of many stereo algorithms: the matching cost computation. We approach the problem by learning a similarity measure on small image patches using a convolutional neural network. Training is carried out in a supervised manner by constructing a binary classification data set with examples of similar and dissimilar pairs of patches. We examine two network architectures for this task: one tuned for speed, the other for accuracy. The output of the convolutional neural network is used to initialize the stereo matching cost. A series of post-processing steps follow: cross-based cost aggregation, semiglobal matching, a left-right consistency check, subpixel enhancement, a median filter, and a bilateral filter. We evaluate our method on the KITTI 2012, KITTI 2015, and Middlebury stereo data sets and show that it outperforms other approaches on all three data sets.
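The median-filter step in the post-processing pipeline above can be illustrated with a minimal 2-D median filter in pure Python (a sketch only, not the authors' implementation; the function name is illustrative):

```python
from statistics import median

def median_filter_2d(img, radius=1):
    """Replace each pixel with the median of its (2*radius+1)^2 neighbourhood,
    clipping the window at image borders."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[ii][jj]
                    for ii in range(max(0, i - radius), min(h, i + radius + 1))
                    for jj in range(max(0, j - radius), min(w, j + radius + 1))]
            out[i][j] = median(vals)
    return out
```

Applied to a disparity map, this suppresses isolated outliers while keeping depth discontinuities sharper than linear smoothing would.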

860 citations



Journal ArticleDOI
TL;DR: The class of generalized Hampel filters obtained by applying the median filter extensions listed above is defined, and an important concept introduced here is that of an implosion sequence: a signal for which generalized Hampel filter performance is independent of the threshold parameter t.
Abstract: The standard median filter based on a symmetric moving window has only one tuning parameter: the window width. Despite this limitation, this filter has proven extremely useful and has motivated a number of extensions: weighted median filters, recursive median filters, and various cascade structures. The Hampel filter is a member of the class of decision filters that replaces the central value in the data window with the median if it lies far enough from the median to be deemed an outlier. This filter depends on both the window width and an additional tuning parameter t, reducing to the median filter when t=0, so it may be regarded as another median filter extension. This paper adopts this view, defining and exploring the class of generalized Hampel filters obtained by applying the median filter extensions listed above: weighted Hampel filters, recursive Hampel filters, and their cascades. An important concept introduced here is that of an implosion sequence, a signal for which generalized Hampel filter performance is independent of the threshold parameter t. These sequences are important because the added flexibility of the generalized Hampel filters offers no practical advantage for implosion sequences. Partial characterization results are presented for these sequences, as are useful relationships between root sequences for generalized Hampel filters and their median-based counterparts. To illustrate the performance of this filter class, two examples are considered: one is simulation-based, providing a basis for quantitative evaluation of signal recovery performance as a function of t, while the other is a sequence of monthly Italian industrial production index values that exhibits glaring outliers.
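The Hampel decision rule described above can be sketched in a few lines of Python (an illustrative 1-D implementation, not the paper's code; the factor 1.4826 makes the MAD a consistent scale estimate under Gaussian noise):

```python
from statistics import median

def hampel_filter(x, half_window=3, t=3.0):
    """Replace x[k] with its window median when it deviates more than
    t * (scaled MAD) from that median; t=0 recovers the median filter."""
    n = len(x)
    y = list(x)
    for k in range(n):
        lo, hi = max(0, k - half_window), min(n, k + half_window + 1)
        window = x[lo:hi]
        m = median(window)
        s = 1.4826 * median(abs(v - m) for v in window)  # robust scale
        if abs(x[k] - m) > t * s:
            y[k] = m
    return y
```

With t=0 every sample that differs from its window median is replaced by it, so the output coincides with the standard median filter; larger t leaves more samples untouched.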

153 citations


Journal ArticleDOI
TL;DR: A novel structure-oriented median filter is proposed to attenuate blending noise along the structural direction of seismic profiles, separating simultaneous-source data into individual sources.

93 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe the realization of a sub-shot-noise wide-field microscope based on spatially multi-mode non-classical photon-number correlations in twin beams, which produces real-time images of 8000 pixels at full resolution over a (500 μm)² field of view, with noise reduced to 80% of the shot-noise level.
Abstract: In recent years several proof-of-principle experiments have demonstrated the advantages of quantum technologies with respect to classical schemes. The present challenge is to move beyond proof-of-principle demonstrations toward real applications. This Letter presents such an achievement in the field of quantum-enhanced imaging. In particular, we describe the realization of a sub-shot-noise wide-field microscope based on spatially multi-mode non-classical photon-number correlations in twin beams. The microscope produces real-time images of 8000 pixels at full resolution over a (500 μm)² field of view, with noise reduced to 80% of the shot-noise level (for each pixel), suitable for absorption imaging of complex structures. By fast post-processing, specifically applying a quantum-enhanced median filter, the noise can be further reduced (to less than 30% of the shot-noise level) by accepting a trade-off with resolution, demonstrating the best sensitivity per incident photon ever achieved in absorption microscopy.

90 citations


Journal ArticleDOI
Yan Li, Rui Zhu, Lei Mi, Cao Yihui, Di Yao
TL;DR: The results show that the performance of the proposed method is better than single-threshold approach independently performed in RGB and HSV color space and the overall single WBC segmentation accuracy reaches 97.85%, showing a good prospect in subsequent lymphoblast classification and ALL diagnosis.
Abstract: We propose a dual-threshold method based on a strategic combination of RGB and HSV color space for white blood cell (WBC) segmentation. The proposed method consists of three main parts: preprocessing, threshold segmentation, and postprocessing. In the preprocessing part, we get two images for further processing: one contrast-stretched gray image and one H component image from transformed HSV color space. In the threshold segmentation part, a dual-threshold method is proposed for improving the conventional single-threshold approaches and a golden section search method is used for determining the optimal thresholds. For the postprocessing part, mathematical morphology and median filtering are utilized to denoise and remove incomplete WBCs. The proposed method was tested in segmenting the lymphoblasts on a public Acute Lymphoblastic Leukemia (ALL) image dataset. The results show that the performance of the proposed method is better than single-threshold approach independently performed in RGB and HSV color space and the overall single WBC segmentation accuracy reaches 97.85%, showing a good prospect in subsequent lymphoblast classification and ALL diagnosis.
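The golden section search used above for threshold selection is a standard bracketing method for maximizing a unimodal function; a generic sketch (the criterion function and interval below are placeholders, not the paper's):

```python
def golden_section_max(f, a, b, tol=1e-6):
    """Locate the maximiser of a unimodal function f on [a, b],
    shrinking the bracket by a factor of ~0.618 per iteration."""
    gr = (5 ** 0.5 - 1) / 2  # inverse golden ratio, ~0.618
    c = b - gr * (b - a)
    d = a + gr * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b = d  # maximiser lies in [a, d]
        else:
            a = c  # maximiser lies in [c, b]
        c = b - gr * (b - a)
        d = a + gr * (b - a)
    return (a + b) / 2
```

In the segmentation setting, f would score a candidate threshold (e.g., by a class-separability criterion) over the grey-level range.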

85 citations


Journal ArticleDOI
TL;DR: This study combines local statistics with the NLM filter to reduce speckle in ultrasound images and demonstrates that the proposed method outperforms the original NLM, as well as many previously developed methods.

83 citations


Journal ArticleDOI
TL;DR: RGB images give better clarity and less noise than grayscale images, making them more suitable for infected-leaf detection.
Abstract: Background/Objectives: Digital image processing is used in various fields of analysis, such as the medical and biological sciences. Various image types have been used to detect plant diseases. This work analyzes and compares two image types, grayscale and RGB, and presents the comparative results. Methods/Statistical Analysis: We examined and analyzed grayscale and RGB images using techniques such as preprocessing, segmentation, and clustering for detecting leaf diseases. Results/Findings: In detecting infected leaves, color is an important feature for identifying disease intensity. We considered grayscale and RGB images, used a median filter for image enhancement, and applied segmentation to extract the diseased portion, which is used to identify the disease level. Conclusion: RGB images give better clarity and less noise, making them more suitable for infected-leaf detection than grayscale images.

74 citations


Journal ArticleDOI
TL;DR: This paper aims at reducing noise using the Kalman filter, by building an image model based on a Markov random field and introducing a multi-innovation term to improve the filtering/smoothing performance.

73 citations


Journal ArticleDOI
TL;DR: A new noise filtering method that combines several filtering strategies to increase the accuracy of the classification algorithms applied after the filtering process, and introduces a noise score to control the filtering sensitivity.

Journal ArticleDOI
TL;DR: Experiments demonstrate that the proposed probability-based non-local means filter is competitive with other state-of-the-art speckle removal techniques and able to accurately preserve edges and structural details with small computational cost.
Abstract: In this Letter, a probability-based non-local means filter is proposed for speckle reduction in optical coherence tomography (OCT). Originally developed for additive white Gaussian noise, the non-local means filter is not suitable for multiplicative speckle noise suppression. This Letter presents a two-stage non-local means algorithm using the uncorrupted probability of each pixel to effectively reduce speckle noise in OCT. Experiments on real OCT images demonstrate that the proposed filter is competitive with other state-of-the-art speckle removal techniques and able to accurately preserve edges and structural details with small computational cost.
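For context, the baseline non-local means filter that the Letter extends (originally formulated for additive Gaussian noise) can be sketched in 1-D; parameter names and values here are illustrative:

```python
import math

def nlm_1d(signal, patch=1, search=5, h=10.0):
    """Non-local means in 1-D: each sample becomes a weighted average of
    samples in a search window, weighted by patch similarity."""
    n = len(signal)
    out = []
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            dist, cnt = 0.0, 0
            for k in range(-patch, patch + 1):
                if 0 <= i + k < n and 0 <= j + k < n:
                    dist += (signal[i + k] - signal[j + k]) ** 2
                    cnt += 1
            w = math.exp(-dist / (cnt * h * h))  # similar patches weigh more
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out
```

The Letter's two-stage variant additionally weights each pixel by its estimated probability of being uncorrupted, which adapts the scheme to multiplicative speckle.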

Journal ArticleDOI
TL;DR: Two image filtering methods, playing the respective roles of denoising and preserving detail, are utilised in the new algorithm, and the parameters for balancing these two parts are computed by measuring the variance of grey-level values in each neighbourhood.
Abstract: Adding spatial penalty terms to fuzzy c-means (FCM) models is an important approach for reducing noise effects in image segmentation. Though these algorithms have improved robustness to noise to a certain extent, they still have some shortcomings. First, they are usually very sensitive to parameters that must be tuned according to noise intensities. Second, in the case of inhomogeneous noise, using a constant parameter for different image regions is obviously unreasonable and usually leads to an unsatisfactory segmentation result. To overcome these drawbacks, a noise-detecting-based adaptive FCM for image segmentation is proposed in this study. Two image filtering methods, playing the respective roles of denoising and preserving detail, are utilised in the new algorithm. The parameters for balancing these two parts are computed by measuring the variance of grey-level values in each neighbourhood. Numerical experiments on both synthetic and real-world image data show that the new algorithm is effective and efficient.

Journal ArticleDOI
TL;DR: A new enhancement scheme for X-ray images is presented, consisting of a fuzzy noise removal method and a homomorphic filtering method, whose performance surpasses that of existing methods.

Journal ArticleDOI
TL;DR: The results show that the proposed method enhances retinal fundus images prominently and, unlike some other fundus image enhancement methods, can directly enhance color images.
Abstract: The retinal fundus image plays an important role in the diagnosis of retina-related diseases. Detailed information in the retinal fundus image, such as small vessels, microaneurysms, and exudates, may be in low contrast, and retinal image enhancement usually helps in analyzing diseases related to the retinal fundus image. Current image enhancement methods may lead to artificial boundaries, abrupt changes in color levels, and the loss of image detail. In order to avoid these side effects, a new retinal fundus image enhancement method is proposed. First, the original retinal fundus image was processed by the normalized convolution algorithm with a domain transform to obtain an image with the basic information of the background. Then, this image was fused with the original retinal fundus image to obtain an enhanced fundus image. Lastly, the fused image was denoised by a two-stage denoising method comprising fourth-order PDEs and a relaxed median filter. The retinal image databases DRIVE, STARE, and DIARETDB1 were used to evaluate the enhancement effects. The results show that the method can enhance the retinal fundus image prominently and, unlike some other fundus image enhancement methods, can directly enhance color images.

Journal ArticleDOI
TL;DR: An effective single-image-based algorithm to accurately remove strip-type noise present in infrared images without causing blurring effects is introduced and is compared with the state-of-the-art 1-D and 2-D denoising algorithms using captured infrared images.
Abstract: Infrared images typically contain obvious strip noise. It is a challenging task to eliminate such noise without blurring fine image details in low-textured infrared images. In this paper, we introduce an effective single-image-based algorithm to accurately remove strip-type noise present in infrared images without causing blurring effects. First, a 1-D row guided filter is applied to perform edge-preserving image smoothing in the horizontal direction. The extracted high-frequency image part contains both strip noise and a significant amount of image details. Through a thermal calibration experiment, we discover that a local linear relationship exists between infrared data and strip noise of pixels within a column. Based on the derived strip noise behavioral model, strip noise components are accurately decomposed from the extracted high-frequency signals by applying a 1-D column guided filter. Finally, the estimated noise terms are subtracted from the raw infrared images to remove strips without blurring image details. The performance of the proposed technique is thoroughly investigated and is compared with the state-of-the-art 1-D and 2-D denoising algorithms using captured infrared images.
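A much simpler stand-in for the column-wise decomposition idea can illustrate the goal: assuming strip noise is a constant additive offset per column (the paper's guided-filter model is considerably more refined), each column's offset can be estimated and subtracted:

```python
from statistics import median

def remove_column_strips(img):
    """Subtract each column's median offset relative to the global median.
    A crude sketch assuming strip noise is a constant additive offset
    per column; real detectors need the paper's local linear model."""
    h, w = len(img), len(img[0])
    global_med = median(v for row in img for v in row)
    out = [row[:] for row in img]
    for j in range(w):
        col_med = median(img[i][j] for i in range(h))
        offset = col_med - global_med
        for i in range(h):
            out[i][j] -= offset
    return out
```

Using medians rather than means keeps the offset estimate robust to the scene detail that shares the column with the strip.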

Proceedings ArticleDOI
01 Nov 2016
TL;DR: A modified Canny algorithm where Gaussian smoothing is replaced by modified median filter that successfully removes speckle noise with little degradation of edges followed by weak weighted smoothing filter that in a controlled way removes other noise, again with insignificant damage to the edges is proposed.
Abstract: Ultrasound medical images are a very important component of the diagnostic process. They are widely used since ultrasound is a non-invasive and non-ionizing diagnostic method. As part of image analysis, edge detection is often used for further segmentation or more precise measurement of elements in the picture. Edges represent high-frequency components of an image. Unfortunately, ultrasound images are subject to degradations, especially speckle noise, which is also a high-frequency component. That poses a problem for edge detection algorithms, since filters for noise removal also degrade edges. The Canny operator is widely used as an excellent edge detector; however, it includes a Gaussian smoothing element that may significantly soften edges. In this paper we propose a modified Canny algorithm in which Gaussian smoothing is replaced by a modified median filter that successfully removes speckle noise with little degradation of edges, followed by a weak weighted smoothing filter that removes other noise in a controlled way, again with insignificant damage to the edges. The proposed algorithm was tested on a standard benchmark image and compared to other approaches from the literature, where it proved successful in precisely determining the edges of internal organs.
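The paper does not spell out its modified median filter here; one common speckle-friendly variant is the center-weighted median, which biases the output toward the original pixel and therefore blurs edges less than the plain median (a hedged sketch, not the authors' exact filter):

```python
from statistics import median

def center_weighted_median(window, center_index, weight=3):
    """Median of the window with the center sample counted `weight` times,
    biasing the filter toward keeping the original value."""
    vals = list(window) + [window[center_index]] * (weight - 1)
    return median(vals)
```

With weight=1 this is the ordinary median; increasing the weight trades noise suppression for detail preservation.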

Proceedings ArticleDOI
01 Feb 2016
TL;DR: K-means segmentation is used with image preprocessing comprising median-filter de-noising and skull masking, and an SVM (Support Vector Machine) trained on texture and color features makes the system an adaptive brain tumor detector.
Abstract: In this paper we propose an adaptive brain tumor detection method. Image processing is used in medical tools for tumor detection, since MRI images alone cannot identify the tumorous region. We use K-means segmentation with image preprocessing, which comprises de-noising by a median filter and skull masking. We also use object labeling to obtain more detailed information about the tumor region. To make the system adaptive we use an SVM (Support Vector Machine); the SVM is used in an unsupervised manner to create and maintain patterns for future use. To train the SVM we extract texture and color features. It is expected that the experimental results of the proposed system will compare favorably with other existing systems.

Journal ArticleDOI
TL;DR: This paper proposes a unified framework of content-adaptive estimation and reduction of compression noise via low-rank decomposition of similar image patches, which not only improves the quality of compressed images for post-processing but is also helpful for computer vision tasks as a pre-processing method.
Abstract: Images coded at low bit rates in real-world applications usually suffer from significant compression noise, which significantly degrades the visual quality. Traditional denoising methods are not suitable for content-dependent compression noise, as they usually assume that noise is independent and identically distributed. In this paper, we propose a unified framework of content-adaptive estimation and reduction of compression noise via low-rank decomposition of similar image patches. We first formulate the framework of compression noise reduction based upon low-rank decomposition. Compression noise is removed by soft thresholding the singular values in the singular value decomposition of every group of similar image patches. For each group of similar patches, the thresholds are adaptively determined according to compression noise levels and singular values. We analyze the relationship of image statistical characteristics in the spatial and transform domains, and estimate the compression noise level for every group of similar patches from statistics in both domains jointly with quantization steps. Finally, a quantization constraint is applied to the estimated images to avoid over-smoothing. Extensive experimental results show that the proposed method not only obviously improves the quality of compressed images for post-processing, but is also helpful for computer vision tasks as a pre-processing method.
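The core shrinkage rule above, soft thresholding applied to each singular value of a patch-group matrix, is simple to state (a sketch of the rule only; threshold selection in the paper is adaptive per group):

```python
def soft_threshold(x, tau):
    """Shrink x toward zero by tau, zeroing values with |x| <= tau."""
    if x > tau:
        return x - tau
    if x < -tau:
        return x + tau
    return 0.0

def shrink_singular_values(svals, tau):
    # Singular values are non-negative; the small ones, which mostly
    # carry compression noise, are zeroed outright.
    return [soft_threshold(s, tau) for s in svals]
```

The denoised patch group is then rebuilt from the shrunken singular values and the original singular vectors.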

Journal ArticleDOI
TL;DR: The bitonic filter, which has better edge and detail preserving properties than a median, noise reduction capability similar to a Gaussian, and is applicable to many signal and noise types, gives good visual results in all circumstances.
Abstract: A new filter is presented which has better edge and detail preserving properties than a median, noise reduction capability similar to a Gaussian, and is applicable to many signal and noise types. It is built on a definition of signal as bitonic, i.e., containing only one local maximum or minimum within the filter range. This definition is based on data ranking rather than value; hence, the bitonic filter comprises a combination of non-linear morphological and linear operators. It has no data-level-sensitive parameters and can locally adapt to the signal and noise levels in an image, precisely preserving both smooth and discontinuous signals of any level when there is no noise, but also reducing noise in other areas without creating additional artifactual noise. Both the basis and the performance of the filter are examined in detail, and it is shown to be a significant improvement on the Gaussian and median. It is also compared over various noisy images to the image-guided filter, anisotropic diffusion, non-local means, the grain filter, and self-dual forms of leveling and rank filters. In terms of signal-to-noise ratio, the bitonic filter outperforms all of these except non-local means, and sometimes anisotropic diffusion. However, it gives good visual results in all circumstances, with characteristics which make it appropriate particularly for signals or images with varying noise, or features at varying levels. The bitonic filter has very few parameters, does not require optimization nor prior knowledge of noise levels, does not have any problems with stability, and is reasonably fast to implement. Despite its non-linearity, it hence represents a very practical operation with general applicability.

Journal ArticleDOI
TL;DR: Experimental results show that the proposed method can efficiently remove salt-and-pepper noise from a corrupted image across different noise densities (from 10% to 90%); meanwhile, the denoised image is free of blurring effects.

Journal ArticleDOI
TL;DR: An adaptive de-noising method by using the multilayered Pulse Coupled Neural Network (PCNN) and an improved median filtering method which only uses uncontaminated pixels to determine the median is presented.

Journal ArticleDOI
TL;DR: Experimental results on test images demonstrate that the proposed approach can provide better imperceptibility and robustness against various attacks, such as additive white Gaussian noise, salt & pepper noise, median filtering, JPEG compression, rotation, and scaling, in comparison with the recently proposed techniques.

Proceedings ArticleDOI
26 Jun 2016
TL;DR: This work presents a test rig which repetitively records printed test patterns, along with a method for averaging over repeated recordings to estimate the likelihood of an event being signal or noise, and shows how the choice of best filter and parameters varies as a function of the stimulus.
Abstract: Bio-inspired Address Event Representation change detection image sensors, also known as silicon retinae, have matured to the point where they can be purchased commercially, and are easily operated by laymen. Noise is present in the output of these sensors, and improved noise filtering will enhance performance in many applications. A novel approach is proposed for quantifying the quality of data received from a silicon retina, and quantifying the performance of different noise filtering algorithms. We present a test rig which repetitively records printed test patterns, along with a method for averaging over repeated recordings to estimate the likelihood of an event being signal or noise. The calculated signal and noise probabilities are used to quantitatively compare the performance of 8 different filtering algorithms while varying each filter's parameters. We show how the choice of best filter and parameters varies as a function of the stimulus, particularly the temporal rate of change of intensity for a pixel, especially when the assumption of sharp temporal edges is not valid.

Proceedings ArticleDOI
01 Aug 2016
TL;DR: A novel idea is proposed for successful identification of the brain tumor using normalized histogram and segmentation using K-means clustering algorithm and Naïve Bayes Classifier and Support Vector Machine so as to provide accurate prediction and classification.
Abstract: Magnetic resonance imaging (MRI) is a technique which is used for the evaluation of the brain tumor in medical science. In this paper, a methodology to study and classify the image de-noising filters such as Median filter, Adaptive filter, Averaging filter, Un-sharp masking filter and Gaussian filter is used to remove the additive noises present in the MRI images i.e. Gaussian, Salt & pepper noise and speckle noise. The de-noising performance of all the considered strategies is compared using PSNR and MSE. A novel idea is proposed for successful identification of the brain tumor using normalized histogram and segmentation using K-means clustering algorithm. Efficient classification of the MRIs is done using Naive Bayes Classifier and Support Vector Machine (SVM) so as to provide accurate prediction and classification.
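The PSNR and MSE figures of merit used above for comparing the de-noising filters are defined as follows (a peak of 255 assumes 8-bit images):

```python
import math

def mse(a, b):
    """Mean squared error between two equally sized 2-D images."""
    flat_a = [v for row in a for v in row]
    flat_b = [v for row in b for v in row]
    return sum((x - y) ** 2 for x, y in zip(flat_a, flat_b)) / len(flat_a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(a, b)
    return float('inf') if m == 0 else 10.0 * math.log10(peak * peak / m)
```

Higher PSNR (lower MSE) against the clean reference indicates better de-noising; note PSNR is undefined (reported as infinity here) when the images match exactly.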

Journal ArticleDOI
TL;DR: The problem of noise reduction is addressed as a linear filtering problem in a novel way using concepts from subspace-based enhancement methods, resulting in variable span linear filters; classical designs such as minimum distortion, Wiener, maximum SNR, and tradeoff filters can all be expressed as special cases of variable span filters.
Abstract: In this paper, the problem of noise reduction is addressed as a linear filtering problem in a novel way by using concepts from subspace-based enhancement methods, resulting in variable span linear filters. This is done by forming the filter coefficients as linear combinations of a number of eigenvectors stemming from a joint diagonalization of the covariance matrices of the signal of interest and the noise. The resulting filters are flexible in that it is possible to trade off distortion of the desired signal for improved noise reduction. This tradeoff is controlled by the number of eigenvectors included in forming the filter. Using these concepts, a number of different filter designs are considered, like minimum distortion, Wiener, maximum SNR, and tradeoff filters. Interestingly, all these can be expressed as special cases of variable span filters. We also derive expressions for the speech distortion and noise reduction of the various filter designs. Moreover, we consider an alternative approach, wherein the filter is designed for extracting an estimate of the noise signal, which can then be extracted from the observed signals, which is referred to as the indirect approach. Simulations demonstrate the advantages and properties of the variable span filter designs, and their potential performance gain compared to widely used speech enhancement methods.

Journal ArticleDOI
Xiaotian Wang, Shen Shanshan, Guangming Shi, Yuannan Xu, Zhang Peiyu
TL;DR: In this paper, an iterative non-local means filter (INLM) is proposed to exploit the image non-local similarity in the S&P noise removal procedure; the iterative framework updates the similarity weights and the estimated values for higher accuracy.

Journal ArticleDOI
TL;DR: This technique aids in detecting edges robustly from depth images and contributes to promoting depth-image applications such as object detection and object segmentation.

Journal ArticleDOI
TL;DR: A new filter was created by improving the standard Kuwahara filter; it allows more efficient noise reduction without blurring the edges and prepares images for segmentation and further analysis.
Abstract: A new filter was created by improving the standard Kuwahara filter. It allows more efficient noise reduction without blurring the edges and prepares images for segmentation and further analysis operations. One of the biggest and most common restrictions encountered in filter algorithms is the need for a declarative definition of the filter window size or the number of iterations an operation should be repeated. In the proposed solution, the algorithm adapts automatically to the local environment of each pixel in the processed image.
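For reference, the standard Kuwahara filter that the authors improve computes, for each pixel, the mean of whichever of four overlapping quadrant windows has the lowest variance; a single-pixel sketch (window geometry per the classic formulation, not the adaptive variant):

```python
from statistics import mean, pvariance

def kuwahara_pixel(img, i, j, r=2):
    """Standard Kuwahara output for pixel (i, j): the mean of the
    least-variance quadrant among four overlapping (r+1)x(r+1) windows."""
    quadrants = [
        [(i + di, j + dj) for di in range(-r, 1) for dj in range(-r, 1)],
        [(i + di, j + dj) for di in range(-r, 1) for dj in range(0, r + 1)],
        [(i + di, j + dj) for di in range(0, r + 1) for dj in range(-r, 1)],
        [(i + di, j + dj) for di in range(0, r + 1) for dj in range(0, r + 1)],
    ]
    best = None
    for quad in quadrants:
        vals = [img[a][b] for a, b in quad
                if 0 <= a < len(img) and 0 <= b < len(img[0])]
        v = pvariance(vals)
        if best is None or v < best[0]:
            best = (v, mean(vals))
    return best[1]
```

On a step edge the low-variance quadrant lies entirely on one side of the edge, which is why the filter smooths without blurring edges.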

Journal ArticleDOI
TL;DR: In this paper, an adaptive switching weighted median filter (ASWMF) framework was proposed to suppress salt-and-pepper (S & P) noise under a window size of 3 × 3 or 5 × 5.
Abstract: The paper presents an efficient approach for suppressing salt-and-pepper (S & P) noise under an adaptive switching weighted median filter (ASWMF) framework. The ASWMF includes noise detection and noise removal stages. The proposed method first classifies a pixel as either "noise-free" or "noisy" by checking the noise candidate against the local mean value in the noise detection stage. Then, the detected noisy pixels are replaced by their weighted median values using an adaptive weighted median filter within a window of size 3 × 3 or 5 × 5. The proposed method is compared to several denoising schemes in terms of key performance indicators. Test results demonstrated superiority and efficiency over other methods in removing S & P noise at densities of up to 90%.
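The detect-then-replace structure of the ASWMF can be sketched with a simplified switching median (noise candidates taken here as the extreme values 0/255; the paper's detector additionally checks the local mean, and its replacement stage uses adaptive weights):

```python
from statistics import median

def adaptive_switching_median(img, low=0, high=255):
    """Simplified switching median: flag extreme-valued pixels as
    salt-and-pepper candidates, then replace each with the median of the
    noise-free neighbours in its 3x3 window (full window as fallback)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            if img[i][j] not in (low, high):
                continue  # treated as noise-free, left untouched
            neigh = [img[a][b]
                     for a in range(max(0, i - 1), min(h, i + 2))
                     for b in range(max(0, j - 1), min(w, j + 2))
                     if (a, b) != (i, j)]
            clean = [v for v in neigh if v not in (low, high)]
            out[i][j] = median(clean) if clean else median(neigh)
    return out
```

Because only flagged pixels are modified, uncorrupted detail passes through unchanged, which is what lets switching medians survive very high noise densities.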