
Showing papers on "Median filter published in 2009"


Journal ArticleDOI
TL;DR: Results on real images demonstrate that the proposed adaptation of the nonlocal (NL)-means filter for speckle reduction in ultrasound (US) images is able to accurately preserve edges and structural details of the image.
Abstract: In image processing, restoration is expected to improve both the qualitative inspection of the image and the performance of quantitative image analysis techniques. In this paper, an adaptation of the nonlocal (NL)-means filter is proposed for speckle reduction in ultrasound (US) images. Although the NL-means filter was originally developed for additive white Gaussian noise, we propose a Bayesian framework to derive a NL-means filter adapted to a relevant ultrasound noise model. Quantitative results on synthetic data show the performance of the proposed method compared to well-established and state-of-the-art methods. Results on real images demonstrate that the proposed method is able to accurately preserve edges and structural details of the image.

547 citations
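
As a rough sketch of the patch-based averaging underlying NL-means (the paper's contribution is a Bayesian adaptation of the weights to an ultrasound speckle model; the plain Gaussian-weight version below, and its parameter values, are illustrative only):

```python
import numpy as np

def nl_means_pixel(img, i, j, patch=1, search=5, h=0.1):
    """Denoise pixel (i, j) as a weighted average of pixels in a search
    window, weighting each candidate by the similarity of the patch
    around it to the patch around (i, j). This is plain Gaussian-noise
    NL-means; the paper instead derives the weights from an ultrasound
    speckle model inside a Bayesian framework."""
    H, W = img.shape
    p0 = img[i - patch:i + patch + 1, j - patch:j + patch + 1]  # reference patch
    num = den = 0.0
    for y in range(max(patch, i - search), min(H - patch, i + search + 1)):
        for x in range(max(patch, j - search), min(W - patch, j + search + 1)):
            p = img[y - patch:y + patch + 1, x - patch:x + patch + 1]
            d2 = np.mean((p - p0) ** 2)   # patch distance
            w = np.exp(-d2 / h ** 2)      # Gaussian similarity weight
            num += w * img[y, x]
            den += w
    return num / den
```

On a constant image every patch is identical, so every weight is 1 and the estimate equals the constant, which makes the normalization easy to sanity-check.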


Book ChapterDOI
24 Jul 2009
TL;DR: This work proposes an improved variant of the original duality-based TV-L1 optical flow algorithm that can preserve discontinuities in the flow field by employing total variation (TV) regularization, and integrates a median filter into the numerical scheme to further increase the robustness to sampling artefacts in the image data.
Abstract: A look at the Middlebury optical flow benchmark [5] reveals that nowadays variational methods yield the most accurate optical flow fields between two image frames. In this work we propose an improved variant of the original duality-based TV-L1 optical flow algorithm in [31] and provide implementation details. This formulation can preserve discontinuities in the flow field by employing total variation (TV) regularization. Furthermore, it offers robustness against outliers by applying the robust L1 norm in the data fidelity term. Our contributions are as follows. First, we propose to perform a structure-texture decomposition of the input images to get rid of violations of the optical flow constraint due to illumination changes. Second, we propose to integrate a median filter into the numerical scheme to further increase the robustness to sampling artefacts in the image data. We experimentally show that very precise and robust estimation of optical flow can be achieved with a variational approach in real time. The numerical scheme and the implementation are described in detail, which enables reimplementation of this high-end method.

455 citations
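
The median-filtering step that the paper folds into its numerical scheme amounts to componentwise median smoothing of the intermediate flow estimates between warping iterations; a minimal sketch (the 3x3 window is an assumption):

```python
import numpy as np

def median2d(a, k=3):
    """k x k median filter of a 2-D array; near the borders the
    window simply shrinks to the valid region."""
    r = k // 2
    H, W = a.shape
    out = np.empty((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.median(a[max(0, i - r):i + r + 1,
                                    max(0, j - r):j + r + 1])
    return out

def median_smooth_flow(u, v, k=3):
    """Median-filter each component of the flow field (u, v), as the
    paper's scheme does between iterations, to reject outliers caused
    by sampling artefacts."""
    return median2d(u, k), median2d(v, k)
```

A single spurious flow vector surrounded by consistent neighbours is removed outright, which is exactly the robustness the paper is after.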


Proceedings ArticleDOI
20 Jun 2009
TL;DR: A new bilateral filtering algorithm with computational complexity invariant to filter kernel size, so-called O(1) or constant time in the literature, that yields a new class of constant time bilateral filters that can have arbitrary spatial and arbitrary range kernels.
Abstract: We propose a new bilateral filtering algorithm with computational complexity invariant to filter kernel size, so-called O(1) or constant time in the literature. By showing that a bilateral filter can be decomposed into a number of constant time spatial filters, our method yields a new class of constant time bilateral filters that can have arbitrary spatial and arbitrary range kernels. In contrast, the currently available constant time algorithm requires the use of specific spatial or specific range kernels. Also, our algorithm lends itself to a parallel implementation, leading to the first real-time O(1) algorithm that we know of. Meanwhile, our algorithm yields higher quality results, since we effectively quantize the range function instead of quantizing both the range function and the input image. Empirical experiments show that our algorithm not only gives higher PSNR, but is about 10× faster than the state-of-the-art. It also has a small memory footprint, needing only 2% of the memory required by the state-of-the-art to obtain the same quality as the exact filter on 8-bit images. We also show that our algorithm can be easily extended for O(1) median filtering. Our bilateral filtering algorithm was tested in a number of applications, including HD video conferencing, video abstraction, highlight removal, and multi-focus imaging.

325 citations
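
The decomposition into constant-time spatial filters can be sketched as follows: the range kernel is evaluated only at a few quantized intensity levels, each level is smoothed with an O(1)-per-pixel box filter (via an integral image), and the result is interpolated at each pixel's own intensity. The box spatial kernel, Gaussian range kernel, and parameter values are illustrative simplifications, not the paper's exact construction:

```python
import numpy as np

def box_filter(a, r):
    """O(1)-per-pixel box average of radius r via an integral image."""
    H, W = a.shape
    ii = np.pad(a, ((1, 0), (1, 0))).cumsum(0).cumsum(1)
    y0 = np.clip(np.arange(H) - r, 0, H)
    y1 = np.clip(np.arange(H) + r + 1, 0, H)
    x0 = np.clip(np.arange(W) - r, 0, W)
    x1 = np.clip(np.arange(W) + r + 1, 0, W)
    S = ii[y1][:, x1] - ii[y0][:, x1] - ii[y1][:, x0] + ii[y0][:, x0]
    area = (y1 - y0)[:, None] * (x1 - x0)[None, :]
    return S / area

def fast_bilateral(img, n_levels=8, r=2, sigma_r=0.1):
    """Approximate bilateral filter in O(n_levels) box filterings:
    quantize the range function, filter each level plane in constant
    time, then interpolate between the two bracketing levels."""
    levels = np.linspace(img.min(), img.max(), n_levels)
    planes = []
    for l in levels:
        w = np.exp(-((l - img) ** 2) / (2 * sigma_r ** 2))   # range weights at level l
        planes.append(box_filter(w * img, r) / np.maximum(box_filter(w, r), 1e-12))
    planes = np.stack(planes)
    idx = np.clip(np.searchsorted(levels, img) - 1, 0, n_levels - 2)
    l0, l1 = levels[idx], levels[idx + 1]
    t = (img - l0) / np.maximum(l1 - l0, 1e-12)              # interpolation weight
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (1 - t) * planes[idx, yy, xx] + t * planes[idx + 1, yy, xx]
```

The total cost is n_levels box filterings regardless of the spatial radius r, which is the essence of the O(1) claim.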


Journal ArticleDOI
TL;DR: It is shown that median filtering and linear filtering have similar asymptotic worst-case mean-squared error when the signal-to-noise ratio (SNR) is of order 1, which corresponds to the case of constant per-pixel noise level in a digital signal.
Abstract: Image processing researchers commonly assert that "median filtering is better than linear filtering for removing noise in the presence of edges." Using a straightforward large-n decision-theory framework, this folk-theorem is seen to be false in general. We show that median filtering and linear filtering have similar asymptotic worst-case mean-squared error (MSE) when the signal-to-noise ratio (SNR) is of order 1, which corresponds to the case of constant per-pixel noise level in a digital signal. To see dramatic benefits of median smoothing in an asymptotic setting, the per-pixel noise level should tend to zero (i.e., SNR should grow very large). We show that a two-stage median filtering using two very different window widths can dramatically outperform traditional linear and median filtering in settings where the underlying object has edges. In this two-stage procedure, the first pass, at a fine scale, aims at increasing the SNR. The second pass, at a coarser scale, correctly exploits the nonlinearity of the median. Image processing methods based on nonlinear partial differential equations (PDEs) are often said to improve on linear filtering in the presence of edges. Such methods seem difficult to analyze rigorously in a decision-theoretic framework. A popular example is mean curvature motion (MCM), which is formally a kind of iterated median filtering. Our results on iterated median filtering suggest that some PDE-based methods are candidates to rigorously outperform linear filtering in an asymptotic framework.

232 citations
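
The two-stage procedure above can be sketched as two successive median passes with very different window widths (the sizes used here are illustrative):

```python
import numpy as np

def median2d(a, k):
    """k x k median filter; the window shrinks near the borders."""
    r = k // 2
    H, W = a.shape
    out = np.empty((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.median(a[max(0, i - r):i + r + 1,
                                    max(0, j - r):j + r + 1])
    return out

def two_stage_median(img, fine=3, coarse=7):
    """Two-pass median smoothing as analyzed in the paper: the first
    pass, at a fine scale, raises the per-pixel SNR; the second pass,
    at a coarser scale, exploits the nonlinearity of the median near
    edges."""
    return median2d(median2d(img, fine), coarse)
```

On a clean step edge both passes leave the edge location intact, while an isolated noise spike is removed in the first pass.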


Journal ArticleDOI
TL;DR: A new filtering method to remove Rician noise from magnetic resonance images is presented that relies on a robust estimation of the standard deviation of the noise and combines local linear minimum mean square error filters and partial differential equations for MRI, as the speckle reducing anisotropic diffusion did for ultrasound images.
Abstract: A new filtering method to remove Rician noise from magnetic resonance images is presented. This filter relies on a robust estimation of the standard deviation of the noise and combines local linear minimum mean square error filters and partial differential equations for MRI, as the speckle reducing anisotropic diffusion did for ultrasound images. The parameters of the filter are automatically chosen from the estimated noise. This property improves the convergence rate of the diffusion while preserving contours, leading to more robust and intuitive filtering. The partial differential equation of the filter is extended to a new matrix diffusion filter which allows a coherent diffusion based on the local structure of the image and on the corresponding oriented local standard deviations. This new filter combines volumetric, planar, and linear components of the local image structure. The numerical scheme is explained and visual and quantitative results on simulated and real data sets are presented. In the experiments, the new filter leads to the best results.

210 citations


Journal ArticleDOI
TL;DR: A two-stage algorithm, called switching-based adaptive weighted mean filter, is proposed to remove salt-and-pepper noise from the corrupted images by replacing each noisy pixel with the weighted mean of its noise-free neighbors in the filtering window.
Abstract: A two-stage algorithm, called the switching-based adaptive weighted mean filter, is proposed to remove salt-and-pepper noise from corrupted images. First, the directional-difference-based noise detector is used to identify the noisy pixels by comparing the minimum absolute value of four mean differences between the current pixel and its neighbors in four directional windows with a predefined threshold. Then, the adaptive weighted mean filter is adopted to remove the detected impulses by replacing each noisy pixel with the weighted mean of its noise-free neighbors in the filtering window. Numerous simulations demonstrate that the proposed filter outperforms many other existing algorithms in terms of effectiveness in noise detection, image restoration and computational efficiency.

183 citations
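
A sketch of the two stages, under simplifying assumptions (the paper uses four directional windows and distance-based weights; here the directional means use only the two immediate neighbours, and the restoration uses an unweighted mean of the clean 3x3 neighbours):

```python
import numpy as np

def detect_impulses(img, T=0.3):
    """Directional-difference impulse detector (sketch of the first
    stage). Each interior pixel is compared with the mean of its two
    neighbours along four directions; if even the smallest absolute
    difference exceeds T, the pixel is flagged as noise. T = 0.3 is an
    illustrative threshold for intensities in [0, 1]."""
    H, W = img.shape
    noisy = np.zeros((H, W), dtype=bool)
    dirs = [((0, -1), (0, 1)),     # horizontal
            ((-1, 0), (1, 0)),     # vertical
            ((-1, -1), (1, 1)),    # diagonal
            ((-1, 1), (1, -1))]    # anti-diagonal
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            diffs = [abs(img[i, j] - 0.5 * (img[i + a[0], j + a[1]]
                                            + img[i + b[0], j + b[1]]))
                     for a, b in dirs]
            noisy[i, j] = min(diffs) > T
    return noisy

def weighted_mean_restore(img, noisy):
    """Second stage (simplified): replace each flagged pixel with the
    mean of the noise-free pixels in its 3x3 window. The paper uses
    distance-based weights and adaptively enlarges the window when no
    clean neighbour is available."""
    out = img.astype(float).copy()
    H, W = img.shape
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            if noisy[i, j]:
                win = img[i - 1:i + 2, j - 1:j + 2]
                mask = ~noisy[i - 1:i + 2, j - 1:j + 2]
                mask[1, 1] = False            # exclude the centre itself
                if mask.any():
                    out[i, j] = win[mask].mean()
    return out
```

Because an edge pixel always has at least one direction along which it agrees with its neighbours, the minimum over the four directional differences keeps edges from being misclassified as impulses.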


Journal ArticleDOI
TL;DR: A new method for reducing random, spike-like noise in seismic data based on a 1D stationary median filter — the 1D time-varying median filter (TVMF), which strikes a balance between eliminating random noise and protecting useful information.
Abstract: Random noise in seismic data affects the signal-to-noise ratio, obscures details, and complicates identification of useful information. We have developed a new method for reducing random, spike-like noise in seismic data. The method is based on a 1D stationary median filter (MF) — the 1D time-varying median filter (TVMF). We design a threshold value that controls the filter window according to characteristics of the signal and of the random, spike-like noise. In view of the relationship between the seismic data and the threshold value, we chose median filters with different time-varying filter windows to eliminate random, spike-like noise. When comparing our method with other common methods, e.g., the band-pass filter and the stationary MF, we found that the TVMF strikes a balance between eliminating random noise and protecting useful information. We tested the feasibility of our method in reducing seismic random, spike-like noise on a synthetic dataset. Results of applying the method to seismic land data from Texas demonstrate …

96 citations
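
A minimal 1-D sketch of the time-varying idea (the amplitude-threshold rule and the window lengths are illustrative simplifications of the paper's design):

```python
import numpy as np

def tvmf(trace, base=3, wide=7, T=2.0):
    """1-D time-varying median filter (sketch). Samples whose amplitude
    stays below a threshold T are treated as signal and filtered with a
    short window; samples exceeding T are treated as spike-like noise
    and filtered with a wider window, so the spike cannot dominate the
    median. Near the ends of the trace the window shrinks."""
    out = np.empty(len(trace), dtype=float)
    n = len(trace)
    for i in range(n):
        half = (wide if abs(trace[i]) > T else base) // 2
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = np.median(trace[lo:hi])
    return out
```

The short window protects genuine signal from over-smoothing, while the wide window guarantees that an isolated spike is outvoted by its neighbours.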


Proceedings ArticleDOI
24 Sep 2009
TL;DR: This work introduces the basic principle of ICA and investigates the capabilities of sparse code shrinkage (SCS) in the field of image denoising, showing that SCS outperforms basic denoising methods such as Wiener filtering, median filtering and independent component analysis (ICA) applied to image denoising.
Abstract: Sparse coding is a method for finding a neural network representation of multidimensional data in which each component of the representation is only rarely significantly active at the same time. The representation is closely related to independent component analysis (ICA). In this paper, we introduce the basic principle of ICA and investigate the capabilities of sparse coding shrinkage in the field of image denoising. We have also produced a practical implementation of sparse code shrinkage (SCS) and applied it to image denoising. We have seen that SCS outperforms basic denoising methods such as Wiener filtering, median filtering and independent component analysis (ICA) applied to image denoising.

89 citations


Journal ArticleDOI
TL;DR: This study shows on imaging mass spectrometry (IMS) data that the Random Forest classifier can be used for automated tissue classification and that it results in predictions with high sensitivities and positive predictive values, even when intersample variability is present in the data.
Abstract: We show on imaging mass spectrometry (IMS) data that the Random Forest classifier can be used for automated tissue classification and that it results in predictions with high sensitivities and positive predictive values, even when intersample variability is present in the data. We further demonstrate how Markov Random Fields and vector-valued median filtering can be applied to reduce noise effects to further improve the classification results in a post hoc smoothing step. Our study gives clear evidence that digital staining by means of IMS constitutes a promising complement to chemical staining techniques.

87 citations


Journal ArticleDOI
TL;DR: This paper proposes a number of new methods based on the application of Taylor series and cubic spline interpolation for color filter array demosaicking, with which the original color can be faithfully reproduced with a minimal amount of color artifacts even at edges.
Abstract: Demosaicking is an estimation process to determine missing color values when a single-sensor digital camera is used for color image capture. In this paper, we propose a number of new methods based on the application of Taylor series and cubic spline interpolation for color filter array demosaicking. To avoid the blurring of an edge, interpolants are first estimated in four opposite directions so that no interpolation is carried out across an edge. A weighted median filter, whose filter coefficients are determined by a classifier based on an edge orientation map, is then used to produce an output from the four interpolants to preserve edges. Using the proposed methods, the original color can be faithfully reproduced with minimal amount of color artifacts even at edges.

76 citations


Journal ArticleDOI
TL;DR: A new impulse detection and filtering algorithm, based on the minimum absolute value of four convolutions obtained with one-dimensional Laplacian operators, is proposed for the restoration of images highly corrupted by impulse noise.

Patent
18 Aug 2009
TL;DR: In this paper, a method for correcting an image from defects and filtering from Gaussian noise was proposed, which corrected each pixel of the image when it was considered defective and filtered it from Gaussian noise in one pass.
Abstract: A method for correcting an image from defects and filtering from Gaussian noise corrects each pixel of the image when it is considered defective and filters it from Gaussian noise in one-pass. The one-pass improves the speed for performing the correcting and filtering. The drawbacks associated with choosing incompatible defect correction and filtering operations are overcome.

Journal ArticleDOI
TL;DR: This post-processing method reduced ring artifacts in the reconstructed images and improved image quality for phantom and in vivo scans.
Abstract: In high-resolution micro CT using flat detectors (FD), imperfect or defect detector elements may cause concentric-ring artifacts due to their continuous over- or underestimation of attenuation values, which often disturb image quality. We here present a dedicated image-based ring artifact correction method for high-resolution micro CT, based on median filtering of the reconstructed image and working on a transformed version of the reconstructed images in polar coordinates. This post-processing method reduced ring artifacts in the reconstructed images and improved image quality for phantom and in vivo scans. Noise and artifacts were reduced both in transversal and in multi-planar reformations along the longitudinal axis.
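
The polar-coordinate idea can be sketched as follows (nearest-neighbour resampling, a per-radius median profile, and the smoothing window size are all illustrative simplifications of the paper's method):

```python
import numpy as np

def correct_rings(img, size=9):
    """Image-based ring artifact reduction (sketch). The reconstructed
    slice is resampled into polar coordinates around its centre; a ring
    then becomes a line of constant radius. The per-radius median over
    all angles gives a radial profile; subtracting a median-smoothed
    version of that profile isolates the ring component, which is then
    removed from the affected pixels."""
    H, W = img.shape
    cy, cx = (H - 1) / 2.0, (W - 1) / 2.0
    n_r, n_t = min(H, W) // 2, 360
    r = np.arange(n_r)
    t = np.linspace(0, 2 * np.pi, n_t, endpoint=False)
    yy = np.clip(np.round(cy + r[:, None] * np.sin(t)).astype(int), 0, H - 1)
    xx = np.clip(np.round(cx + r[:, None] * np.cos(t)).astype(int), 0, W - 1)
    polar = img[yy, xx]                         # (n_r, n_t) polar resampling
    profile = np.median(polar, axis=1)          # value at each radius
    k = size // 2
    smooth = np.array([np.median(profile[max(0, i - k):i + k + 1])
                       for i in range(n_r)])    # ring-free radial trend
    artifact = profile - smooth                 # per-radius ring strength
    out = img.astype(float).copy()
    for i in range(n_r):
        out[yy[i], xx[i]] -= artifact[i]        # subtract ring at radius i
    return out
```

An artifact-free image yields a zero ring estimate and passes through unchanged, which is the minimal correctness property of such a correction.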

Journal ArticleDOI
TL;DR: A novel fuzzy reasoning-based directional median filter is proposed to remove the random-value impulse noise efficiently and outperforms several existing filter schemes for impulse noise removal in an image.

Journal ArticleDOI
TL;DR: A novel nonlinear adaptive spatial filter (median-modified Wiener filter, MMWF) is compared with five well-established denoising techniques to suggest, by means of fuzzy sets evaluation, the best denoising approach to use in practice.
Abstract: Denoising is a fundamental early stage in 2-DE image analysis that strongly influences spot detection and pixel-based methods. A novel nonlinear adaptive spatial filter (median-modified Wiener filter, MMWF) is here compared with five well-established denoising techniques (median, Wiener, Gaussian, and polynomial Savitzky-Golay filters; wavelet denoising) to suggest, by means of fuzzy sets evaluation, the best denoising approach to use in practice. Although the median filter and wavelet denoising achieved the best performance in spike and Gaussian denoising, respectively, they are unsuitable for simultaneous removal of different types of noise, because their best setting is noise-dependent. Conversely, MMWF, which ranked second in each single denoising category, was evaluated as the best filter for global denoising, its best setting being invariant to the type of noise. In addition, the median filter eroded the edges of isolated spots and filled the space between close-set spots, whereas MMWF, thanks to a novel filter effect (the drop-off effect), does not suffer from this erosion problem, preserves the morphology of close-set spots, and avoids spot and spike fuzzification, an aberration encountered with the Wiener filter. In our tests, MMWF was assessed as the best choice when the goal is to minimize spot edge aberrations while removing spike and Gaussian noise.


Journal ArticleDOI
TL;DR: In this paper, an adaptive filtering approach is proposed to restore images corrupted by salt-and-pepper noise, where the noise is attenuated by estimating the values of the noisy pixels with a switching-based median filter applied exclusively to those neighborhood pixels not labeled as noisy.
Abstract: An impulse noise detection and removal approach with adaptive filtering is proposed to restore images corrupted by salt-and-pepper noise. The proposed algorithm works well for suppressing impulse noise with noise densities from 5 to 60% while preserving image details. The difference between the current central pixel and the median of its local neighborhood pixels is used to classify the central pixel as noisy or noise-free. The noise is attenuated by estimating the values of the noisy pixels with a switching-based median filter applied exclusively to those neighborhood pixels not labeled as noisy. The size of the filtering window is adaptive in nature, and it depends on the number of noise-free pixels in the current filtering window. Simulation results indicate that this filter is better able to preserve 2-D edge structures of the image and delivers better performance with less computational complexity as compared to other denoising algorithms existing in the literature.

Proceedings Article
04 Jun 2009
TL;DR: In this paper, an efficient non-linear cascade filter for the removal of high density salt and pepper noise in image and video is proposed, which consists of two stages to enhance the filtering.
Abstract: In this paper, an efficient non-linear cascade filter for the removal of high-density salt-and-pepper noise in images and video is proposed. The proposed method consists of two stages to enhance the filtering. The first stage is the Decision-based Median Filter (DMF), which is used to identify pixels likely to be contaminated by salt-and-pepper noise and replaces them by the median value. The second stage is the Unsymmetric Trimmed Filter, either the Mean Filter (UTMF) or the Midpoint Filter (UTMP), which is used to trim the noisy pixels in an unsymmetric manner and processes the remaining pixels. The basic idea is that, though the level of denoising in the first stage is lower at high noise densities, the second stage helps to increase the noise suppression. Hence, the proposed cascaded filter as a whole is very suitable for low, medium as well as high noise densities, even above 90%. The existing non-linear filters such as the Standard Median Filter (SMF), Adaptive Median Filter (AMF), Weighted Median Filter (WMF), and Recursive Weighted Median Filter (RWM) perform well only for low and medium noise densities. The recently proposed Decision Based Algorithm (DBA) shows better results up to 70% noise density, and at high noise densities the restored image quality is poor. The proposed algorithm shows better image and video quality in terms of visual appearance and quantitative measures.
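
A sketch of the cascade with the unsymmetric trimmed mean as the second stage (flagging only the exact extreme values as noise, and the fixed 3x3 windows, are simplifying assumptions):

```python
import numpy as np

def cascade_filter(img, lo=0.0, hi=1.0):
    """Two-stage cascade (sketch). Stage 1, decision-based median:
    pixels equal to the extremes lo/hi are assumed to be salt or pepper
    and replaced by their 3x3 median. Stage 2, unsymmetric trimmed
    mean: where the stage-1 median is itself an extreme value (as
    happens at high noise densities), the pixel is re-estimated from
    the mean of its non-extreme neighbours."""
    H, W = img.shape
    noisy = (img == lo) | (img == hi)
    out = img.astype(float).copy()
    # Stage 1: decision-based median filter on flagged pixels only
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            if noisy[i, j]:
                out[i, j] = np.median(img[i - 1:i + 2, j - 1:j + 2])
    # Stage 2: unsymmetric trimmed mean where the median was still extreme
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            if noisy[i, j] and (out[i, j] == lo or out[i, j] == hi):
                win = img[i - 1:i + 2, j - 1:j + 2].ravel()
                clean = win[(win != lo) & (win != hi)]
                if clean.size:
                    out[i, j] = clean.mean()
    return out
```

The second stage is what rescues the high-density case: even when most of the 3x3 window is corrupted, the trimmed mean uses whatever uncorrupted pixels remain.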

Journal ArticleDOI
TL;DR: A three-dimensional measuring technique based on digital image processing technology is proposed for surveying component surface roughness, and a three-dimensional surface roughness evaluation system consisting of both hardware and software architecture is established.
Abstract: With the development and application of optoelectronic technology, laser technology and computer technology in mind, we have developed a method to evaluate three-dimensional surface roughness using surface profile information. This article proposes a three-dimensional measuring technique, based on digital image processing technology, which is used to survey component surface roughness, and establishes a three-dimensional surface roughness evaluation system consisting of both hardware and software architecture. The hardware used in the present experiment is as follows: a stereomicroscope, a digital camera with a special interface, a parallel light source (a halogen lamp), an X-Y bidirectional laboratory bench and a computer. A computer-aided system (CAS), developed with Visual C++, is used in the image pretreatment and data processing analysis. In the experiment, image information gathered from the digital camera is pre-processed by median filtering, grayscale equalization and histogram conversion amplification. Then the data are analyzed by normalized cross-correlation and surface fitting techniques. Lastly, the correlation between m, σ, Sq, Sku and Ra is discussed for different surface roughness specimens.


Proceedings ArticleDOI
01 Dec 2009
TL;DR: An image noise filter based on cellular automata (CA), which can remove impulse noise from a noise corrupted image is presented and is compared with the classical median filter and different switching filters in terms of peak signal to noise ratio.
Abstract: This paper presents an image noise filter based on cellular automata (CA), which can remove impulse noise from a noise corrupted image. Uniform cellular automata rules are constructed to filter impulse noise from both binary and gray scale images. Several modifications to the standard CA formulation are then applied to improve the filtering performance. For example, a random CA rule solves the noise propagation present in deterministic CA filters. A mirrored CA is used to solve the fixed boundary problem. The performance of this CA approach is compared with the classical median filter and different switching filters in terms of peak signal to noise ratio. This comparison shows that a filter based on cellular automata provides significant improvements over the standard filtering methods.

Journal ArticleDOI
TL;DR: The proposed impulse noise detector is established based on the rank-order arrangement of the pixels in the sliding window, and overcomes a weakness of the original detector so that the switching median filter is much more effective in impulse noise removal.
Abstract: In this paper, the switching median filter is modified by adding one more noise detector to improve the capability of impulse noise removal. The proposed impulse noise detector is established based on the rank-order arrangement of the pixels in the sliding window. The original switching median filter cannot detect a noise pixel whose value is close to its neighbors if the threshold is designed to emphasize detail preservation. It is therefore hard to recognize a noise-like pixel in the sliding window as either a noise or a noise-free pixel. The proposed impulse noise detector overcomes this weakness, such that the switching median filter is much more effective in impulse noise removal.

Proceedings ArticleDOI
22 Jan 2009
TL;DR: The effects of two filters, a simple filter and a median filter, are compared; the median filter is chosen to effectively remove the disturbance of noise, and the two-apex method is applied to separate the disease images from the background.
Abstract: Taking cucumber powdery mildew, speckle and downy mildew as examples, the method of image pre-processing for recognizing crop diseases was studied. This paper compared the effects of two filters, a simple filter and a median filter, and finally chose the median filter to effectively remove the disturbance of noise; the two-apex method was then applied to separate the disease images from the background. Disease spots were segmented by performing image edge detection and by the Snake model, with the latter giving the more desirable result. Thus the image pre-processing laid a good foundation for subsequently extracting effective characteristic parameters for disease diagnosis and for setting up a pattern recognition system.

Journal ArticleDOI
TL;DR: A set of architectures for computing both the median and weighted median of large, flexibly sized windows through parallel cumulative histogram construction is presented.
Abstract: Most effort in designing median filters has focused on two-dimensional filters with small window sizes, used for image processing. However, recent work on novel image processing algorithms, such as the Trace transform, has highlighted the need for architectures that can compute the median and weighted median of large one-dimensional windows, to which the optimisations in the aforementioned architectures do not apply. A set of architectures for computing both the median and weighted median of large, flexibly sized windows through parallel cumulative histogram construction is presented. The architecture uses embedded memories to control the highly parallel bank of histogram nodes, and can implicitly determine window sizes for median and weighted median calculations. The architecture is shown to perform at 72 Msamples/s, and has been integrated within a Trace transform architecture.
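
The cumulative-histogram principle behind the architecture, in a sequential software sketch (the hardware updates the histogram bins in parallel; bin count and the lower-median convention are assumptions):

```python
import numpy as np

def histogram_median(samples, weights=None, bins=256):
    """Median via cumulative histogram: accumulate a (weighted)
    histogram of the window, take the cumulative sum, and scan for the
    bin where the running total crosses half the total weight. With
    uniform weights this is the ordinary median (the lower median for
    even counts); with non-uniform weights it is the weighted median.
    Assumes integer-valued samples in [0, bins)."""
    if weights is None:
        weights = np.ones(len(samples))
    hist = np.zeros(bins)
    for s, w in zip(samples, weights):
        hist[s] += w            # one histogram node per bin, updated in parallel in HW
    cum = np.cumsum(hist)
    half = cum[-1] / 2.0
    return int(np.searchsorted(cum, half))   # first bin where cum >= half
```

Because the window size only affects how many histogram updates are made, not the final scan, the same datapath serves flexibly sized windows, which is the property the architecture exploits.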

Journal ArticleDOI
TL;DR: In this paper, QR bar code and image processing techniques are used to construct a nested steganography scheme that can conceal lossless and lossy secret data into a cover image simultaneously and is robust to JPEG attacks.
Abstract: In this paper, QR bar code and image processing techniques are used to construct a nested steganography scheme. Two types of secret data, lossless and lossy, are embedded into a cover image. The lossless data is text that is first encoded by the QR barcode; its data does not have any distortion when the extracted data is compared with the original data. The lossy data is a kind of image; the face image is suitable for our case. Because the extracted text is lossless, the error correction rate of the QR encoding must be carefully designed. We found a 25% error correction rate is suitable for our goal. In image embedding, because it can sustain minor perceptible distortion, we adopted the lower-nibble byte discard of the face image to reduce the secret data. When the image is extracted, we use a median filter to filter out the noise and obtain a smoother image quality. After simulation, it is evident that our scheme is robust to JPEG attacks. Compared to other steganography schemes, our proposed method has three advantages: (i) the nested scheme is an enhanced security system never previously developed; (ii) our scheme can conceal lossless and lossy secret data into a cover image simultaneously; and (iii) the QR barcode used as secret data can widely extend this method's application fields. © 2009 Society of Photo-Optical Instrumentation Engineers.

Proceedings ArticleDOI
Yueli Hu1, Huijie Ji1
19 May 2009
TL;DR: This work describes the FPGA implementation of the complete structure of a general filter, including the filtering-window generating module and the row-column counting module.
Abstract: Image filtering plays an important role in image preprocessing. The median filters, including the standard median filter and the multi-level median filter, which can preserve image features and thin lines, are introduced and discussed in detail. After that, the FPGA-based solutions for the algorithms of these two filters are presented. This paper also describes the FPGA implementation of the complete structure of a general filter, including the filtering-window generating module and the row-column counting module. RTL-level simulation is performed in Modelsim to verify the functional correctness, and system-level simulation is performed in Matlab to compare the filtering effect.

Patent
Davis Y. Pan1
20 Mar 2009
TL;DR: In this article, an active noise reduction system that reduces the incidence of divergence in the presence of high amplitude interfering noise is proposed, where a limited frequency range threshold is established to limit the divergence.
Abstract: An active noise reduction system that reduces the incidence of divergence in the presence of high amplitude interfering noise. A limited frequency range threshold is established.

Journal ArticleDOI
TL;DR: A new filter to restore radiographic images corrupted by impulsive noise is proposed, based on a switching scheme where all the pulses are first detected and then corrected through a median filter, which is able to reliably estimate the sensor gain.
Abstract: A new filter to restore radiographic images corrupted by impulsive noise is proposed. It is based on a switching scheme where all the pulses are first detected and then corrected through a median filter. The pulse detector is based on the hypothesis that the major contribution to image noise is given by the photon counting process, with some pixels corrupted by impulsive noise. These statistics are described by an adequate mixture model. The filter is also able to reliably estimate the sensor gain. Its operation has been verified on both synthetic and real images; the experimental results demonstrate the superiority of the proposed approach in comparison with more traditional methods.

Proceedings ArticleDOI
15 Sep 2009
TL;DR: This paper presents a new architecture and circuit implementation of 1-D median filter that has linear hardware complexity, minimal latency and achieves throughput of 1/2 of the sampling rate.
Abstract: This paper presents a new architecture and circuit implementation of 1-D median filter. The proposed circuit belongs to the class of non-recursive sorting network architectures that process the input samples sequentially in the word-based manner. In comparison to the related schemes, it maintains sorting of samples from the previous position of the sliding window, positioning only the incoming sample to the correct rank. Unlike existing 1-D filter implementations, the circuit has linear hardware complexity, minimal latency and achieves throughput of 1/2 of the sampling rate. Experimental evaluation and comparisons show high efficiency of our design.
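
The rank-maintaining idea can be sketched in software with a sorted window and binary search (the hardware performs the equivalent insertion and deletion inside a sorting network rather than re-sorting each window):

```python
import bisect

def running_median(samples, window=5):
    """Word-serial sliding-window median in the spirit of the paper's
    architecture: the window contents stay sorted between positions, so
    each step only deletes the outgoing sample and inserts the incoming
    one at its correct rank, instead of sorting from scratch. Emits one
    median per full window position (odd window assumed)."""
    sorted_win = sorted(samples[:window])
    out = [sorted_win[window // 2]]
    for k in range(window, len(samples)):
        # delete the sample leaving the window, insert the new one at its rank
        sorted_win.pop(bisect.bisect_left(sorted_win, samples[k - window]))
        bisect.insort(sorted_win, samples[k])
        out.append(sorted_win[window // 2])
    return out
```

Each step costs one rank search plus one insertion, mirroring the constant per-sample work that lets the circuit sustain a throughput tied to the sampling rate.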

Journal ArticleDOI
TL;DR: This paper describes the use of a graphics processing unit for implementing signal-processing algorithms for color Doppler ultrasound that achieves a frame rate of 160 fps for frames comprising 500 scan lines × 128 range samples.
Abstract: Color Doppler ultrasound is a routinely used diagnostic tool for assessing blood flow information in real time. The required signal processing is computationally intensive, involving autocorrelation, linear filtering, median filtering, and thresholding. Because of the large amount of data and high computational requirement, color Doppler signal processing has been mainly implemented on custom-designed hardware, with software-based implementation - particularly on a general-purpose CPU - not being successful. In this paper, we describe the use of a graphics processing unit for implementing signal-processing algorithms for color Doppler ultrasound that achieves a frame rate of 160 fps for frames comprising 500 scan lines × 128 range samples, with each scan line being obtained from an ensemble size of 8 with an 8-tap FIR clutter filter.