
Showing papers on "Median filter published in 2003"


Journal ArticleDOI
TL;DR: A robust wavelet domain method for noise filtering in medical images that adapts itself to various types of image noise as well as to the preference of the medical expert; a single parameter can be used to balance the preservation of (expert-dependent) relevant details against the degree of noise reduction.
Abstract: We propose a robust wavelet domain method for noise filtering in medical images. The proposed method adapts itself to various types of image noise as well as to the preference of the medical expert; a single parameter can be used to balance the preservation of (expert-dependent) relevant details against the degree of noise reduction. The algorithm exploits generally valid knowledge about the correlation of significant image features across the resolution scales to perform a preliminary coefficient classification. This preliminary coefficient classification is used to empirically estimate the statistical distributions of the coefficients that represent useful image features on the one hand and mainly noise on the other. The adaptation to the spatial context in the image is achieved by using a wavelet domain indicator of the local spatial activity. The proposed method is of low complexity, both in its implementation and execution time. The results demonstrate its usefulness for noise suppression in medical ultrasound and magnetic resonance imaging. In these applications, the proposed method clearly outperforms single-resolution spatially adaptive algorithms, in terms of quantitative performance measures as well as in terms of visual quality of the images.

540 citations
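The cross-scale correlation idea (significant image features persist across resolution scales, while noise does not) can be sketched in a toy 1-D Haar setting; the signal, threshold, and function names below are illustrative and not taken from the paper:

```python
import math

def haar_level(x):
    """One level of the Haar wavelet transform: (approximation, detail)."""
    a = [(x[2 * k] + x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    d = [(x[2 * k] - x[2 * k + 1]) / math.sqrt(2) for k in range(len(x) // 2)]
    return a, d

def classify_coeffs(x, t=1.0):
    """Flag a level-1 detail coefficient as 'feature' when the product of its
    magnitude with the parent level-2 coefficient exceeds a threshold: a toy
    version of cross-scale correlation classification."""
    a1, d1 = haar_level(x)
    _, d2 = haar_level(a1)
    return [abs(d1[k]) * abs(d2[k // 2]) > t for k in range(len(d1))]
```

A spike in the signal produces large detail coefficients at both scales at the same location, so only that position is flagged; isolated noise at a single scale is not.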


Journal ArticleDOI
TL;DR: A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise, based on fuzzy rules which make use of membership functions.
Abstract: A new fuzzy filter is presented for the noise reduction of images corrupted with additive noise. The filter consists of two stages. The first stage computes a fuzzy derivative for eight different directions. The second stage uses these fuzzy derivatives to perform fuzzy smoothing by weighting the contributions of neighboring pixel values. Both stages are based on fuzzy rules which make use of membership functions. The filter can be applied iteratively to effectively reduce heavy noise. In particular, the shape of the membership functions is adapted according to the remaining noise level after each iteration, making use of the distribution of the homogeneity in the image. A statistical model for the noise distribution can be incorporated to relate the homogeneity to the adaptation scheme of the membership functions. Experimental results are obtained to show the feasibility of the proposed approach. These results are also compared to other filters by numerical measures and visual inspection.

314 citations


Journal ArticleDOI
TL;DR: A decision-based, signal-adaptive median filtering algorithm for removal of impulse noise, which achieves accurate noise detection and high SNR measures without smearing the fine details and edges in the image.
Abstract: We propose a decision-based, signal-adaptive median filtering algorithm for removal of impulse noise. Our algorithm achieves accurate noise detection and high SNR measures without smearing the fine details and edges in the image. The notion of homogeneity level is defined for pixel values based on their global and local statistical properties. The cooccurrence matrix technique is used to represent the correlations between a pixel and its neighbors, and to derive the upper and lower bound of the homogeneity level. Noise detection is performed at two stages: noise candidates are first selected using the homogeneity level, and then a refining process follows to eliminate false detections. The noise detection scheme does not use a quantitative decision measure, but uses qualitative structural information, and it is not subject to burdensome computations for optimization of the threshold values. Empirical results indicate that our scheme performs significantly better than other median filters, in terms of noise suppression and detail preservation.

290 citations
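A minimal sketch of the two-stage detect-then-correct pattern described above, assuming a simple deviation-from-median detector in place of the paper's homogeneity-level and cooccurrence-matrix machinery (the threshold and function name are hypothetical):

```python
from statistics import median

def decision_based_median(img, threshold=50):
    """Replace a pixel with its 3x3 median only when it is flagged as an
    impulse (far from the local median); clean pixels pass through, which
    is what preserves fine details and edges."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            med = median(window)
            if abs(img[y][x] - med) > threshold:   # stage 1: detection
                out[y][x] = med                    # stage 2: correction
    return out
```

Unlike a plain median filter, this leaves unflagged pixels untouched instead of smoothing everything.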


Journal ArticleDOI
Sarp Erturk1
TL;DR: This paper presents digital image stabilization with sub-image phase correlation based global motion estimation and Kalman filtering based motion correction; the accumulated global displacement vectors are Kalman filtered for stabilization.
Abstract: This paper presents digital image stabilization with sub-image phase correlation based global motion estimation and Kalman filtering based motion correction. Global motion is estimated from the local motions of four sub-images each of which is detected using phase correlation based motion estimation. The global motion vector is decided according to the peak values of sub-image phase correlation surfaces, instead of impartial median filtering. The peak values of sub-image phase correlation surfaces reveal reliable local motion vectors, as poorly matched sub images result in considerably lower peaks in the phase correlation surface due to spread. The utilization of sub-images enables fast implementation of phase correlation based motion estimation. The global motion vectors of image frames are accumulated to obtain global displacement vectors, that are Kalman filtered for stabilization.

235 citations
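The core of phase-correlation motion estimation (the peak of the inverse transform of the normalized cross-power spectrum gives the displacement) can be illustrated in 1-D; this toy version uses an O(n²) DFT and integer circular shifts, whereas the paper works on 2-D sub-images:

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * i * k / n) for k in range(n))
            for i in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * i * k / n) for k in range(n)) / n
            for i in range(n)]

def phase_correlation_shift(a, b):
    """Estimate the circular shift of b relative to a from the peak of the
    normalized cross-power spectrum (the phase correlation surface)."""
    A, B = dft(a), dft(b)
    cross = []
    for ai, bi in zip(A, B):
        p = bi * ai.conjugate()
        m = abs(p)
        cross.append(p / m if m > 1e-12 else 0j)
    surface = [c.real for c in idft(cross)]
    return max(range(len(surface)), key=surface.__getitem__)
```

A well-matched pair gives a sharp peak close to 1; poorly matched sub-images spread the energy and lower the peak, which is exactly the reliability cue the paper uses to weight local motion vectors.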


Journal ArticleDOI
TL;DR: A new adaptive vector median filtering scheme is provided that takes advantage of the optimal filtering situation and robust order-statistic theory; it is based on the set of vector-valued order statistics with the smallest distances to the other samples in the input set.

228 citations


Journal ArticleDOI
15 Oct 2003
TL;DR: In this article, a novel speckle reduction method is introduced, based on soft thresholding the wavelet coefficients of the logarithmically transformed medical ultrasound image and on generalized Gaussian distributed (GGD) modeling of subband coefficients.
Abstract: The paper introduces a novel speckle reduction method based on soft thresholding the wavelet coefficients of the logarithmically transformed medical ultrasound image. The method is based on the generalized Gaussian distributed (GGD) modeling of subband coefficients. The proposed method is a variant of the recently published BayesShrink method (Chang, G et al., IEEE Trans. Image Processing, vol.9, no.9, p.1522-31, 2000) derived in the Bayesian framework for denoising natural images. It is scale adaptive because the parameters required for estimating the threshold depend on scale and subband data. The threshold is computed by Kσ²/σ_x, where σ and σ_x are the standard deviation of the noise and of the subband data of the noise-free image, respectively, and K is a scale parameter. Experimental results show that the proposed method performs better than the median filter as well as the homomorphic Wiener filter, especially in terms of feature preservation for better diagnosis as desired in medical image processing.

218 citations
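A sketch of the scale-adaptive soft-thresholding rule: estimate σ_x from the subband under the model var(observed) = σ_x² + σ², set T = Kσ²/σ_x, and shrink each coefficient toward zero. The estimator details here are simplified assumptions, not the paper's exact procedure:

```python
import math

def soft_threshold(coeffs, sigma_noise, K=1.0):
    """Soft-threshold one subband with T = K * sigma^2 / sigma_x, where
    sigma_x is estimated from the observed subband variance minus the
    noise variance (floored to avoid a negative estimate)."""
    n = len(coeffs)
    var_obs = sum(c * c for c in coeffs) / n
    sigma_x = math.sqrt(max(var_obs - sigma_noise ** 2, 1e-12))
    T = K * sigma_noise ** 2 / sigma_x
    return [math.copysign(max(abs(c) - T, 0.0), c) for c in coeffs]
```

Small coefficients (mostly noise) are zeroed, while large ones (features) are only shifted by T, so a subband dominated by signal gets a small threshold and is barely touched.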


Journal ArticleDOI
TL;DR: In this paper, a novel regularization technique which can combine signals from all Global Positioning System (GPS) satellites for a given instant and a given receiver is developed to estimate the vertical total electron content (VTEC) values for the 24-hour period without missing any important features in the temporal domain.
Abstract: A novel regularization technique which can combine signals from all Global Positioning System (GPS) satellites for a given instant and a given receiver is developed to estimate the vertical total electron content (VTEC) values for the 24-hour period without missing any important features in the temporal domain. The algorithm is based on the minimization of a cost function which also includes a high pass penalty filter. An optional weighting function and a sliding-window median filter are added to enrich the processing and smoothing of the data. The developed regularized estimation algorithm is applied to GPS data for various locations for the solar maximum week of 23–28 April 2001. The parameter set that is required by the estimation algorithm is chosen optimally using appropriate error functions. This robust and optimum parameter set can be used for all latitudes and for both quiet and disturbed days. It is observed that the estimated TEC values are in general accordance with the TEC estimates from other global ionospheric maps, especially for quiet days and midlatitudes. Owing to its 30 s time resolution, the regularized VTEC estimates from the developed algorithm are very successful in representation and tracking of sudden temporal variations of the ionosphere, especially for high latitudes and during ionospheric disturbances.

160 citations


Book
01 Jan 2003
TL;DR: This book covers the fundamentals of digital signal processing, from Fourier analysis and filter design and implementation through multivariate and adaptive signal processing to image processing fundamentals, compression, and coding.
Abstract: Chapter 1. Fundamental Concepts. Chapter 2. Fourier Analysis. Chapter 3. Z-Transform and Digital Filters. Chapter 4. Filter Design and Implementation. Chapter 5. Multivariate Signal Processing. Chapter 6. Finite-Wordlength Effects. Chapter 7. Adaptive Signal Processing. Chapter 8. Least-Squares Adaptive Algorithms. Chapter 9. Linear Prediction. Chapter 10. Image Processing Fundamentals. Chapter 11. Image Compression and Coding. Appendix. Concepts of Linear Algebra. Index.

156 citations


Journal ArticleDOI
TL;DR: Methods and issues involved in the compression of CFA data before full color interpolation are discussed; the compression methods described operate on the same number of pixels as the sensor data.
Abstract: Many consumer digital color cameras use a single light sensitive sensor and a color filter array (CFA) with each pixel element recording intensity information of one color component. The captured data is interpolated into a full color image, which is then compressed in many applications. Carrying out color interpolation before compression introduces redundancy in the data. In this paper we discuss methods and issues involved in the compression of CFA data before full color interpolation. The compression methods described operate on the same number of pixels as the sensor data. To obtain improved image quality, median filtering is applied as post-processing. Furthermore, to assure low complexity, the CFA data is compressed by JPEG. Simulations have demonstrated that substantial improvement in image quality is achievable using these new schemes.

154 citations


Journal ArticleDOI
TL;DR: Results on processed images show that the proposed method reduces speckle noise and preserves edge details effectively; it is compared with two other methods, the adaptive weighted median filter and the homogeneous region growing mean filter.

128 citations


Journal ArticleDOI
TL;DR: A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated and it is indicated that the new filter outperforms the VMF, as well as other techniques currently used to eliminate impulsive noise in color images.
Abstract: In this paper, we address the problem of impulsive noise reduction in multichannel images. A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated. The computational complexity of the new filter is lower than that of the vector median filter (VMF). Extensive simulation experiments indicate that the new filter outperforms the VMF, as well as other techniques currently used to eliminate impulsive noise in color images.
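For reference, the vector median filter (VMF) that the new filter is benchmarked against selects the sample in the window with the smallest total distance to all other samples; a direct (unoptimized) sketch for RGB triples:

```python
def vector_median(window):
    """Vector median: return the sample in the window that minimizes the
    summed Euclidean (L2) distance to all other samples. Treating pixels
    as vectors avoids the color artifacts of channel-wise medians."""
    def total_dist(v):
        return sum(sum((a - b) ** 2 for a, b in zip(v, u)) ** 0.5
                   for u in window)
    return min(window, key=total_dist)
```

An impulsive (color) outlier has a large total distance to the rest of the window, so it can never be selected as the output, which is why the VMF is robust to impulse noise. This naive version is O(n²) distance computations per window, which is the complexity the paper's new filter improves upon.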

Journal ArticleDOI
TL;DR: The findings indicate that the developed modification routines provide a good means of simulating the resolution and noise characteristics of digital radiographic systems for optimization or processing purposes.
Abstract: A new computer simulation approach is presented that is capable of modeling several varieties of digital radiographic systems by their image quality characteristics. In this approach, the resolution and noise characteristics of ideal supersampled input images are modified according to input modulation transfer functions (MTFs) and noise power spectra (NPS). The modification process is separated into two routines: one for modification of the resolution and another for modification of the noise characteristics of the input image. The resolution modification routine blurs the input image by applying a frequency filter described by the input MTF. The resulting blurred image is then reduced to its final size to account for the sampling process of the digital system. The noise modification routine creates colored noise by filtering the frequency components of a white noise spectrum according to the input noise power. This noise is then applied to the image by a moving region of interest to account for variations in noise due to differences in attenuation. In order to evaluate the efficacy of the modification routines, additional routines were developed to assess the resolution and noise of digital images. The MTFs measured from the output images of the resolution modification routine were within 3% of the input MTF. The NPS measured from the output images of the noise modification routine were within 2% of the input NPS. The findings indicate that the developed modification routines provide a good means of simulating the resolution and noise characteristics of digital radiographic systems for optimization or processing purposes.

Journal ArticleDOI
TL;DR: The L-estimation based signal transforms and time-frequency representations are introduced by considering the corresponding minimization problems in the Huber (1981, 1998) estimation theory to produce robust estimates of the non-noisy signal transforms.
Abstract: The L-estimation based signal transforms and time-frequency (TF) representations are introduced by considering the corresponding minimization problems in the Huber (1981, 1998) estimation theory. The standard signal transforms follow as the maximum likelihood solutions for the Gaussian additive noise environment. For signals corrupted by an impulse noise, the median-based transforms produce robust estimates of the non-noisy signal transforms. When the input noise is a mixture of Gaussian and impulse noise, the L-estimation-based signal transforms can outperform other estimates. In quadratic and higher order TF analysis, the resulting noise is inherently a mixture of the Gaussian input noise and an impulse noise component. In this case, the L-estimation-based signal representations can produce the best results. These transforms and TF representations give the standard and the median-based forms as special cases. A procedure for parameter selection in the L-estimation is proposed. The theory is illustrated and checked numerically.

Journal ArticleDOI
TL;DR: D dose resolution can be significantly improved in CT PAG dosimetry through postprocessing of CT images using spatial noise reduction filters, but such filters are not equal in their ability to improve dose resolution or to maintain the spatial integrity of the dose distribution and an appropriate filter must be chosen depending on clinical demands of the application.
Abstract: X-ray computed tomography (CT) has been established as a feasible method of performing dosimetry using polyacrylamide gels (PAGs). A small density change occurs in PAG upon irradiation that provides contrast in PAG CT images. However, low dose resolution limits the clinical usefulness of the technique. This work investigates the potential of using image filtering techniques on PAG CT images in order to reduce image noise and improve dose resolution. CT image noise for the scanner and protocol used for the gel images is analyzed and found to be Gaussian distributed and independent of the contrast level in the images. As a result, several filters for reducing spatially invariant noise are investigated: mean, median, midpoint, adaptive mean, alpha-trimmed mean, sigma mean, and a relatively new filter called SUSAN (smallest univalue segment assimilating nucleus). All filters are applied, using 3×3, 5×5, and 7×7 pixel masks, to a CT image of a PAG irradiated with a stereotactic radiosurgery dose distribution. The dose resolution within 95% confidence (DΔ95%) is calculated and compared for each filtered image, as well as for the unfiltered image. In addition, the ability of the filters to maintain the spatial integrity of the dose distribution is evaluated and compared. Results clearly indicate that the filters are not equal in their ability to improve DΔ95% or in their effect on the spatial integrity of the dose distribution. In general, increasing mask size improves DΔ95% but simultaneously degrades spatial dose information. The mean filter provides the greatest improvement in DΔ95%, but also the greatest loss of spatial dose information. The SUSAN, adaptive mean, and alpha-trimmed mean filters all provide comparable, but slightly poorer dose resolution. In addition, the SUSAN and adaptive filters both excel at maintaining the spatial distribution of dose and overall are the best performing filters for this application. The midpoint filter, normally useful for Gaussian noise, is poor all-round, dramatically distorting the dose distribution for masks greater than 3×3. The median filter, a common edge preserving noise reduction filter, performs moderately well, but artificially increases high dose gradients. The sigma filter preserves the spatial distribution of dose very well but is least effective at improving dose resolution. In summary, dose resolution can be significantly improved in CT PAG dosimetry through postprocessing of CT images using spatial noise reduction filters. However, such filters are not equal in their ability to improve dose resolution or to maintain the spatial integrity of the dose distribution, and an appropriate filter must be chosen depending on the clinical demands of the application.

Journal ArticleDOI
TL;DR: A new median filter termed the iterative center weighted median filter (ICWMF) in the wavelet coefficient domain is proposed for image denoising, which iteratively smoothes the noisy wavelet coefficients' variances while preserving the edge information contained in the large-magnitude wavelet coefficients.

Journal Article
TL;DR: In this paper, a novel method of noise reduction in color images is presented, which is capable of attenuating both impulsive and Gaussian noise, while preserving and even enhancing the sharpness of the image edges.
Abstract: In this paper a novel method of noise reduction in color images is presented. The new technique is capable of attenuating both impulsive and Gaussian noise, while preserving and even enhancing the sharpness of the image edges. Extensive simulations reveal that the new method significantly outperforms the standard techniques widely used in multivariate signal processing. In this work we apply the new noise reduction method for the enhancement of the images of the so-called gene chips. We demonstrate that the new technique is capable of reducing the impulsive noise present in microarray images and that it facilitates efficient spot location and the estimation of the gene expression levels due to the smoothing effect and preservation of the spot edges. This paper contains a comparison of the new technique of impulsive noise reduction with the standard procedures used for the processing of vector valued images, as well as examples of the efficiency of the new algorithm when applied to typical microarray images.

Book ChapterDOI
10 Sep 2003
TL;DR: In this paper, a median filter for tensor-valued data is proposed, which inherits a number of favorable properties from scalar-valued median filtering, and is applied to diffusion tensor magnetic resonance imaging.
Abstract: Novel matrix-valued imaging techniques such as diffusion tensor magnetic resonance imaging require the development of edge-preserving nonlinear filters. In this paper we introduce a median filter for such tensor-valued data. We show that it inherits a number of favourable properties from scalar-valued median filtering, and we present experiments on synthetic as well as on real-world images that illustrate its performance.
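The same order-statistic idea carries over to matrix-valued data by replacing the scalar ordering with Frobenius distances; the sketch below restricts the minimizer to the window samples, which is a simplification (the paper's matrix median minimizes over all matrices, not just the samples):

```python
def frob_dist(A, B):
    """Frobenius distance between two matrices given as lists of rows."""
    return sum((a - b) ** 2
               for ra, rb in zip(A, B)
               for a, b in zip(ra, rb)) ** 0.5

def tensor_median(window):
    """Sample-restricted median of tensor-valued data: pick the tensor in
    the window minimizing the summed Frobenius distance to all others."""
    return min(window, key=lambda T: sum(frob_dist(T, U) for U in window))
```

As in the scalar case, a single outlying tensor (for example a corrupted diffusion tensor) accumulates a large total distance and is rejected, while the edge structure carried by the majority is preserved.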

Journal ArticleDOI
TL;DR: Ground-breaking multi-scale frequency processing algorithms, based on the subdivision of an image in multiple frequency bands according to its structural composition, allow for a wide range of image manipulations including a size-independent enhancement of low-contrast structures.
Abstract: Summary: Image processing has a major impact on image quality and diagnostic performance of digital chest radiographs. Goals of processing are to reduce the dynamic range of the image data to capture the full range of attenuation differences between lungs and mediastinum, to improve the modulation tr

Journal ArticleDOI
TL;DR: A novel vector filtering technique for color image restoration that incorporates a new fuzzy inference system for noise detection that is combined with a switching scheme to select between an identity filter output and the output from a proposed L-filter design.
Abstract: We present a novel vector filtering technique for color image restoration that incorporates a new fuzzy inference system for noise detection. This is combined with a switching scheme to select between an identity filter output and the output from a proposed L-filter design. The proposed L-filter is designed to exploit the ordering techniques of the vector median filters, and thus it requires only a set of two coefficients. These coefficients are trained using a constrained least-mean squares approach, which is capable of converging to the optimum set within a short period of time. The new algorithm treats the intensity and color of each pixel individually until the final output is to be calculated, thus, the optimal magnitude and direction of the pixel vectors are used.

Book ChapterDOI
01 Jan 2003
TL;DR: Simulation results indicate that these seven fuzzy filters achieve varying degrees of success in noise reduction in images as compared to the standard MED and MAV filters.

Abstract: In this chapter, seven fuzzy filters for noise reduction in images are introduced. These seven fuzzy filters include the Gaussian fuzzy filter with median center (GMED), the symmetrical triangular fuzzy filter with median center (TMED), the asymmetrical triangular fuzzy filter with median center (ATMED), the Gaussian fuzzy filter with moving average center (GMAV), the symmetrical triangular fuzzy filter with moving average center (TMAV), the asymmetrical triangular fuzzy filter with moving average center (ATMAV), and the decreasing weight fuzzy filter with moving average center (DWMAV). Each of these fuzzy filters applies a weighted membership function to an image within a window to determine the center pixel value, and each is easy and fast to implement. Simulation results on the filtering performance of these seven fuzzy filters and the standard median filter (MED) and moving average filter (MAV) on images contaminated with low, medium, and high levels of impulse and random noise are presented. Results indicate that these seven fuzzy filters achieve varying degrees of success in noise reduction in images as compared to the standard MED and MAV filters.
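One of the simplest of these, the symmetrical triangular fuzzy filter with median center (TMED), can be sketched as follows; the spread estimate used here is an illustrative choice, not necessarily the chapter's exact formulation:

```python
from statistics import median

def tmed_filter(window):
    """TMED sketch: weight each pixel by a symmetric triangular membership
    function centered at the window median, then output the weighted
    average. Pixels far from the median get weight near zero."""
    med = median(window)
    spread = max(abs(v - med) for v in window) or 1.0  # avoid divide-by-zero
    weights = [1.0 - abs(v - med) / spread for v in window]
    s = sum(weights) or 1.0                            # guard degenerate case
    return sum(w * v for w, v in zip(weights, window)) / s
```

An impulse at the edge of the window's value range receives weight 0 and is excluded from the average, while near-median pixels dominate the output; swapping the triangle's center to the moving average gives the TMAV variant.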

Book ChapterDOI
TL;DR: The results show that, in terms of both PSNR and visual quality, the proposed filter outperforms the other state-of-the-art filters for different image sequences.

Abstract: This paper presents a non-linear technique for noise reduction in video that is suitable for real-time processing. The proposed algorithm automatically adapts to detected levels of detail and motion, but also to the noise level, provided it is short-tail noise, such as Gaussian noise. It uses a one-level wavelet decomposition, and performs independent processing in four different bands in the wavelet domain. The non-decimated transform is used because it leads to better results for image/video denoising than the decimated transform. The results show that, in terms of both PSNR and visual quality, the proposed filter outperforms the other state-of-the-art filters for different image sequences.

Patent
Yoshihiro Nakami1
17 Dec 2003
TL;DR: In this article, a median filter with an angle matching the determined edge angle is selected, and the selected median filter is used to carry out the smoothing process on target pixels TP which are edge-forming pixels.
Abstract: An image processing apparatus determines whether or not a target pixel TP which is the object of the smoothing process is an edge-forming pixel based on the edge level (gradient g). If the pixel is determined to be an edge-forming pixel, the smoothing process employs an elliptical median filter instead of a moving average filter. The median filter MF has an elliptical reference area RA that is tilted (angled) to match the orientation of the edge Eg (edge angle), allowing the smoothing process to be carried out without compromising the edge components. The angle of the edge Eg is thus determined in the smoothing process, a median filter MF with an angle matching the determined edge angle is selected, and the selected median filter MF is used to carry out the smoothing process on target pixels TP which are edge-forming pixels.
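The elliptical, edge-aligned reference area can be sketched by rotating each window offset into the ellipse's frame before the inclusion test; the axis lengths and function name below are illustrative, not taken from the patent:

```python
import math
from statistics import median

def oriented_median(img, y, x, angle, a=3.0, b=1.0):
    """Median over an elliptical neighborhood whose major axis (length a)
    is rotated to `angle` (radians), so smoothing runs along an edge
    rather than across it. Minor axis length is b."""
    h, w = len(img), len(img[0])
    c, s = math.cos(angle), math.sin(angle)
    vals = []
    r = int(math.ceil(a))
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            u = dx * c + dy * s        # offset along the major axis
            v = -dx * s + dy * c       # offset along the minor axis
            if (u / a) ** 2 + (v / b) ** 2 <= 1.0:
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w:
                    vals.append(img[yy][xx])
    return median(vals)
```

With the major axis aligned to a horizontal edge (angle 0), the mask gathers samples mostly from the pixel's own side of the edge, so the edge is not smeared.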

Journal ArticleDOI
TL;DR: It is shown that, compared to existing techniques, the filters introduced here are better able to suppress impulsive, Gaussian as well as mixed-type noise.

Proceedings ArticleDOI
16 May 2003
TL;DR: A novel method for intra-frame image processing is presented that is applicable to a wide variety of medical imaging modalities, such as X-ray angiography, X-ray fluoroscopy, magnetic resonance, or ultrasound, and that allows a real-time implementation on standard hardware.

Abstract: We present a novel method for intra-frame image processing, which is applicable to a wide variety of medical imaging modalities, like X-ray angiography, X-ray fluoroscopy, magnetic resonance, or ultrasound. The method allows noise to be reduced significantly (by about 4.5 dB and more) while preserving sharp image details. Moreover, selective amplification of image details is possible. The algorithm is based on a multi-resolution approach. Noise reduction is achieved by non-linear adaptive filtering of the individual band pass layers of the multi-resolution pyramid. The adaptivity is controlled by image gradients calculated from the next coarser layer of the multi-resolution pyramid representation, thus exploiting cross-scale dependencies. At sites with strong gradients, filtering is performed only perpendicular to the gradient, i.e. along edges or lines. The multi-resolution approach processes each detail on its appropriate scale so that also for low frequency noise small filter kernels are applied, thus limiting computational costs and allowing a real-time implementation on standard hardware. In addition, gradient norms are used to distinguish smoothly between “structure” and “noise only” areas, and to perform additional noise reduction and edge enhancement by selectively attenuating or amplifying the corresponding band pass coefficients.

Journal ArticleDOI
TL;DR: In this paper, a frequency-domain least-mean-square adaptive filter is used to cancel noise in a wheel speed sensor embedded in a car under performance tests, where the relevant signal is buried in a broadband noise background, where we have little or no prior knowledge of the signal or noise characteristics.
Abstract: In this paper, a frequency-domain least-mean-square adaptive filter is used to cancel noise in a wheel speed sensor embedded in a car under performance tests. In this case the relevant signal is buried in a broad-band noise background, where we have little or no prior knowledge of the signal or noise characteristics. The results of the experiments show that the signal of interest and the noise (all forms of interference, deterministic, as well as stochastic) share the same frequency band and that the filter used significantly reduced the noise corrupting the information from the sensor while it left the true signal unchanged from a practical point of view. In this paper, a signal-to-noise ratio improvement higher than 40 dB is achieved. The results of the experiment show the importance of using digital signal processing when dealing with a signal corrupted by noise.
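The paper uses a frequency-domain LMS filter; the adaptation rule (w += μ·e·x) is easiest to see in a minimal time-domain LMS noise canceller, shown here under the assumption that a noise-correlated reference input is available (signal, step size, and tap count are illustrative):

```python
def lms_cancel(reference, noisy, mu=0.05, taps=4):
    """Time-domain LMS noise canceller sketch. `reference` is an input
    correlated with the noise; `noisy` is signal plus noise. The filter
    learns to predict the noise from the reference, and the prediction
    error e is the cleaned output sample."""
    w = [0.0] * taps
    out = []
    for n in range(len(noisy)):
        x = [reference[n - i] if n - i >= 0 else 0.0 for i in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))   # noise estimate
        e = noisy[n] - y                           # error = cleaned sample
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
        out.append(e)
    return out
```

The frequency-domain variant in the paper performs the same adaptation per frequency bin using block FFTs, which lowers the cost for long filters.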


Book ChapterDOI
01 Jan 2003
TL;DR: A new filter structure for the reduction of mixed noise in images is proposed, based on the evaluation of fuzzy similarities between pixels in a local processing window and is suitable for high-speed hardware implementation.
Abstract: We propose a new filter structure for the reduction of mixed noise in images. It is based on the evaluation of fuzzy similarities between pixels in a local processing window and is suitable for high-speed hardware implementation. The filter involves two tunable parameters and is fairly robust against changes in noise distribution. Furthermore, we outline a modular hardware architecture for general high-speed image processing tasks.

Journal ArticleDOI
TL;DR: Simulation results show that the proposed DCT-based detection algorithms accurately classify smooth, edge, or nonsmooth blocks, helping the adaptive postprocessor to effectively remove the blocky effect while sharply preserving the edge information.

Abstract: In this paper, an adaptive postprocessor figured with discrete cosine transform (DCT)-based block classification to effectively remove the so-called blocky effect from compressed video sequences is proposed. The proposed DCT-based detection algorithms for both intraframes and interframes require much lower computation complexity than the spatial-domain approaches. In order to preserve the edge information, the adaptive postprocessor is also designed with a DCT-based edge detection mechanism such that a one-dimensional median filter can be adaptively adjusted to match with the edge orientation. Simulation results show that the proposed DCT-based detection algorithms accurately classify smooth, edge, or nonsmooth blocks, helping the adaptive postprocessor to effectively remove the blocky effect and sharply preserve the edge information.

Proceedings ArticleDOI
14 Dec 2003
TL;DR: The results show that applying the median filter iteratively yields restored images of good visual quality, and that, despite its implementation simplicity, the proposed modified median filtering scheme provides a considerably higher convergence speed, implying lower numerical complexity.

Abstract: Iterative median filtering for restoration of images corrupted by impulsive noise is considered. A modified version of median filtering that can also be applied iteratively is proposed. The methods are compared with an iterative method suggested by Marvasti for images with high pixel loss. The results show that applying the median filter iteratively yields restored images of good visual quality. Moreover, despite its implementation simplicity, the proposed modified median filtering scheme provides a considerably higher convergence speed, which implies lower numerical complexity.
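Plain iterative median filtering, the baseline the modified scheme is compared against, can be sketched as repeated 3×3 median passes run to a fixed point or an iteration budget (the stopping rule here is an assumption):

```python
from statistics import median

def median_pass(img):
    """One 3x3 median pass over the interior pixels; borders pass through."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = median(img[y + dy][x + dx]
                               for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def iterative_median(img, max_iters=10):
    """Apply the 3x3 median repeatedly until the image stops changing
    (a fixed point, known as a root signal) or the budget is exhausted."""
    for _ in range(max_iters):
        nxt = median_pass(img)
        if nxt == img:
            break
        img = nxt
    return img
```

Each pass removes the remaining isolated impulses; the number of passes needed to reach the fixed point is exactly the convergence speed the proposed modification improves.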

Journal ArticleDOI
TL;DR: A new adaptive filtering algorithm based upon QR-decomposition for optimal multichannel filtering with an "unknown" desired signal, as well as its application to multi-channel acoustic noise reduction.