scispace - formally typeset

Showing papers on "Median filter published in 2007"


Journal ArticleDOI
TL;DR: A new decision-based algorithm is proposed for the restoration of images highly corrupted by impulse noise; it removes noise effectively even at noise densities as high as 90% and preserves edges without any loss up to an 80% noise level.
Abstract: A new decision-based algorithm is proposed for the restoration of images that are highly corrupted by impulse noise. The new algorithm shows significantly better image quality than the standard median filter (SMF), adaptive median filter (AMF), threshold decomposition filter (TDF), and cascade and recursive nonlinear filters. Unlike other nonlinear filters, the proposed method replaces only corrupted pixels, using either the median value or a neighboring pixel value. As a result, it removes noise effectively even at noise densities as high as 90% and preserves edges without any loss up to an 80% noise level. The proposed algorithm (PA) is tested on different images and is found to produce better results in terms of both qualitative and quantitative measures of image quality.

679 citations
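The decision step described above — detect an impulse, then replace only that pixel — can be sketched as below for fixed-valued (salt-and-pepper) noise. The 3×3 window, the extreme-value impulse test, and the left-neighbor fallback are illustrative assumptions, not the paper's exact rules:

```python
# Minimal sketch of a decision-based salt-and-pepper filter: only pixels at
# the extreme values (0 or 255) are treated as corrupted; each is replaced by
# the median of the uncorrupted neighbors in a 3x3 window, falling back to a
# neighboring pixel value when every neighbor is corrupted.
import numpy as np

def decision_based_median(img):
    out = img.copy().astype(np.int32)
    padded = np.pad(out, 1, mode='edge')   # padded[i+1, j+1] == img[i, j]
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            if img[i, j] in (0, 255):      # impulse detected
                win = padded[i:i+3, j:j+3].ravel()
                good = win[(win != 0) & (win != 255)]
                if good.size:              # median of the clean neighbors
                    out[i, j] = int(np.median(good))
                else:                      # all neighbors corrupted:
                    out[i, j] = int(padded[i+1, j])  # use the left neighbor
    return out.astype(img.dtype)

# toy example: a flat patch with two impulses
img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255
img[1, 3] = 0
restored = decision_based_median(img)
```

Because unflagged pixels are copied through unchanged, flat regions and edges are untouched, which is the property the abstract attributes to the decision-based approach.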


Journal ArticleDOI
TL;DR: Extensive simulations show that the proposed filter not only suppresses impulses better at high noise levels but also preserves more detailed features, even thin lines.
Abstract: Known median-based denoising methods tend to work well for restoring images corrupted by random-valued impulse noise at low noise levels, but poorly for highly corrupted images. This letter proposes a new impulse detector based on the differences between the current pixel and its neighbors aligned with four main directions. We then combine it with the weighted median filter to obtain a new directional weighted median (DWM) filter. Extensive simulations show that the proposed filter not only suppresses impulses better at high noise levels but also preserves more detailed features, even thin lines. When extended to restoring corrupted color images, the filter also performs very well.

460 citations
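The directional detection idea can be sketched as follows; the specific neighbor sets, the threshold value, and the edge padding are illustrative assumptions rather than the DWM filter's exact formulation:

```python
# Sketch of a directional impulse detector in the spirit of the DWM filter:
# for each pixel, absolute differences to neighbors along the four main
# directions are summed; the minimum directional sum stays small on edges and
# thin lines (one direction remains consistent) but is large on impulses.
import numpy as np

DIRS = [[(-1, -1), (1, 1)],   # diagonal
        [(-1, 0), (1, 0)],    # vertical
        [(-1, 1), (1, -1)],   # anti-diagonal
        [(0, -1), (0, 1)]]    # horizontal

def impulse_map(img, threshold=100):
    h, w = img.shape
    p = np.pad(img.astype(np.int32), 1, mode='edge')
    flag = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            c = p[i+1, j+1]
            sums = [sum(abs(p[i+1+di, j+1+dj] - c) for di, dj in d)
                    for d in DIRS]
            flag[i, j] = min(sums) > threshold
    return flag

# an impulse in a flat region is flagged; a pixel on a thin vertical line
# is not, because its vertical directional sum is zero
img = np.full((5, 5), 50, dtype=np.uint8)
img[:, 2] = 200            # thin vertical line
img[1, 4] = 255            # impulse
flags = impulse_map(img)
```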


Journal ArticleDOI
TL;DR: In this correspondence, a new, simple, yet much faster algorithm exhibiting O(1) runtime complexity is described, analyzed, and benchmarked against previous algorithms.
Abstract: The median filter is one of the basic building blocks in many image processing situations. However, its use has long been hampered by its algorithmic complexity of O(r) in the kernel radius r. With the trend toward larger images and proportionally larger filter kernels, the need for a more efficient median filtering algorithm becomes pressing. In this correspondence, a new, simple, yet much faster algorithm exhibiting O(1) runtime complexity is described and analyzed, and it is compared and benchmarked against previous algorithms. Extensions to higher-dimensional or higher-precision data and an approximation to a circular kernel are presented as well.

319 citations
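The running-histogram idea underlying fast median filtering can be sketched as below. This shows Huang's O(r)-per-pixel update, which the constant-time algorithm refines with per-column histograms; the single-row scope and helper name are assumptions for illustration:

```python
# Sketch of fast median filtering via a running histogram: moving the window
# one pixel right only removes one column of samples and adds another, so the
# full window is never re-sorted.
import numpy as np

def histogram_median_row(row_img, r):
    """Median-filter one image row with a (2r+1)x(2r+1) window.
    row_img is expected to have exactly 2r+1 rows (the vertical extent)."""
    h, w = row_img.shape
    hist = np.zeros(256, dtype=int)
    out = np.zeros(w, dtype=np.uint8)
    half = (2 * r + 1) ** 2 // 2 + 1          # rank of the median
    p = np.pad(row_img, ((0, 0), (r, r)), mode='edge')
    for v in p[:, :2 * r + 1].ravel():        # histogram of the first window
        hist[v] += 1
    for j in range(w):
        if j > 0:                             # slide: drop old col, add new
            for v in p[:, j - 1]:
                hist[v] -= 1
            for v in p[:, j + 2 * r]:
                hist[v] += 1
        cum = 0                               # walk histogram to median rank
        for level in range(256):
            cum += hist[level]
            if cum >= half:
                out[j] = level
                break
    return out

band = np.tile(np.array([10, 10, 200, 10, 10], dtype=np.uint8), (3, 1))
filtered = histogram_median_row(band, r=1)    # 3x3 median removes the spike
```

The O(1) algorithm goes one step further: it keeps one histogram per image column and merges only the 2r+1 relevant column histograms per output pixel, amortizing the updates so the per-pixel cost no longer grows with r.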


Journal ArticleDOI
TL;DR: Singular value decomposition (SVD) is a coherency-based technique that provides both signal enhancement and noise suppression as discussed by the authors, which has been implemented in a variety of seismic applications, mostly on a global scale.
Abstract: Singular value decomposition (SVD) is a coherency-based technique that provides both signal enhancement and noise suppression. It has been implemented in a variety of seismic applications — mostly on a global scale. In this paper, we use SVD to improve the signal-to-noise ratio of unstacked and stacked seismic sections, but apply it locally to cope with coherent events that vary with both time and offset. The local SVD technique is compared with f-x deconvolution and median filtering on a set of synthetic and real-data sections. Local SVD is better than f-x deconvolution and median filtering in removing background noise, but it performs less well in enhancing weak events or events with conflicting dips. Combining f-x deconvolution or median filtering with local SVD overcomes the main weaknesses associated with each individual method and leads to the best results.

147 citations
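The core of SVD-based coherency filtering — keep the top singular values of a data window, discard the rest — can be sketched as follows. The window shape, rank, and noise level are illustrative assumptions:

```python
# Sketch of SVD coherency filtering on a data window: keeping only the
# largest singular values reconstructs the laterally coherent part of the
# section and rejects incoherent noise.
import numpy as np

def svd_denoise(window, rank=1):
    U, s, Vt = np.linalg.svd(window, full_matrices=False)
    s[rank:] = 0.0                       # zero all but the top singular values
    return U @ np.diag(s) @ Vt

rng = np.random.default_rng(0)
# a flat (laterally coherent) event, identical on 8 traces of 50 samples
coherent = np.outer(np.ones(8), np.sin(np.linspace(0, np.pi, 50)))
noisy = coherent + 0.3 * rng.standard_normal(coherent.shape)
cleaned = svd_denoise(noisy, rank=1)
```

This also illustrates the weakness the abstract mentions: an event with conflicting dip would not align with the dominant singular vectors of the window and would be attenuated, which is why the paper applies SVD locally and combines it with f-x deconvolution or median filtering.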


Book ChapterDOI
28 Mar 2007
TL;DR: A gesture recognition system for recognizing hand movements in near real time, using an infra-red time-of-flight range camera to measure 3D surface points captured from the hand of the user; an articulated hand model is fitted to the data to refine the first estimate.
Abstract: We present a gesture recognition system for recognizing hand movements in near real time. The system uses an infra-red time-of-flight range camera with up to 30 Hz frame rate to measure 3D surface points captured from the hand of the user. The measured data are transformed into a cloud of 3D points after depth keying and suppression of camera noise by median filtering. Principal component analysis (PCA) is used to obtain a first crude estimate of the location and orientation of the hand. An articulated hand model is then fitted to the data to refine this estimate. The unoptimized system is able to estimate the first 7 degrees of freedom of the hand within 200 ms. The reconstructed hand is visualized in AVANGO/Performer and can be used to implement a natural man-machine interface. The work reviews relevant publications, underlines the advantages and shortcomings of the approach, and provides an outlook on future improvements.

146 citations


Journal ArticleDOI
TL;DR: A digital signal processing technique that reduces the speckle content in reconstructed digital holograms based on sequential sampling of the discrete Fourier transform of the reconstructed image field is presented.
Abstract: We present a digital signal processing technique that reduces the speckle content in reconstructed digital holograms. The method is based on sequential sampling of the discrete Fourier transform of the reconstructed image field. Speckle reduction is achieved at the expense of a reduced intensity and resolution, but this trade-off is shown to be greatly superior to that imposed by the traditional mean and median filtering techniques. In particular, we show that the speckle can be reduced by half with no loss of resolution (according to standard definitions of both metrics).

129 citations


Patent
25 Jun 2007
TL;DR: In this paper, an active noise reduction system using adaptive filters is described; its operation includes smoothing a stream of leakage factors, and the frequency of the noise reduction signal may be related to the speed of an engine associated with the system in which it operates.
Abstract: An active noise reduction system using adaptive filters. A method of operating the active noise reduction system includes smoothing a stream of leakage factors. The frequency of a noise reduction signal may be related to the speed of an engine associated with the system within which the active noise reduction system is operated. The engine speed signal may be a high-latency signal and may be obtained by the active noise reduction system over audio entertainment circuitry.

123 citations


Journal ArticleDOI
TL;DR: Experimental results have demonstrated that the proposed filter outperforms many well-accepted median-based filters in terms of both noise suppression and detail preservation and provides excellent robustness at various percentages of impulsive noise.

115 citations


Journal ArticleDOI
TL;DR: It is shown that ringing can be successfully removed by the eigenimage filtering method, in which the GPR image is decomposed into eigenimages by singular value decomposition; the refined techniques are definitely more effective than the simple methods for ringing noise removal.
Abstract: Ringing is a common type of coherent noise in ground penetrating radar (GPR) data. When this kind of coherent noise is strong and is not properly removed, deeper structure may be completely masked. Ringing appears as nearly horizontal and periodic events, which are the most important features enabling us to remove the noise by signal processing. In this study, we have reviewed the basic principles of various signal processing techniques for removing ringing noise and compared their performance on field GPR data contaminated by severe ringing noise. The reviewed methods include background removal, f-k filtering, predictive deconvolution with filtering in the wavenumber domain, and filtering by Radon transform. Furthermore, it is shown that ringing can be successfully removed by the eigenimage filtering method, in which the GPR image is decomposed into eigenimages by singular value decomposition. This comparative analysis shows that the refined techniques are definitely more effective than the simple methods for ringing noise removal, with less distortion of GPR signals, and that each method has its own advantages as well as limitations. Moreover, preservation of the horizontally linear events from geological targets is possible only through a kind of selective or local filtering such as the eigenimage filtering method.

98 citations
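Of the reviewed methods, background removal is the simplest to sketch: subtracting the average trace cancels ringing that repeats identically across traces. As the comparison above notes for such global methods, it would also cancel genuinely horizontal reflections, which is why the selective eigenimage approach is preferred. The synthetic section below is an illustrative assumption:

```python
# Sketch of background removal for GPR ringing: the mean trace captures any
# energy that is identical on every trace (horizontal, periodic ringing) and
# subtracting it suppresses that energy, at the cost of also removing
# genuinely horizontal events.
import numpy as np

def background_removal(section):
    """section: 2-D array, rows = time samples, columns = traces."""
    mean_trace = section.mean(axis=1, keepdims=True)
    return section - mean_trace

t = np.arange(64)
ringing = np.sin(0.5 * t)[:, None] * np.ones((1, 16))  # same on every trace
dipping = np.zeros((64, 16))
for k in range(16):                                    # a dipping event
    dipping[10 + 2 * k, k] = 1.0
section = ringing + dipping
cleaned = background_removal(section)                  # ringing cancelled
```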


Journal ArticleDOI
TL;DR: The SMFAMF filter is an improved version of the adaptive median filter (AMF) designed to reduce additive impulse noise in images; it preserves image detail better than the AMF while suppressing additive salt-and-pepper and other impulse-type noise.

92 citations


Journal ArticleDOI
TL;DR: The proposed solution is a switching vector filter which analyzes the color difference of two pixels in the CIELAB color space using four directional operators and can effectively preserve the thin lines, fine details, and image edges.

Journal ArticleDOI
TL;DR: In this paper, a novel feature-based image watermarking scheme robust against desynchronization attacks is proposed; the robust feature points, which can survive various signal processing and affine transformations, are extracted using the Harris-Laplace detector.
Abstract: Synchronization is crucial to designing a robust image watermarking scheme. In this paper, a novel feature-based image watermarking scheme against desynchronization attacks is proposed. The robust feature points, which can survive various signal processing and affine transformations, are extracted using the Harris-Laplace detector. A local characteristic region (LCR) construction method based on the scale-space representation of an image is considered for watermarking. In each LCR, the digital watermark is repeatedly embedded by modulating the magnitudes of discrete Fourier transform coefficients. In watermark detection, the digital watermark can be recovered by the maximum membership criterion. Simulation results show that the proposed scheme is invisible and robust against common signal processing, such as median filtering, sharpening, noise addition, and JPEG compression, and against desynchronization attacks, such as rotation, scaling, translation, row or column removal, cropping, and random bend attacks.

Journal ArticleDOI
TL;DR: A fast disparity and motion estimation method for multi-view video coding (MVC) that adaptively controls the search range according to the reliability of each macroblock, estimated from the difference between predicted vectors obtained by different methods.
Abstract: In this paper, we propose a fast disparity and motion estimation method for multi-view video coding (MVC). When implementing MVC, one of the most critical problems is the heavy computational complexity caused by the large amount of information in multi-view sequences. Hence, a fast algorithm is essential. To reduce this computational complexity, we adaptively controlled the search range by considering the reliability of each macroblock. In order to estimate this reliability, we calculated the difference between predicted vectors obtained by different methods. In conventional encoders, vectors can be predicted using median filtering of causal blocks. Moreover, we calculated another predicted vector using multi-view camera geometry, or the relationship between the disparity and motion vectors. We assumed that this difference indicates the reliability of the current macroblock. Using these properties, we were able to determine a new search range and reduce the number of search points within the limited window. The proposed MVC system was tested on several multi-view sequences to evaluate its performance. Experimental results showed that the proposed algorithm reduced processing time in the estimation process by up to 70-80%.

Journal ArticleDOI
TL;DR: The analysis presented here indicates that the proposed filtering structure is more robust than the median and myriad filtering structures; the statistical and deterministic properties of the meridian filter essential to signal processing applications are also given.
Abstract: A broad range of statistical processes is characterized by generalized Gaussian statistics. For instance, the Gaussian and Laplacian probability density functions are special cases of generalized Gaussian statistics. Moreover, the linear and median filtering structures are statistically related to the maximum likelihood estimates of location under Gaussian and Laplacian statistics, respectively. In this paper, we investigate the well-established statistical relationship between Gaussian and Cauchy distributions, showing that the random variable formed as the ratio of two independent Gaussian distributed random variables is Cauchy distributed. We also note that the Cauchy distribution is a member of the generalized Cauchy distribution family. The recently proposed myriad filtering is based on the maximum likelihood estimate of location under Cauchy statistics. An analogous relationship is formed here for Laplacian statistics, as the ratio of two independent Laplacian distributed random variables yields the distribution referred to here as the meridian distribution. Interestingly, the meridian distribution is also a member of the generalized Cauchy family. The maximum likelihood estimate under the obtained statistics is analyzed. Motivated by the maximum likelihood estimate under meridian statistics, meridian filtering is proposed. The analysis presented here indicates that the proposed filtering structure exhibits more robust characteristics than the median and myriad filtering structures. The statistical and deterministic properties of the meridian filter essential to signal processing applications are given. The meridian filtering structure is extended to admit real-valued weights using the sign coupling approach. Finally, simulations are performed to evaluate and compare the performance of the proposed meridian filtering structure to those of linear, median, and myriad filtering.
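The meridian estimate minimizes a cost of the form sum_i log(delta + |x_i - beta|). A sketch that restricts the search for the minimizer to the input samples themselves (a selection-type simplification, not necessarily the paper's optimization) is:

```python
# Sketch of a sample-restricted meridian filter: the output minimizes the
# meridian cost sum(log(delta + |x_i - beta|)) over candidate locations beta,
# here simplified to the input samples. A small delta makes the estimator
# mode-like and highly resistant to impulses.
import numpy as np

def meridian(samples, delta=1.0):
    x = np.asarray(samples, dtype=float)
    costs = [np.sum(np.log(delta + np.abs(x - beta))) for beta in x]
    return x[int(np.argmin(costs))]

window = [10.0, 11.0, 10.5, 9.5, 500.0]   # one large impulse
out = meridian(window, delta=0.5)          # close to the cluster around 10
```

Note how the logarithm saturates for large deviations: the impulse at 500 contributes almost the same cost to every candidate, so it cannot pull the estimate away from the cluster, which is the robustness property the abstract claims over median and myriad filtering.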

Journal ArticleDOI
TL;DR: The approach is based on iterative application of median filtering and shows promise for automatic noise reduction as a pre-processor for automated data analysis tools which aim at segmentation, feature extraction and pattern recognition.

Journal ArticleDOI
TL;DR: The filter properties discussed in this paper are proven and suggest that the proposed solution is a robust vector processing operator.

Book ChapterDOI
30 May 2007
TL;DR: A discrete regularization framework on weighted graphs of arbitrary topology is proposed, which unifies image and mesh filtering and leads to a family of simple nonlinear filters, parameterized by the degree p of smoothness and by the graph weight function.
Abstract: We propose a discrete regularization framework on weighted graphs of arbitrary topology, which unifies image and mesh filtering. The approach considers the problem as a variational one, which consists in minimizing a weighted sum of two energy terms: a regularization one that uses the discrete p-Laplace operator, and an approximation one. This formulation leads to a family of simple nonlinear filters, parameterized by the degree p of smoothness and by the graph weight function. Some of these filters provide a graph-based version of well-known filters used in image and mesh processing, such as the bilateral filter, the TV digital filter or the nonlocal mean filter.
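For p = 2 the regularization reduces to minimizing a quadratic energy, which a simple Jacobi iteration solves. The path-graph setting, uniform weights, fidelity parameter, and iteration count below are illustrative assumptions:

```python
# Sketch of the p = 2 case of discrete regularization on a weighted graph:
# minimize lam * ||f - f0||^2 + sum_{(i,j)} w_ij (f_i - f_j)^2 by Jacobi
# iteration. With photometric weights on a 4-neighbor image graph this
# behaves like a graph-based bilateral filter.
import numpy as np

def graph_smooth(f0, weights, lam=1.0, iters=50):
    """f0: node values on a path graph; weights: dict (i, j) -> w_ij."""
    f = f0.astype(float).copy()
    n = len(f)
    for _ in range(iters):
        new = f.copy()
        for i in range(n):
            num = lam * f0[i]          # fidelity term pulls toward the data
            den = lam
            for j in (i - 1, i + 1):   # path-graph neighbors
                if 0 <= j < n:
                    w = weights[(min(i, j), max(i, j))]
                    num += w * f[j]    # regularization pulls toward neighbors
                    den += w
            new[i] = num / den
        f = new
    return f

f0 = np.array([0, 0, 1, 0, 0], dtype=float)   # a noisy spike
w = {(i, i + 1): 1.0 for i in range(4)}       # uniform edge weights
smoothed = graph_smooth(f0, w, lam=0.5)       # spike is spread out
```

Changing p or making w depend on value differences yields the other filters the abstract mentions (TV-like for p = 1, bilateral/nonlocal-means-like for photometric or patch-based weights).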

Journal ArticleDOI
TL;DR: An adaptive nonlinear filtering approach in the orthogonal transform domain is proposed and analyzed for several typical noise environments in the DCT domain; it is found to be competitive with state-of-the-art methods even on images corrupted by pure additive noise.
Abstract: This work addresses the problem of signal-dependent noise removal in images. An adaptive nonlinear filtering approach in the orthogonal transform domain is proposed and analyzed for several typical noise environments in the DCT domain. Applied locally, that is, within a window of small support, the DCT is expected to approximate the Karhunen-Loeve decorrelating transform, which enables effective suppression of noise components. The filter's ability to preserve detail without destroying any useful content in images is especially emphasized and considered. A local adaptive DCT filtering is formulated for two cases: when signal-dependent noise can be mapped into additive uncorrelated noise with a homomorphic transform, and when it cannot. Although the main focus is signal-dependent and pure multiplicative noise, the proposed filtering approach is also found to be competitive with state-of-the-art methods on images corrupted by pure additive noise.
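The local DCT-domain idea can be sketched for the additive-noise case as below. Non-overlapping blocks and the 2.7-sigma hard threshold are simplifying assumptions; the paper's filter uses local adaptation and handles signal-dependent noise, which this sketch omits:

```python
# Sketch of local DCT-domain denoising for additive noise: each 8x8 block is
# transformed, coefficients below a noise-dependent threshold are zeroed
# (hard thresholding), and the block is transformed back.
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise(img, sigma, block=8):
    out = np.zeros_like(img, dtype=float)
    thr = 2.7 * sigma                      # a typical threshold choice
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            b = img[i:i+block, j:j+block].astype(float)
            c = dctn(b, norm='ortho')
            c[np.abs(c) < thr] = 0.0       # keep DC and strong coefficients
            out[i:i+block, j:j+block] = idctn(c, norm='ortho')
    return out

rng = np.random.default_rng(1)
clean = np.full((16, 16), 128.0)
noisy = clean + 10.0 * rng.standard_normal(clean.shape)
denoised = dct_denoise(noisy, sigma=10.0)
```

Because the orthonormal DCT spreads white noise evenly over coefficients while concentrating image structure in a few, thresholding removes most of the noise energy while the strong coefficients that carry detail pass through unchanged.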

Proceedings ArticleDOI
22 Oct 2007
TL;DR: This study used empirical mode decomposition (EMD) to filter power line noise in electrocardiogram signals, adding a pseudo-noise at a frequency higher than the highest frequency of the signal so that just the power line noise is filtered out in the first IMF.
Abstract: This study used empirical mode decomposition (EMD) for filtering power line noise in electrocardiogram signals. When the signal-to-noise ratio (SNR) is low, the power line noise is separated out as the first intrinsic mode function (IMF), but when the SNR is high, part of the signal is decomposed along with the noise into the first IMF. To overcome this problem, we add a pseudo-noise at a frequency higher than the highest frequency of the signal so that just the power line noise is filtered out in the first IMF. The results are compared with traditional IIR-based bandstop filtering. This technique is also applied to filtering power line noise during enhancement of stress ECG signals.
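The IIR band-stop baseline that the EMD method is compared against can be sketched as follows; the 50 Hz line frequency, sampling rate, and Q factor are illustrative assumptions, and the EMD scheme itself would need an EMD implementation, which is omitted here:

```python
# Sketch of the IIR band-stop (notch) baseline for power-line removal: a
# second-order notch at the line frequency applied forward-backward to a
# synthetic low-frequency signal plus 50 Hz hum.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 500.0                                  # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t)      # slow "signal" component
hum = 0.5 * np.sin(2 * np.pi * 50.0 * t)    # power-line interference
b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)     # narrow notch at 50 Hz
filtered = filtfilt(b, a, ecg_like + hum)   # zero-phase filtering
```

The narrow notch leaves the low-frequency content essentially untouched, but unlike the EMD approach it is fixed in frequency and also removes any genuine signal energy that happens to fall at the line frequency.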

Journal ArticleDOI
TL;DR: The Wiener filter is a solution to the restoration problem based on the hypothesized use of a linear filter and the minimum mean-square error criterion.

Proceedings ArticleDOI
08 Oct 2007
TL;DR: A novel watermarking scheme based on DWT-SVD that embeds a scrambled watermark into the green component of a color image; experiments indicate that the watermark is robust to JPEG compression, cropping, Gaussian noise, median filtering, and resizing.
Abstract: A novel watermarking scheme based on DWT-SVD is proposed that embeds a scrambled watermark into the green component of a color image. The green component is decomposed into LLn, HLn, LHn, and HHn subbands with an N-level DWT, and a different embedding method is adopted for each subband. In the LLn subband, a pseudo-random sequence is embedded after spreading according to the energy. The three wavelet coefficient matrices LHn, HLn, and HHn are decomposed with SVD; the singular values of the scrambled watermark are added to the singular values of the wavelet coefficients, completing the embedding process. The watermark retrieval algorithm and the blind detection algorithm are both designed according to the embedding scheme. Experiments indicate that the watermark is robust to JPEG compression, cropping, Gaussian noise, median filtering, and resizing.

Journal ArticleDOI
TL;DR: A feature-extraction and vector-generation VLSI has been developed for real-time image recognition; it can scan a VGA-size image at a rate of 6.1 frames/s, generating as many as 1.5 × 10^6 feature vectors per second for recognition.
Abstract: A feature-extraction and vector-generation VLSI has been developed for real-time image recognition. An arrayed-shift-register architecture has been employed in conjunction with a pipelined directional-edge-filtering circuitry. As a result, it has become possible to scan an image, pixel by pixel, with a 64 × 64-pixel recognition window and generate a 64-dimensional feature vector in every 64 clock cycles. In order to determine the threshold for edge-filtering operation adaptive to local luminance variation, a high-speed median circuit has been developed. A binary median search algorithm has been implemented using high-precision majority voting circuits working in the mixed-signal principle. A prototype chip was designed and fabricated in a 0.18-μm 5-metal CMOS technology. A high-speed feature vector generation in less than 9.7 ns/vector element has been experimentally demonstrated. It is possible to scan a VGA-size image at a rate of 6.1 frames/s, thus generating as many as 1.5 × 10^6 feature vectors per second for recognition. This is more than 10^3 times faster than software processing running on a 3-GHz general-purpose processor.
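A binary median search via majority voting can be modeled in software as below: the median of 8-bit values is found one bit at a time, MSB first, where each step needs only a single majority vote — the operation the chip's voting circuits implement in hardware. The odd-sized window is an assumption of this sketch:

```python
# Software model of binary median search: build the median bit by bit from
# the most significant bit down, deciding each bit with one majority vote
# ("how many samples are >= this candidate value?").
def bitwise_median(values, bits=8):
    n = len(values)                   # assumed odd
    majority = (n + 1) // 2
    m = 0
    for b in range(bits - 1, -1, -1):
        test = m | (1 << b)           # candidate with this bit set
        votes = sum(1 for v in values if v >= test)   # one majority vote
        if votes >= majority:         # median must be >= test: keep the bit
            m = test
    return m

window = [12, 200, 45, 45, 99, 3, 45, 77, 210]
med = bitwise_median(window)          # median of the 9 samples
```

The search needs exactly `bits` votes regardless of window size, which is why it maps well onto a fixed pipeline of voting circuits.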

Journal ArticleDOI
TL;DR: By generalising the idea of matrix median filters, a variety of other local matrix filters are designed, including matrix-valued mid-range filters and, more generally, M-smoothers, but also weighted medians and α-quantiles.

Journal ArticleDOI
TL;DR: A class of fuzzy metrics is used to introduce a vector filter aimed at improving the detail-preserving ability of classical vector filters while effectively removing impulsive noise.
Abstract: Classical nonlinear vector median-based filters are well-known methods for impulsive noise suppression in color images, but mostly they lack good detail-preserving ability. We use a class of fuzzy metrics to introduce a vector filter aimed at improving the detail-preserving ability of classical vector filters while effectively removing impulsive noise. The output of the proposed method is the pixel inside the filter window which maximizes the similarity in color and spatial closeness. The use of fuzzy metrics allows us to handle both criteria simultaneously. The filter is designed so that the importance of the spatial criterion can be adjusted. We show that the filter can adapt to the density of the contaminating noise by adjusting the spatial criterion importance. Classical and recent filters are used to assess the proposed filtering. The experimental results show that the proposed technique performs competitively.
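The idea of combining color similarity with spatial closeness can be sketched as below. The particular fuzzy metric (a per-channel ratio form commonly used with RGB data), the exponential spatial weighting, and the parameter values are illustrative assumptions; the paper's exact combination may differ:

```python
# Sketch of a similarity-based vector filter: for each pixel in the window,
# accumulate its fuzzy color similarity to every other window pixel, weight
# by spatial closeness to the window center, and output the pixel with the
# highest combined score. Impulses are dissimilar to their neighbors, so
# they never win.
import numpy as np

K = 1024.0   # fuzzy-metric constant (illustrative)

def fuzzy_color_sim(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    # per-channel ratio metric in (0, 1]; equals 1 iff the colors match
    return float(np.prod((np.minimum(a, b) + K) / (np.maximum(a, b) + K)))

def fuzzy_vector_filter(window, coords, center, spatial_w=0.1):
    scores = []
    for p, c in zip(window, coords):
        color = sum(fuzzy_color_sim(p, q) for q in window)
        dist = abs(c[0] - center[0]) + abs(c[1] - center[1])
        scores.append(color * np.exp(-spatial_w * dist))
    return window[int(np.argmax(scores))]

# 3x3 window: mostly red pixels, one noisy blue impulse at the center
coords = [(i, j) for i in range(3) for j in range(3)]
window = [np.array([200, 10, 10])] * 9
window[4] = np.array([0, 0, 255])         # impulse at the center
out = fuzzy_vector_filter(window, coords, center=(1, 1))
```

Raising `spatial_w` favors pixels near the center (better detail preservation at low noise); lowering it favors pure color consensus (better suppression at high noise), which mirrors the adjustable spatial criterion described in the abstract.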

Journal ArticleDOI
TL;DR: Compared to neural network based approaches, the proposed method provides a promising way for the on-line recognition of control chart patterns because of its efficient computation and robustness against outliers.
Abstract: This paper proposes a hybrid framework composed of a filtering module and a clustering module to identify six common types of control chart patterns: natural pattern, cyclic pattern, upward shift, downward shift, upward trend, and downward trend. In particular, a multi-scale wavelet filter is designed for denoising, and its performance is compared to single-scale filters, including the mean filter and the exponentially weighted moving average (EWMA) filter. Moreover, three fuzzy clustering algorithms, based on fuzzy c-means (FCM), entropy fuzzy c-means (EFCM), and kernel fuzzy c-means (KFCM), are adopted to compare their pattern classification performance. Experimental results demonstrate the excellent performance of EFCM and KFCM against outliers, especially at high noise levels embedded in the input data. Therefore, a hybrid framework combining the wavelet filter with robust fuzzy clustering is proposed in this paper. Compared to neural network based approaches, the proposed method provides a promising way for the on-line recognition of control chart patterns because of its efficient computation and robustness against outliers.
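The single-scale EWMA baseline compared above is straightforward to sketch; the smoothing constant and the synthetic upward-shift pattern are illustrative assumptions:

```python
# Sketch of the EWMA filter used as a single-scale baseline:
# s_t = lam * x_t + (1 - lam) * s_{t-1}; the memory of the smoother is
# controlled by lam (smaller lam = heavier smoothing, slower response).
import numpy as np

def ewma(x, lam=0.2):
    x = np.asarray(x, dtype=float)
    s = np.empty_like(x)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = lam * x[t] + (1 - lam) * s[t - 1]
    return s

rng = np.random.default_rng(2)
steps = np.concatenate([np.zeros(100), np.ones(100)])   # an upward shift
noisy = steps + 0.3 * rng.standard_normal(200)
smooth = ewma(noisy, lam=0.2)    # noise reduced, shift still visible
```

The trade-off visible here — smoothing the noise while lagging behind the shift — is exactly what motivates the multi-scale wavelet filter the paper designs instead.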

Proceedings ArticleDOI
01 Dec 2007
TL;DR: The system consists of four stages: image pre-processing, image segmentation, feature extraction, and image classification; the ability to correctly discriminate between benign and malignant melanoma lesions was about 95% for the artificial neural network and 85% for the support vector machine classifier.
Abstract: In this paper, a new intelligent method for classifying benign and malignant melanoma lesions is implemented. The system consists of four stages: image pre-processing, image segmentation, feature extraction, and image classification. As the first step of the image analysis, pre-processing techniques such as median filtering and contrast enhancement are applied to remove noise and undesired structures from the images. In the second step, a simple thresholding method is used to segment and localise the lesion; a boundary tracing algorithm is also implemented to validate the segmentation. A wavelet approach, specifically the wavelet packet transform (WPT), is then used to extract features. Finally, the dimensionality of the selected features is reduced with principal component analysis (PCA), and the features are supplied to artificial neural network and support vector machine classifiers. The ability to correctly discriminate between benign and malignant lesions was about 95% for the artificial neural network and 85% for the support vector machine classifier.

Journal ArticleDOI
TL;DR: This work evaluates in detail the use of the adaptive mean filter for reducing noise in CT gel dosimetry and indicates that adaptive mean filtering is a highly effective tool for noise reduction in CT gel dosimetry.
Abstract: X-ray computed tomography (CT) as a method of extracting 3D dose information from irradiated polymer gel dosimeters is showing potential as a practical means to implement gel dosimetry in a radiation therapy clinic. However, the response of CT contrast to dose is weak, and noise reduction is critical in order to achieve adequate dose resolution with this method. Phantom design and CT imaging technique have both been shown to decrease image noise. In addition, image postprocessing using noise reduction filtering techniques has been proposed. This work evaluates in detail the use of the adaptive mean filter for reducing noise in CT gel dosimetry. Filter performance is systematically tested using both synthetic patterns mimicking a range of clinical dose distribution features and actual clinical dose distributions. Both low and high signal-to-noise ratio (SNR) situations are examined. For all cases, the effects of filter kernel size and the number of iterations are investigated. Results indicate that adaptive mean filtering is a highly effective tool for noise reduction in CT gel dosimetry. The optimum filtering strategy depends on the characteristics of the dose distributions and the image noise level. For low noise images (SNR approximately 20), the filtered results are excellent, and use of adaptive mean filtering is recommended as a standard processing tool. For high noise images (SNR approximately 5), adaptive mean filtering can also produce excellent results, but filtering must be approached with more caution, as spatial and dose distortions of the original dose distribution can occur.
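Adaptive mean filtering in the local-statistics (Wiener/Lee) sense can be sketched as below: within each kernel the output moves toward the local mean by an amount depending on the ratio of noise variance to local variance, so flat dose regions are smoothed strongly while gradients are preserved. scipy's `wiener` implements this form; the paper's exact filter, kernel size, and iteration scheme may differ, and the synthetic dose edge is an illustrative assumption:

```python
# Sketch of adaptive mean filtering via local statistics: where the local
# variance is close to the noise variance the pixel is replaced by the local
# mean (strong smoothing); where local variance is high (edges, gradients)
# the original value is largely kept.
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(3)
dose = np.zeros((32, 32))
dose[:, 16:] = 100.0                             # a sharp dose edge
noisy = dose + 5.0 * rng.standard_normal(dose.shape)
filtered = wiener(noisy, mysize=5, noise=25.0)   # 5x5 kernel, noise var 25
```

This adaptivity is what the caution in the abstract is about: at high noise levels the local-variance estimate itself becomes unreliable, so edges can be mistaken for noise (dose blurring) or noise for structure.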

Proceedings ArticleDOI
23 Nov 2007
TL;DR: A technique to protect digital images using content-associated copyright messages generated by combining the original copyright message with the intensity gradient of the images; the method has relatively low computational complexity.
Abstract: We propose a technique to protect digital images using content-associated copyright messages generated by combining the original copyright message with the gradient of intensity of the digital images. The proposed technique enables distribution of the original copyright message without any distortion of the original digital images, by avoiding embedding of the original copyright message into the images. In addition to the efficiency of generating copyright messages, it also has relatively low computational complexity. To verify the validity of the proposed technique, we performed experiments on its robustness to external attacks such as histogram equalization, median filtering, rotation, and cropping. Experimental results on restoring the copyright message from images distorted by attacks show that more than 90%, on average, can be recovered.

Proceedings Article
01 Jan 2007
TL;DR: A novel noise fading technique based on noise detection and median filtering, which prevents image blurring and is computationally simple, is proposed in this paper and outperforms all existing impulse-denoising schemes.

Journal ArticleDOI
TL;DR: The proposed algorithm has the advantages of avoiding premature convergence and of fast convergence speed; a smaller MAE is achieved at all noise levels, and much of the detailed information in the images is preserved.