
Showing papers on "Median filter published in 1998"


Journal ArticleDOI
TL;DR: A new watermarking algorithm is presented: the method, which operates in the frequency domain, embeds a pseudo-random sequence of real numbers in a selected set of DCT coefficients; the watermark is adapted to the image by exploiting the masking characteristics of the human visual system, thus ensuring its invisibility.

743 citations


Journal ArticleDOI
TL;DR: The new weighted median filter formulation leads to significantly more powerful estimators capable of effectively addressing a number of fundamental problems in signal processing that could not adequately be addressed by prior weighted median smoother structures.
Abstract: Weighted median smoothers, which were introduced by Edgeworth in the context of least absolute regression over 100 years ago, have received considerable attention in signal processing during the past two decades. Although weighted median smoothers offer advantages over traditional linear finite impulse response (FIR) filters, it is shown in this paper that they lack the flexibility to adequately address a number of signal processing problems. In fact, weighted median smoothers are analogous to normalized FIR linear filters constrained to have only positive weights. It is also shown that, much like the mean is generalized to the rich class of linear FIR filters, the median can be generalized to a richer class of filters admitting positive and negative weights. The generalization follows naturally and is surprisingly simple. In order to analyze and design this class of filters, a new threshold decomposition theory admitting real-valued input signals is developed. The new threshold decomposition framework is then used to develop fast adaptive algorithms to optimally design the real-valued filter coefficients. The new weighted median filter formulation leads to significantly more powerful estimators capable of effectively addressing a number of fundamental problems in signal processing that could not adequately be addressed by prior weighted median smoother structures.
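As a concrete illustration of the signed weighted-median idea summarized above (couple each weight's sign to its sample, replicate by the weight magnitude, take an ordinary median), here is a minimal sketch assuming integer-valued weights; the function and variable names are illustrative, not taken from the paper.

import numpy as np

def signed_weighted_median(window, weights):
    """Weighted median admitting negative weights: couple the sign of each
    weight to its sample, replicate the signed sample |w| times, and take
    the ordinary median of the expanded set (integer weights assumed)."""
    expanded = []
    for x, w in zip(window, weights):
        w = int(w)
        if w != 0:
            expanded.extend([np.sign(w) * x] * abs(w))
    return float(np.median(expanded))

# Example: a 5-sample window with one negative weight.
print(signed_weighted_median([3.0, 7.0, 4.0, 9.0, 5.0], [1, 2, -1, 2, 1]))

With all weights positive this reduces to a weighted median smoother; allowing a negative weight is what lets the generalized filter realize bandpass- or highpass-like responses that smoothers cannot.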

183 citations


Proceedings ArticleDOI
14 Apr 1998
TL;DR: Preliminary results show the efficiency of the combination of color segmentation and of invariant moments in detecting faces with a large variety of poses and against relatively complex backgrounds.
Abstract: We use a skin color model based on the Mahalanobis metric and a shape analysis based on invariant moments to automatically detect and locate human faces in two-dimensional natural scene images. First, color segmentation of an input image is performed by thresholding in a perceptually plausible hue-saturation color space where the effects of the variability of human skin color and the dependency of chrominance on changes in illumination are reduced. We then group regions of the resulting binary image which have been classified as candidates into clusters of connected pixels. Performing median filtering on the image and discarding the smallest remaining clusters ensures that only a small number of clusters will be used for further analysis. Fully translation-, scale-, and in-plane rotation-invariant moments are calculated for each remaining cluster. Finally, in order to distinguish faces from distractors, a multilayer perceptron neural network is used with the invariant moments as the input vector. Supervised learning of the network is implemented with the backpropagation algorithm, at first for frontal views of faces. Preliminary results show the efficiency of the combination of color segmentation and invariant moments in detecting faces with a large variety of poses and against relatively complex backgrounds.
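The median-filter, clustering, and moment steps described above can be sketched as follows, using OpenCV as a stand-in and Hu moments as one common choice of translation-, scale-, and rotation-invariant moments; the filter size and area cutoff are illustrative assumptions, not the paper's values.

import cv2
import numpy as np

def candidate_clusters(skin_mask, min_area=200, median_ksize=5):
    """Median-filter a binary skin mask, keep sufficiently large connected
    clusters, and return invariant (Hu) moments for each remaining cluster,
    to be fed to a classifier such as an MLP."""
    mask = cv2.medianBlur(skin_mask.astype(np.uint8) * 255, median_ksize)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    features = []
    for i in range(1, n):                        # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue                             # discard the smallest clusters
        cluster = (labels == i).astype(np.uint8)
        hu = cv2.HuMoments(cv2.moments(cluster)).ravel()
        features.append(hu)                      # input vector for the network
    return features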

165 citations


DissertationDOI
01 Jan 1998
TL;DR: In this article, a 3D multilevel median structure was proposed to suppress the impulsive distortion (Dirt and Sparkle or Blotches) and noise degradation in video sequences.
Abstract: This dissertation presents algorithms for restoring some of the major corruptions observed in archived film or video material. The two principal problems of impulsive distortion (Dirt and Sparkle, or Blotches) and noise degradation are considered. There is also an algorithm for suppressing the inter-line jitter common in images decoded from noisy video signals. In the case of noise reduction and Blotch removal, the thesis considers image sequences to be three-dimensional signals involving the evolution of features in time and space. This is necessary if any process presented is to show an improvement over standard two-dimensional techniques. It is important to recognize that consideration of image sequences must involve an appreciation of the problems incurred by the motion of objects in the scene. The most obvious implication is that, due to motion, useful three-dimensional processing does not necessarily proceed in a direction 'orthogonal' to the image frames. Therefore, attention is given to discussing motion estimation as it is used for image sequence processing. Some discussion is given to image sequence models, and the 3D autoregressive (AR) model is investigated. A multiresolution block-matching (BM) scheme is used for motion estimation throughout the major part of the thesis. Impulsive noise removal in image processing has traditionally been achieved by the use of median filter structures. A new three-dimensional multilevel median structure is presented in this work, with the additional use of a detector which limits the distortion caused by the filters. This technique is found to be extremely effective in practice and is an alternative to the traditional global median operation. The new median filter is shown to be superior to those previously presented with respect to the ability to reject the kind of distortion found in practice. A model-based technique using the 3D AR model is also developed for detecting and removing Blotches. This technique achieves better fidelity at the expense of a heavier computational load. Motion-compensated 3D IIR and FIR Wiener filters are investigated with respect to their ability to reject noise in an image sequence. They are compared to several algorithms previously presented which are purely temporal in nature. The filters presented are found to be effective and compare favourably to the other algorithms. The 3D filtering process is superior to the purely temporal process, as expected. The algorithm that is presented for suppressing inter-line jitter uses a 2D AR model to estimate and correct the relative displacements between the lines. The output image is much more satisfactory to the observer, although in a severe case some drift of image features is to be expected. A suggestion for removing this drift is presented in the conclusions. There are several remaining problems in moving video, in particular line scratches and picture shake/roll. Line scratches cannot be detected successfully by the detectors presented and so cannot be removed efficiently. Suppressing shake and roll involves compensating the entire frame for motion, and there is a need to separate global from local motion. These difficulties provide ample opportunity for further research.
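The following is a much-simplified sketch of a spatio-temporal multilevel median of the kind discussed above, not the thesis's exact structure: take medians over several sub-windows spanning the previous, current, and next (motion-compensated) frames, then take the median of those sub-medians together with the current pixel. In practice a blotch detector gates the replacement so that uncorrupted pixels are left untouched.

import numpy as np

def ml3d_pixel(prev, curr, nxt, r, c):
    """Multilevel 3D median at pixel (r, c), valid for 1 <= r, c < size - 1.
    Sub-windows here are a spatial cross in the current frame and 3x1
    spatial columns in the previous and next frames (illustrative choice)."""
    w1 = [curr[r, c], curr[r-1, c], curr[r+1, c], curr[r, c-1], curr[r, c+1]]
    w2 = [prev[r-1, c], prev[r, c], prev[r+1, c]]
    w3 = [nxt[r-1, c], nxt[r, c], nxt[r+1, c]]
    subs = [np.median(w1), np.median(w2), np.median(w3)]
    # The center pixel is counted twice to keep the count odd and to bias
    # the output toward the identity on uncorrupted pixels.
    return float(np.median(subs + [curr[r, c], curr[r, c]]))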

155 citations


Journal ArticleDOI
TL;DR: This work proposes a new adaptive-neighborhood approach to filtering images corrupted by signal-dependent noise that provides better noise suppression as indicated by lower mean-squared errors as well as better retention of edge sharpness than the other approaches considered.
Abstract: In many image-processing applications the noise that corrupts the images is signal dependent, the most widely encountered types being multiplicative, Poisson, film-grain, and speckle noise. Their common feature is that the power of the noise is related to the brightness of the corrupted pixel. This results in brighter areas appearing to be noisier than darker areas. We propose a new adaptive-neighborhood approach to filtering images corrupted by signal-dependent noise. Instead of using fixed-size, fixed-shape neighborhoods, statistics of the noise and the signal are computed within variable-size, variable-shape neighborhoods that are grown for every pixel to contain only pixels that belong to the same object. Results of adaptive-neighborhood filtering are compared with those given by two local-statistics-based filters (the refined Lee filter and the noise-updating repeated Wiener filter), both in terms of subjective and objective measures. The adaptive-neighborhood approach provides better noise suppression as indicated by lower mean-squared errors as well as better retention of edge sharpness than the other approaches considered.
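The variable-size, variable-shape neighborhood idea can be sketched as seed-based region growing followed by a local-statistics (Lee-type) correction computed over the grown region; the growth tolerance, the region-size cap, and the pluggable noise model below are illustrative assumptions rather than the authors' exact procedure.

import numpy as np
from collections import deque

def grow_neighborhood(img, seed, tol, max_size=400):
    """Grow a connected, variable-shape neighborhood around `seed`, adding
    4-connected pixels whose value is within `tol` of the seed value."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    r0, c0 = seed
    member = {seed}
    queue = deque([seed])
    while queue and len(member) < max_size:
        r, c = queue.popleft()
        for rr, cc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if 0 <= rr < h and 0 <= cc < w and (rr, cc) not in member \
                    and abs(img[rr, cc] - img[r0, c0]) <= tol:
                member.add((rr, cc))
                queue.append((rr, cc))
    rows, cols = np.array(list(member)).T
    return rows, cols

def filter_pixel(img, seed, noise_var_of_mean, tol=10.0):
    """Local linear MMSE correction using statistics computed over the
    adaptive neighborhood instead of a fixed square window."""
    img = np.asarray(img, dtype=float)
    rows, cols = grow_neighborhood(img, seed, tol)
    region = img[rows, cols]
    mean, var = region.mean(), region.var()
    noise_var = noise_var_of_mean(mean)          # signal-dependent noise power
    gain = max(var - noise_var, 0.0) / max(var, 1e-12)
    return mean + gain * (img[seed] - mean)

For multiplicative noise with relative variance sigma_n**2, for example, one would pass noise_var_of_mean = lambda m: (sigma_n**2) * m**2, reflecting the fact that the noise power grows with local brightness.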

122 citations


Journal ArticleDOI
TL;DR: An automated approach for determining the noise associated with astronomical images is described, based on a multiresolution transform of the image, the à trous wavelet transform, which can be used for high-quality image filtering or compression.

Abstract: We describe an automated approach for determining the noise associated with astronomical images. Detector noise is ever present and must be determined for high-quality image filtering or compression. We also show that the method can be used for very high quality cosmic-ray hit removal. Our method is based on a multiresolution transform of the image, the à trous wavelet transform. We present a range of examples and applications to illustrate the effectiveness of this approach.
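A compact sketch of the à trous (isotropic undecimated) decomposition and a robust noise estimate taken from its finest scale; the B3-spline kernel and the MAD-based sigma estimate are standard choices, and the paper's automated procedure will differ in its details.

import numpy as np
from scipy.ndimage import convolve

def a_trous(image, n_scales=4):
    """A trous wavelet transform: at each scale, smooth with an increasingly
    'holey' B3-spline kernel and keep the difference between successive
    smoothings as the wavelet plane."""
    h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    planes, smooth = [], np.asarray(image, dtype=float)
    for j in range(n_scales):
        kernel = np.zeros(4 * 2**j + 1)
        kernel[::2**j] = h                       # insert 2**j - 1 zeros (holes)
        k2d = np.outer(kernel, kernel)
        smoother = convolve(smooth, k2d, mode='mirror')
        planes.append(smooth - smoother)
        smooth = smoother
    return planes, smooth                        # wavelet planes + coarse residual

def noise_sigma(image):
    """Robust noise estimate from the finest wavelet plane via the median
    absolute deviation; in practice a correction factor for the wavelet's
    response to unit-variance noise is also applied."""
    w1 = a_trous(image, n_scales=1)[0][0]
    return 1.4826 * np.median(np.abs(w1 - np.median(w1)))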

107 citations


Proceedings ArticleDOI
22 Sep 1998
TL;DR: Rules for coefficient estimation for edge-preserving models which are particularly effective for edge preservation and noise suppression are presented, using a predictive technique analogous to estimation of the weights of optimal weighted median filters.

Abstract: Non-Gaussian Markov image models are effective in the preservation of edge detail in Bayesian formulations of restoration and reconstruction problems. Included in these models are coefficients quantifying the statistical links among pixels in local cliques, which are typically assumed to have an inverse dependence on the distance between the corresponding neighboring pixels. Estimation of these coefficients is a nontrivial task for non-Gaussian models. We present rules for coefficient estimation for edge-preserving models which are particularly effective for edge preservation and noise suppression, using a predictive technique analogous to estimation of the weights of optimal weighted median filters.

99 citations


Journal ArticleDOI
TL;DR: A new algorithm is presented that can remove impulse noise from corrupted images while preserving details using information not just of a local window centered about the corrupted pixel, but also of some remote regions in the image.
Abstract: We present a new algorithm that can remove impulse noise from corrupted images while preserving details. The algorithm is fundamentally different from the traditional methods in that it can utilize information not just of a local window centered about the corrupted pixel, but also of some remote regions in the image. Computer simulations indicate that our algorithm outperforms many existing techniques.

82 citations


Patent
20 May 1998
TL;DR: In this article, a method is described for detecting motion between a reference image and a test image by acquiring the images; aligning the images; dividing the images into blocks; masking certain blocks; differencing corresponding blocks; median filtering the differences; low-pass filtering the outputs of the median filter; generating a normalized histogram for each output of the low-pass filter; generating a model of Gaussian noise; calculating the distance between the noise model and each normalized histogram; comparing each distance to a user-definable threshold; and declaring that motion has occurred between the images if a certain number of the distances are at or above the threshold.

Abstract: A device for and method of detecting motion between a reference image and a test image by acquiring the images; aligning the images; dividing the images into blocks; masking certain blocks; differencing corresponding blocks; median filtering the differences; low-pass filtering the outputs of the median filter; generating a normalized histogram for each output of the low-pass filter; generating a model of Gaussian noise; calculating the distance between the noise model and each normalized histogram; comparing each distance calculated to a user-definable threshold; and determining that motion has occurred between the images if a certain number of distance calculations are at or above the user-definable threshold. If a scene is to be continuously monitored and no motion occurred between the previous reference image and the previous test image, then a new test image is acquired and compared against the previous reference image as described above. If the scene is to be continuously monitored and motion has occurred between the previous reference image and the previous test image, then the previous reference image is replaced with the previous test image, a new test image is acquired, and the new test image is compared to the new reference image as described above.
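The claimed processing chain can be sketched end to end as follows, assuming the images are already aligned and masked; the block size, filter sizes, Gaussian noise parameters, and the L1 histogram distance used here are illustrative stand-ins for choices the patent leaves open.

import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def block_motion_flags(ref, test, block=32, bins=64, threshold=0.5,
                       noise_sigma=2.0):
    """Difference the images block by block, median- and low-pass-filter the
    differences, histogram each block, and flag blocks whose normalized
    histogram is far from a zero-mean Gaussian noise model."""
    diff = test.astype(float) - ref.astype(float)
    diff = median_filter(diff, size=3)           # suppress impulsive outliers
    diff = uniform_filter(diff, size=3)          # simple low-pass stage
    edges = np.linspace(-4 * noise_sigma, 4 * noise_sigma, bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    noise_pdf = np.exp(-centers**2 / (2 * noise_sigma**2))
    noise_pdf /= noise_pdf.sum()                 # Gaussian noise model
    flags = []
    h, w = diff.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            hist, _ = np.histogram(diff[r:r+block, c:c+block], bins=edges)
            hist = hist / max(hist.sum(), 1)
            dist = np.abs(hist - noise_pdf).sum()    # L1 histogram distance
            flags.append(dist >= threshold)
    return np.array(flags)                       # True where motion is likely

Motion would then be declared for the frame pair when the number of True flags reaches the user-definable count, mirroring the final step of the claim.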

69 citations


Journal ArticleDOI
TL;DR: The Optimal Robust Separator (ORS) technique allows a unique and objective determination of the regional field and enables researchers to undertake reproducible separations of regional and residual components.
Abstract: Prior to interpretation and further analysis, many datasets must first be separated into regional and residual components. Traditional techniques are either subjective (e.g., graphical methods) or nonrobust (e.g., all least-squares based methods). Bathymetric data, with their broad spectrum, pose serious difficulties to these traditional methods, in particular those based on spectral decomposition. Spatial median filters offer a solution that is robust, objective, and often defines regional components similar to those produced graphically by hand. Characteristics of spatial median filters in general are discussed and a new empirical method is presented for determining the width of the robust median filter that accomplishes an optimal separation of a gridded dataset into its regional and residual components. The method involves tracing the zero-contour of the residual component and evaluating the ratio between the volume enclosed by the surface inside this contour and the contour's area. The filter width giving the highest ratio (or mean amplitude) is called the Optimal Robust Separator (ORS) and is selected as the optimal filter producing the best separation. The technique allows a unique and objective determination of the regional field and enables researchers to undertake reproducible separations of regional and residual components. The ORS method is applied to both synthetic data and bathymetry/topography of the Hawaiian Islands; ways to improve the technique using alternative diagnostic quantities are discussed.
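A simplified numerical reading of the width-selection rule: for each candidate width, take the median-filtered grid as the regional field and score the residual by its mean amplitude over the region enclosed by the zero contour, approximated here by the positive-residual region; the width with the largest score is selected. This is a sketch of the idea, not the authors' contour-tracing implementation.

import numpy as np
from scipy.ndimage import median_filter

def optimal_filter_width(grid, widths):
    """Pick the median-filter width whose residual (data minus regional) has
    the largest mean amplitude over its positive region, a crude proxy for
    the volume/area ratio inside the residual's zero contour."""
    best_width, best_score = None, -np.inf
    for w in widths:
        regional = median_filter(grid, size=w)
        residual = grid - regional
        pos = residual > 0                       # stand-in for the zero-contour
        if pos.sum() == 0:                       # interior
            continue
        score = residual[pos].sum() / pos.sum()  # mean residual amplitude
        if score > best_score:
            best_width, best_score = w, score
    return best_width, best_score

# Example usage on a hypothetical gridded dataset, odd widths of 5 to 51 nodes:
# width, score = optimal_filter_width(bathymetry_grid, range(5, 53, 2))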

66 citations


Journal ArticleDOI
TL;DR: This work introduces filters for angular signals and three variations for the extension of the quasirange to circular data, which have good and user-controlled properties as edge detectors in noisy angular signals.

Abstract: Physical quantities referring to angles, like vector direction, color hue, etc., exhibit an inherently periodic nature. Due to this periodicity, digital filters and edge operators proposed for data on the line cannot be applied to such data. We introduce filters for angular signals (circular mean, circular median, circular α-trimmed mean, circular modified trimmed mean). Particular emphasis is given to the circular median filter, for which some interesting properties are derived. We also use estimators of circular dispersion to introduce edge detectors for angular signals. Three variations for the extension of the quasirange to circular data are proposed, and expressions for their output PDFs are derived. These "circular" quasiranges have good and user-controlled properties as edge detectors in noisy angular signals. The performance of the proposed edge operators is evaluated on angular edges, using certain quantitative criteria. Finally, a series of experiments featuring one-dimensional (1-D) angular signals and hue images is used to illustrate the operation of the new filters and edge detectors.
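Minimal sketches of the circular mean and circular median for a window of angles in radians, using the standard definitions this kind of filtering builds on; the circular median below is taken as the sample angle minimizing the summed circular distance, which is one common convention.

import numpy as np

def circular_mean(angles):
    """Direction of the resultant of unit vectors: insensitive to the 2*pi
    wrap-around that breaks the ordinary arithmetic mean."""
    return np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())

def circular_distance(a, b):
    """Shortest angular distance on the circle, in [0, pi]."""
    d = np.abs(a - b) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def circular_median(angles):
    """Sample angle minimizing the total circular distance to the window."""
    angles = np.asarray(angles, dtype=float)
    costs = [circular_distance(a, angles).sum() for a in angles]
    return angles[int(np.argmin(costs))]

# 350 deg and 10 deg average to roughly 0 deg, not 180 deg:
print(np.degrees(circular_mean(np.radians([350.0, 10.0, 5.0]))))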

Patent
20 May 1998
TL;DR: In this paper, a noise detection system having a baseline wander filter (16), high- and low-pass filters (26), an adaptive line noise canceler (20), and various noise detectors is provided to identify, signal, and remove contamination from an ECG signal.

Abstract: A noise detection system having a baseline wander filter (16), high- and low-pass filters (26), an adaptive line noise canceler (20), and various noise detectors is provided to identify, signal, and remove contamination from an ECG signal. The ECG signal is conditioned to remove various portions of the ECG signal prior to processing in the various noise detectors, while minimizing the signal-conditioning effect of the filters on the ECG signal.

Patent
Kjell Norén
17 Nov 1998
TL;DR: In an apparatus for removing data outliers in measured signals in an implanted medical apparatus, such as outliers produced by GSM disturbance picked up by an implanted cardiac stimulator lead and superimposed on the sensed cardiac activity signal, the signal from the implanted lead is subjected to median filtering.
Abstract: In an apparatus for removing data outliers in measured signals in an implanted medical apparatus, such as data outliers produced by GSM disturbance picked up by an implanted cardiac stimulator lead and superimposed on the sensed cardiac activity signal, the signal from the implanted lead is subjected to median filtering. The median filtering minimizes, or eliminates, the effect of highly aberrational data points in the incoming signal, without the necessity of actually removing the components in the signal produced by the disturbance from the incoming signal itself. Since no portion of the actual incoming signal is removed by the median filtering, the data integrity of the sensed cardiac signal is preserved. The signal processed by median filtering, possibly subjected to subsequent post filtering, is then supplied to a detector within the implanted stimulator, which performs the desired detection on the filtered signal, with the result of the detection then being used to control operation of the implanted stimulator.

Proceedings ArticleDOI
24 Nov 1998
TL;DR: A novel method based on the kFill algorithm is proposed that can be accomplished in a single-pass scan over the image and removes simultaneously both salt noise and pepper noise of any size smaller than the size of document objects.

Abstract: Documents containing text and graphics components are usually acquired as binary images for computer processing purposes. Salt-and-pepper noise is a prevalent artifact in such images. Removing this noise usually requires iterative or multiple-pass processing, and some techniques even cause distortions in document components. In this paper, we propose a novel method based on the kFill algorithm that can be accomplished in a single-pass scan over the image. The algorithm is capable of removing simultaneously both salt noise and pepper noise of any size smaller than the size of document objects. Results of the proposed method are given in comparison with the well-known morphological operations.

Proceedings ArticleDOI
04 Oct 1998
TL;DR: This work presents a fast, non-iterative technique for producing grayscale images from error diffused and dithered halftones and compares it to the best reported statistical smoothing, wavelet, and Bayesian algorithms to show that it delivers comparable PSNR and subjective quality at a fraction of the computation and memory requirements.
Abstract: We present a fast, non-iterative technique for producing grayscale images from error diffused and dithered halftones. The first stage of the algorithm consists of a Gaussian filter and a median filter, while the second stage consists of a bandpass filter, a thresholding operation, and a median filter. The second stage enhances the rendering of edges in the inverse halftone. We compare our algorithm to the best reported statistical smoothing, wavelet, and Bayesian algorithms to show that it delivers comparable PSNR and subjective quality at a fraction of the computation and memory requirements. For error diffused halftones, our technique is seven times faster than the MAP estimation method and 75 times faster than the wavelet method. For dithered halftones, our technique is 200 times faster than the MAP estimation method. A C implementation of the algorithm is available.
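A rough two-stage pipeline in the spirit of the description above; the filter sizes, the difference-of-Gaussians bandpass construction, and the gain and threshold values are assumptions, and the paper's tuned kernels will differ.

import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def inverse_halftone(halftone, sigma_low=1.5, sigma_high=0.8, gain=0.5,
                     edge_thresh=0.05):
    """Stage 1: Gaussian + median smoothing of the binary halftone.
    Stage 2: bandpass (difference of Gaussians), threshold to keep only
    strong edge responses, median-filter them, and add back to sharpen."""
    x = halftone.astype(float)
    base = median_filter(gaussian_filter(x, sigma_low), size=3)
    band = gaussian_filter(x, sigma_high) - gaussian_filter(x, sigma_low)
    band = np.where(np.abs(band) > edge_thresh, band, 0.0)   # keep edges only
    band = median_filter(band, size=3)
    return np.clip(base + gain * band, 0.0, 1.0)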

Proceedings ArticleDOI
12 May 1998
TL;DR: A system for machine recognition of music patterns is introduced; the problem is put into a pattern recognition framework in the sense that an error between a target pattern and a scanned pattern is minimized, where the error takes into account pitch and rhythm information.
Abstract: We introduce a system for machine recognition of music patterns. The problem is put into a pattern recognition framework in the sense that an error between a target pattern and scanned pattern is minimized. The error takes into account pitch and rhythm information. The pitch error measure consists of an absolute (objective) error and a perceptual error. The latter depends on an algorithm for establishing the tonal context which is based on Krumhansl's (1990) key-finding algorithm. The sequence of maximum correlations that it outputs is smoothed with a cubic spline and is used to determine weights for perceptual and absolute pitch errors. Maximum correlations are used to create the assigned key sequence, which is then filtered by a recursive median filter to improve the structure of the output of the key finding algorithm. A procedure for choosing weights given to pitch and rhythm errors is discussed.
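The recursive median step applied to the assigned key sequence can be sketched as follows (the window length is an assumption); each output reuses already-filtered samples, which gives stronger smoothing than the non-recursive form.

import numpy as np

def recursive_median(x, window=5):
    """Recursive median filter: the window mixes previously filtered outputs
    (left half) with raw present and future samples (right half)."""
    x = list(map(float, x))
    y = x[:]                                     # filtered sequence
    half = window // 2
    for n in range(len(x)):
        left = y[max(0, n - half):n]             # already-filtered past
        right = x[n:min(len(x), n + half + 1)]   # raw present and future
        y[n] = float(np.median(left + right))
    return y

print(recursive_median([0, 0, 7, 0, 0, 1, 1, 1, 9, 1], window=3))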

Proceedings ArticleDOI
15 Feb 1998
TL;DR: A digital filter structure is proposed to maximally remove noise from ECG signals; based on cascading a zero-phase bandpass filter, an adaptive filter, and a multi-band-pass filter, it has low implementation complexity and introduces little noise into a typical ECG.

Abstract: A digital filter structure is proposed to maximally remove noise from ECG signals. This structure is based on cascading a zero-phase bandpass filter, an adaptive filter, and a multi-band-pass filter. It provides an efficient method for removing noise from ECG signals. This filter structure has low implementation complexity and introduces little noise into a typical ECG. It can be applied to real-time applications, particularly automatic cardiac arrhythmia classifiers.

Journal ArticleDOI
H. Kong, L. Guan
TL;DR: In this article, a class of noise-exclusive adaptive filters for removing impulse noise from digital images is developed and analyzed, based on noise detection using a self-organizing neural network and noise-excluding estimation.
Abstract: A class of noise-exclusive adaptive filters for removing impulse noise from digital images is developed and analyzed in this brief. The filtering scheme is based on noise detection using a self-organizing neural network and noise excluding estimation. These filters suppress impulse noise effectively while preserving fine image details. Applications of the filters to several images show that their properties of efficient impulse noise suppression, edges and fine detail preservation, minimum signal distortion, or minimum mean square error are better than those of the traditional median-type filters.

Proceedings ArticleDOI
04 Oct 1998
TL;DR: It is shown that a significant subjective improvement in the restored image quality as well as a consistent reduction in the objectively measured mean absolute error and mean square error is obtained.
Abstract: We introduce a new class of nonlinear filters called median-rational hybrid filters (MRHF) based on rational functions (RF). The filter output is the result of a rational operation taking into account three sub-functions, such as two FIR or median sub-filters and one center weighted median filter (CWMF). The proposed MRHF filters have the inherent property that on smooth areas they provide good noise attenuation whereas on changing areas the noise attenuation is traded for good response to the change. The performance of the proposed filter is compared against widely known nonlinear filters such as: morphological signal adaptive median filters, stack filters, rank-order morphological filters and simple rational filters. It is shown that a significant subjective improvement in the restored image quality as well as a consistent reduction in the objectively measured mean absolute error and mean square error is obtained.

Proceedings ArticleDOI
12 May 1998
TL;DR: A noise filtering scheme, which is based on a multichannel homomorphic transformation, for color photographic images corrupted by signal-dependent film grain noise is proposed, and experimental results are provided.
Abstract: In this paper, we propose a noise filtering scheme, which is based on a multichannel homomorphic transformation, for color photographic images corrupted by signal-dependent film grain noise. The proposed method performs the estimation of the noise parameter using higher-order statistics (skewness or kurtosis) of the corrupted image and filtered image statistics. This parameter estimation technique can be used to generate color film grain noise that has applications in motion picture productions. After a theoretical description of the method employed, experimental results are provided.

Proceedings ArticleDOI
01 Sep 1998
TL;DR: This contribution presents a theoretical study on the performance of several multichannel noise reduction algorithms, proposes to measure the spatial complex coherence (CC), or normalized cross power spectrum, of the sound field, and shows that it can be used as a much more general tool for performance analysis.

Abstract: In this contribution we present a theoretical study on the performance of several multichannel noise reduction algorithms. It is known that the magnitude squared coherence (MSC) determines the performance of a class of adaptive algorithms, i.e., active noise control or noise reduction with a reference microphone. However, the MSC is not sufficient for performance evaluation of other noise reduction methods like the Generalized Sidelobe Canceller (GSC) or adaptive post-filter techniques. We propose to measure the spatial complex coherence (CC), or normalized cross power spectrum, of the sound field and show that it can be used as a much more general tool for performance analysis. First of all, we summarize the results of previous studies and present new results for the performance of the GSC as a function of the complex coherence. In the second part we examine different noise fields to show the theoretical limits of multichannel noise reduction schemes.
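For reference, the spatial complex coherence between two microphone signals is their normalized cross power spectrum; a minimal Welch-type estimate might look like the following sketch, where the signal names and segment length are placeholders.

import numpy as np
from scipy.signal import csd, welch

def complex_coherence(x1, x2, fs, nperseg=512):
    """Complex coherence Gamma(f) = S_x1x2(f) / sqrt(S_x1x1(f) * S_x2x2(f));
    its squared magnitude is the magnitude squared coherence (MSC)."""
    f, s12 = csd(x1, x2, fs=fs, nperseg=nperseg)
    _, s11 = welch(x1, fs=fs, nperseg=nperseg)
    _, s22 = welch(x2, fs=fs, nperseg=nperseg)
    gamma = s12 / np.sqrt(s11 * s22)
    return f, gamma                              # complex value per frequency

For an ideally diffuse (spherically isotropic) noise field, the coherence between two omnidirectional microphones spaced d apart follows sin(2*pi*f*d/c)/(2*pi*f*d/c), which is the kind of model field used when deriving theoretical performance limits.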

Journal ArticleDOI
TL;DR: This paper introduces a novel, data-adaptive filtering framework, affine order-statistic filters, and focuses on a representative of the WOS affine filter class, the median affine filter, whose behavior can be tuned from that of a linear FIR filter to that of a robust median filter by narrowing the affinity function, a process referred to as medianization.

Abstract: This paper introduces a novel, data-adaptive filtering framework: affine order-statistic filters. Affine order-statistic filters operate effectively on a wide range of signal statistics, are sensitive to the dispersion of the observed data, and are therefore particularly useful in the processing of nonstationary signals. These properties result from the introduction of a tunable affinity function that measures the affinity, or closeness, of observation samples in their natural order to their corresponding order statistics. The obtained affinity measures are utilized to control the influence of individual samples in the filtering process. Depending on the spread of the affinity function, which is controlled by a single parameter γ, affine order-statistic filters operate effectively in various environments ranging from Gaussian to impulsive. The class of affine order-statistic filters subsumes the family of weighted order-statistic (WOS) affine filters and the class of FIR affine filters. We focus on a representative of the WOS affine filter class, the median affine filter, whose behavior can be tuned from that of a linear FIR filter to that of a robust median filter by narrowing the affinity function, a process referred to as medianization. The superior performance of affine order-statistic filters is demonstrated in two applications.
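The medianization idea can be sketched with a Gaussian affinity function centered on the window median: a very large γ recovers the linear FIR behavior, while a narrow γ concentrates all influence on samples close to the median. The names and the exact affinity form are illustrative, not the paper's.

import numpy as np

def median_affine(window, fir_weights, gamma):
    """Median affine filter sketch: each sample's FIR weight is scaled by its
    affinity (closeness) to the window median, and the weighted average is
    renormalized."""
    x = np.asarray(window, dtype=float)
    w = np.asarray(fir_weights, dtype=float)
    med = np.median(x)
    affinity = np.exp(-((x - med) ** 2) / (2.0 * gamma ** 2))
    g = affinity * w
    return float(np.sum(g * x) / np.sum(g))

x = [3.0, 3.2, 50.0, 3.1, 2.9]                   # window with one outlier
w = [0.2] * 5                                    # running-mean FIR weights
print(median_affine(x, w, gamma=1e6))            # ~ linear FIR mean, pulled by the outlier
print(median_affine(x, w, gamma=0.5))            # ~ robust output near the median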

Proceedings ArticleDOI
02 Jun 1998
TL;DR: The presented scheme consists mainly of a subband-based recursive temporal filter adapted to special properties of the human visual system, supported by a low-hardware-expense spatial filter which consists of an image-analysing highpass filter bank and an adaptive lowpass FIR filter for noise reduction.

Abstract: The reduction of Gaussian noise is still an important task in video systems. The presented scheme consists mainly of a subband-based recursive temporal filter adapted to special properties of the human visual system. This filter is supported by a spatial filter with low hardware expense, which consists of an image-analysing highpass filter bank and an adaptive lowpass FIR filter for noise reduction.

Proceedings ArticleDOI
05 Oct 1998
TL;DR: The new method can accurately segment the regions of focal liver disease in sonograms, and it can be useful as a preprocessing step in the scheme for automated classification of focal liver disease in sonography.

Abstract: We have developed a simple, yet robust method for segmentation of low-contrast objects embedded in noisy images. Our technique has been applied to segmenting liver tumors with hypoechoic rims in B-scan ultrasound images. In our method, first a B-scan image is processed by a median filter for removal of speckle noise. Then several one-dimensional profiles are obtained along multiple radial directions which pass through the manually identified center of the region of a tumor. After smoothing by a Gaussian kernel smoother, these profiles are processed by sombrero (Mexican hat) continuous wavelets to yield scalograms over a range of scales. The modulus maxima lines, which represent the degree of regularity at individual points on the profiles, are then utilized for identifying candidate points on the boundary of the tumor. These detected boundary points are fitted by an ellipse and are used as an initial configuration of a wavelet snake. The wavelet snake is then deformed so that the accurate boundary of the tumor is found. A preliminary result for several metastases with various sizes of hypoechoic rims showed that our method could extract boundaries of the tumors which were close to the contours drawn by expert radiologists. Therefore, our new method can segment the regions of focal liver disease in sonograms with accuracy, and it can be useful as a preprocessing step in our scheme for automated classification of focal liver disease in sonography.

Journal ArticleDOI
TL;DR: The TEXSOM-architecture, a texture segmentation architecture based on the joint spatial/spatial-frequency paradigm, is described, which overcomes some drawbacks of similar architectures, such as the large size of the filter bank and the necessity of a priori knowledge to determine the filters' parameters.

Journal ArticleDOI
TL;DR: In this article, an image-processing software package was developed to classify wheat varieties; it achieved an overall classification accuracy of 69% on the 31 bread wheat varieties used in this work. The features determined by the fitting procedure, as well as those determined by direct extraction, are used in the classification.

Journal ArticleDOI
TL;DR: Based on previously developed sorting networks, a new VLSI architecture suitable for two-dimensional (2-D) rank-order filtering is proposed; the major advantages of the proposed architecture are its fast response time, modular structure, and simple and regular interconnection.

Abstract: Based on previously developed sorting networks, a new VLSI architecture suitable for two-dimensional (2-D) rank-order filtering is proposed. The major advantages of the proposed architecture are its fast response time, modular structure, and simple and regular interconnection. Generally speaking, the throughput of the proposed architecture is (N-1) times faster than using a one-dimensional rank-order filter for 2-D N×N data. The concept of block processing is also incorporated into the design to reduce the time-area complexity of the proposed architecture. Roughly speaking, the complexity is reduced to 2/3 and 1/2 of that of a rank-order filter and a median filter, respectively, without block processing. A 3×3 median filter with the block-processing architecture is implemented in a 0.8 μm single-poly double-metal CMOS process. The design operates correctly at clock rates up to 125 MHz.
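In software terms, the sorting-network idea for a 3×3 median reduces to a fixed sequence of compare-exchange operations with no data-dependent branching; the odd-even transposition network below is a simple (not minimal) illustration of the concept, not the paper's VLSI design.

def median9(v):
    """Median of 9 values via an odd-even transposition sorting network:
    a fixed pattern of compare-exchanges, well suited to parallel hardware."""
    v = list(v)
    assert len(v) == 9
    for stage in range(9):                       # 9 stages guarantee a full sort
        start = stage % 2                        # alternate odd/even pairings
        for i in range(start, 8, 2):
            if v[i] > v[i + 1]:
                v[i], v[i + 1] = v[i + 1], v[i]  # compare-exchange element
    return v[4]                                  # middle of the sorted 9

print(median9([7, 2, 9, 4, 1, 8, 3, 6, 5]))      # -> 5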

Journal ArticleDOI
TL;DR: The complexity of the most commonly used multivariate median filters is thoroughly analyzed and theoretical results are derived and validated against experimental data, proving that computational complexity depends mainly on the approach adopted to sort multivariate samples.

Journal ArticleDOI
TL;DR: A novel filtering algorithm is presented to restore images corrupted by impulsive noise by modeling a local region with a polynomial that can best approximate the region under the condition of least squared error, and an adaptive approach is introduced to automatically determine the orders of the polynomials.

Abstract: A novel filtering algorithm is presented to restore images corrupted by impulsive noise. As a preprocessing procedure of the noise cancellation filter, an improved impulse detector is used to generate a binary flag image, which gives each pixel a flag indicating whether it is an impulse. This flag image has two uses: (1) a pixel is modified only when it is considered an impulse; otherwise, it is left unchanged, and (2) only the values of the good pixels are employed as useful information by the noise cancellation filter. To remove noise from the corrupted image, we propose a new filter called a polynomial approximation (PA) filter, which is developed by modeling a local region with a polynomial that can best approximate the region under the condition of least squared error. Furthermore, an adaptive approach is introduced to automatically determine the orders of the polynomials. The proposed two kinds of PA filters, fixed-order and adaptive-order PA filters, are tested on images corrupted by both fixed-valued and random-valued impulsive noise. Major improvements are obtained in comparison with other state-of-the-art algorithms.
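A sketch of the restoration step for a single flagged pixel: fit a low-order 2D polynomial to the non-impulse pixels in a local window by least squares and evaluate it at the window center. The window size and the fixed order are assumptions; the paper additionally selects the order adaptively.

import numpy as np

def poly_restore_pixel(img, flags, r, c, half=2, order=2):
    """Replace the flagged pixel at (r, c) with the value of a 2D polynomial
    (total degree <= order) least-squares fitted to the good (unflagged)
    pixels inside a (2*half+1)^2 window."""
    xs, ys, vals = [], [], []
    for dr in range(-half, half + 1):
        for dc in range(-half, half + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1] and not flags[rr, cc]:
                xs.append(dr); ys.append(dc); vals.append(float(img[rr, cc]))
    # Monomial basis x^i * y^j with i + j <= order.
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    if len(vals) < len(terms):                   # too few good pixels: fall back
        return float(np.median(vals)) if vals else float(img[r, c])
    A = np.array([[x**i * y**j for (i, j) in terms] for x, y in zip(xs, ys)], float)
    coef, *_ = np.linalg.lstsq(A, np.array(vals), rcond=None)
    # At the window center (x = y = 0) only the constant term survives.
    return float(coef[terms.index((0, 0))])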

Journal ArticleDOI
TL;DR: From a visual inspection of restored images, it is clear that the proposed adaptive-neighborhood filter provides greater noise suppression than fixed-neighborhood restoration methods.

Abstract: Multiplicative noise is a type of signal-dependent noise where brighter areas of the image appear noisier. A popular class of image restoration methods is based on the local mean, median, and variance. However, simple 3×3 filters do not take the nonstationary nature of the image and/or noise into account, and the restoration achieved by such filters may not be effective. We present a new adaptive-neighborhood or region-based noise filtering technique for restoring images with multiplicative noise. The method is based on finding variable-shaped, variable-sized adaptive neighborhoods for each pixel in the image, followed by the application of a filter specifically designed for multiplicative noise based on statistical parameters computed over the adaptive neighborhoods. From a visual inspection of restored images, it is clear that the proposed adaptive-neighborhood filter provides greater noise suppression than fixed-neighborhood restoration methods. The proposed method, unlike fixed-neighborhood methods, does not blur or clip object boundaries or corners. The mean squared errors between the results of the proposed method and the original images are considerably lower than those for the results of the fixed-neighborhood methods studied, indicating that the image and noise statistics are better estimated by the adaptive-neighborhood method.