
Showing papers on "Median filter published in 2002"


Book
01 Jan 2002

1,425 citations


Journal ArticleDOI
TL;DR: A new impulse noise detection technique for switching median filters is presented, which is based on the minimum absolute value of four convolutions obtained using one-dimensional Laplacian operators, and is directed toward improved line preservation.
Abstract: A new impulse noise detection technique for switching median filters is presented, which is based on the minimum absolute value of four convolutions obtained using one-dimensional Laplacian operators. Extensive simulations show that the proposed filter provides better performance than many of the existing switching median filters with comparable computational complexity. In particular, the proposed filter is directed toward improved line preservation.
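The detection rule described above can be sketched in a few lines: for each pixel, take the absolute responses of four one-dimensional Laplacian operators (horizontal, vertical, and the two diagonals) and treat the pixel as an impulse only when the minimum response is large; edge and line pixels give a near-zero response in at least one direction and are left alone. The kernel orientation set and the threshold below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def switching_median(img, thresh=40):
    """Switching median filter sketch: a pixel is filtered only if the
    minimum absolute response of four 1-D Laplacians exceeds `thresh`."""
    img = img.astype(float)
    h, w = img.shape
    out = img.copy()
    # neighbour offsets for each 1-D Laplacian direction ([-1, 2, -1])
    dirs = [((0, -1), (0, 1)),    # horizontal
            ((-1, 0), (1, 0)),    # vertical
            ((-1, -1), (1, 1)),   # diagonal
            ((-1, 1), (1, -1))]   # anti-diagonal
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # |2*p - a - b| for each direction
            resp = [abs(2 * img[y, x] - img[y + a[0], x + a[1]]
                        - img[y + b[0], x + b[1]]) for a, b in dirs]
            if min(resp) > thresh:  # high response in every direction => impulse
                out[y, x] = np.median(img[y - 1:y + 2, x - 1:x + 2])
    return out
```

A salt pixel in a flat region triggers all four detectors and is replaced, while a pixel on a thin bright line has a near-zero response along the line direction and is preserved.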

688 citations


Journal ArticleDOI
TL;DR: Data acquisition and signal processing issues relative to producing an amplitude estimate of surface EMG, and methods for estimating the amplitude of the EMG are reviewed.

586 citations


Proceedings ArticleDOI
10 Jul 2002
TL;DR: Experimental results demonstrate that the proposed mesh mean and median filtering methods are more stable than conventional Laplacian and mean curvature flows and offer one possible solution to the oversmoothing problem.
Abstract: This paper presents frameworks to extend the mean and median filtering schemes in image processing to smoothing noisy 3D shapes given by triangle meshes. The frameworks consist of the application of the mean and median filters to face normals on triangle meshes and the editing of mesh vertex positions to make them fit the modified normals. We also give a quantitative evaluation of the proposed mesh filtering schemes and compare them with conventional mesh smoothing methods such as Laplacian smoothing flow and mean curvature flow. The quantitative evaluation is performed using error metrics on mesh vertices and normals. Experimental results demonstrate that our mesh mean and median filtering methods are more stable than conventional Laplacian and mean curvature flows. We propose these new mesh smoothing methods as one possible solution to the oversmoothing problem.

217 citations


Journal ArticleDOI
TL;DR: Simulations as well as real application results for EEG-signal noise elimination are included to show the validity and effectiveness of the proposed approach.
Abstract: In many applications of signal processing, especially in communications and biomedicine, preprocessing is necessary to remove noise from data recorded by multiple sensors. Typically, each sensor or electrode measures the noisy mixture of original source signals. In this paper a noise reduction technique using independent component analysis (ICA) and subspace filtering is presented. In this approach we apply subspace filtering not to the observed raw data but to a demixed version of these data obtained by ICA. Finite impulse response filters are employed whose vectors are parameters estimated based on signal subspace extraction. ICA allows us to filter independent components. After the noise is removed we reconstruct the enhanced independent components to obtain clean original signals; i.e., we project the data to sensor level. Simulations as well as real application results for EEG-signal noise elimination are included to show the validity and effectiveness of the proposed approach.

185 citations


Journal ArticleDOI
07 Aug 2002
TL;DR: In this paper, two perceptual experiments that compare the perceptual quality of the output of different demosaicing algorithms are reported, and it is found that a Bayesian demosaicing algorithm produced the most preferred images.
Abstract: Demosaicing is an important part of the image-processing chain for many digital color cameras. The demosaicing operation converts a raw image acquired with a single sensor array, overlaid with a color filter array, into a full-color image. In this paper, we report the results of two perceptual experiments that compare the perceptual quality of the output of different demosaicing algorithms. In the first experiment, we found that a Bayesian demosaicing algorithm produced the most preferred images. Detailed examination of the data, however, indicated that the good performance of this algorithm was at least in part due to the fact that it sharpened the images while it demosaiced them. In a second experiment, we silenced image sharpness as a factor by applying a sharpening algorithm to the output of each demosaicing algorithm. The optimal amount of sharpening to be applied to each image was chosen using the results of a preliminary experiment. Once sharpness was equated in this way, an algorithm developed by Freeman, based on bilinear interpolation combined with median filtering, gave the best results. An analysis of our data suggests that our perceptual results cannot be easily predicted using an image metric.
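The Freeman-style step favoured in the second experiment can be sketched as a post-process on an already bilinearly demosaiced image: median filter the R−G and B−G colour-difference planes and rebuild R and B from the filtered differences, which suppresses colour fringes at edges. The function below assumes full-resolution r, g, b arrays are already available (the bilinear interpolation itself is not shown); window size is an illustrative choice.

```python
import numpy as np

def freeman_postprocess(r, g, b, win=3):
    """Median-of-colour-differences step: filter R-G and B-G planes,
    then rebuild R and B. Borders are left unfiltered for brevity."""
    def med(plane):
        out = plane.copy()
        k = win // 2
        h, w = plane.shape
        for y in range(k, h - k):
            for x in range(k, w - k):
                out[y, x] = np.median(plane[y - k:y + k + 1, x - k:x + k + 1])
        return out
    r2 = g + med(r - g)
    b2 = g + med(b - g)
    return r2, g, b2
```

Because natural images have slowly varying colour differences, an isolated interpolation error shows up as an outlier in the difference plane and is removed by the median.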

156 citations


Journal ArticleDOI
TL;DR: A new phase noise model in the complex domain is introduced and validated by using both simulated and real interferograms, where a novel noise reduction algorithm, which is not based on a windowing process and without the necessity of phase unwrapping, is addressed.
Abstract: This paper addresses the problem of interferometric phase noise reduction in synthetic aperture radar interferometry. A new phase noise model in the complex domain is introduced and validated using both simulated and real interferograms. This noise model is also derived in the complex wavelet domain, where a novel noise reduction algorithm is developed that is not based on a windowing process and does not require phase unwrapping. The use of the wavelet transform maintains the spatial resolution in the filtered phase image and prevents filtering of low-coherence areas. Using both simulated and real interferometric phase images, the performance of this algorithm, in terms of noise reduction, spatial resolution maintenance, and computational efficiency, is reported and compared with other conventional filtering approaches.

155 citations


Journal ArticleDOI
TL;DR: The results show that the proposed method outperforms most of the basic algorithms for the reduction of impulsive noise in color images.

113 citations


Journal ArticleDOI
01 Apr 2002
TL;DR: A novel approach to the restoration of noise-corrupted image is presented, which is particularly effective at removing highly impulsive noise while preserving image details through a fuzzy smoothing filter constructed from a set of fuzzy membership functions for which the initial parameters are derived in accordance with input histogram.
Abstract: In this paper, we present a novel approach to the restoration of noise-corrupted images, which is particularly effective at removing highly impulsive noise while preserving image details. This is accomplished through a fuzzy smoothing filter constructed from a set of fuzzy membership functions for which the initial parameters are derived in accordance with the input histogram. A principle of conservation of histogram potential is incorporated with input statistics to adjust the initial parameters so as to minimize the discrepancy between a reference intensity and the output of the defuzzification process. Similar to median filters (MF), the proposed filter has the benefits that it is simple and assumes no a priori knowledge of the specific input image, yet it shows superior performance over conventional filters (including MF) for the full range of impulsive noise probabilities. Unlike many neuro-fuzzy or fuzzy-neuro filters, where a random strategy is employed to choose initial membership functions for subsequent lengthy training, the proposed filter achieves satisfactory performance without any training.

108 citations


Journal ArticleDOI
TL;DR: The topological median filter implements some existing ideas and some new ideas on fuzzy connectedness to improve, over a conventional median filter, the extraction of edges in noise.
Abstract: This paper describes the definition and testing of a new type of median filter for images. The topological median filter implements some existing ideas and some new ideas on fuzzy connectedness to improve, over a conventional median filter, the extraction of edges in noise. The concept of α-connectivity is defined and used to create an algorithm for computing the degree of connectedness of a pixel to all the other pixels in an arbitrary neighborhood. The resulting connectivity map of the neighborhood effectively disconnects peaks in the neighborhood that are separated from the center pixel by a valley in the brightness topology. The median of the connectivity map is an estimate of the median of the peak or plateau to which the center pixel belongs. Unlike the conventional median filter, the topological median is relatively unaffected by disconnected features in the neighborhood of the center pixel. Four topological median filters are defined. Qualitative and statistical analyses of the four filters are presented. It is demonstrated that edge detection can be more accurate on topologically median filtered images than on conventionally median filtered images.

106 citations


Journal ArticleDOI
TL;DR: The universal quality index, proposed in this paper to measure the effectiveness of denoising, suggests that the anisotropic median-diffusion filter can retain adherence to the original image intensities and contrasts better than other filters.
Abstract: We propose a new anisotropic diffusion filter for denoising low-signal-to-noise molecular images. This filter, which incorporates a median filter into the diffusion steps, is called an anisotropic median-diffusion filter. This hybrid filter achieved much better noise suppression with minimum edge blurring compared with the original anisotropic diffusion filter when it was tested on an image created based on a molecular image model. The universal quality index, proposed in this paper to measure the effectiveness of denoising, suggests that the anisotropic median-diffusion filter can retain adherence to the original image intensities and contrasts better than other filters. In addition, the performance of the filter is less sensitive to the selection of the image gradient threshold during diffusion, thus making automatic image denoising easier than with the original anisotropic diffusion filter. The anisotropic median-diffusion filter also achieved good denoising results on a piecewise-smooth natural image and real Raman molecular images.
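The interleaving idea above can be sketched as a Perona-Malik diffusion step followed by a 3×3 median step on each iteration. The conduction function, constants, and border handling below are illustrative assumptions, not the paper's calibrated choices.

```python
import numpy as np

def median_diffusion(img, n_iter=5, kappa=20.0, lam=0.2):
    """Anisotropic median-diffusion sketch: alternate a Perona-Malik
    diffusion update with a 3x3 median step."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Perona-Malik step with conduction g(d) = exp(-(d/kappa)^2)
        d_n = np.roll(u, 1, 0) - u
        d_s = np.roll(u, -1, 0) - u
        d_e = np.roll(u, 1, 1) - u
        d_w = np.roll(u, -1, 1) - u
        flux = sum(d * np.exp(-(d / kappa) ** 2) for d in (d_n, d_s, d_e, d_w))
        u = u + lam * flux
        # median step (interior pixels only, for brevity)
        v = u.copy()
        for y in range(1, u.shape[0] - 1):
            for x in range(1, u.shape[1] - 1):
                v[y, x] = np.median(u[y - 1:y + 2, x - 1:x + 2])
        u = v
    return u
```

On a flat noisy patch the hybrid strongly suppresses noise; near strong edges the conduction term shuts the diffusion down while the median removes residual impulses.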

Patent
Ben Weiss
22 Apr 2002
TL;DR: In this paper, a histogram hierarchy is built using multiple layers, each layer defining a level of statistical resolution, and image data statistics are added and subtracted using a multiplicity of histograms from the hierarchy, where each histogram describes an image area.
Abstract: The invention is a method and apparatus for improving image-processing applications. Embodiments of the invention provide methods for preserving computation results and offer intermediary computation steps to allow the processing of images at any location to take advantage of previously processed image areas. The preferred embodiment offers a method for building a median filter that significantly improves the processing speed over basic techniques. By building a histogram hierarchy, image data statistics are added and subtracted using a multiplicity of histograms from the histogram hierarchy, where each histogram describes an image area. Furthermore, a histogram hierarchy is built using multiple layers, each layer defining a level of statistical resolution.
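The incremental-histogram idea behind this patent can be illustrated in one dimension: as the window slides, only the samples entering and leaving it update the histogram, and the median is read off by walking the bins, so per-sample cost does not grow with window content re-sorting. The patent's full scheme uses a hierarchy of histograms over image areas; the sketch below shows only the base mechanism, assumes 8-bit values, and clamps the window at the signal edges.

```python
import numpy as np

def sliding_median_hist(signal, radius=2):
    """Sliding-window median via an incrementally updated 256-bin
    histogram (lower median for even-sized edge windows)."""
    n = len(signal)
    out = np.empty(n, dtype=int)
    hist = np.zeros(256, dtype=int)
    lo, hi = 0, min(n, radius + 1)          # current window [lo, hi)
    for v in signal[lo:hi]:
        hist[v] += 1
    for i in range(n):
        new_lo, new_hi = max(0, i - radius), min(n, i + radius + 1)
        for v in signal[hi:new_hi]:         # samples entering the window
            hist[v] += 1
        for v in signal[lo:new_lo]:         # samples leaving the window
            hist[v] -= 1
        lo, hi = new_lo, new_hi
        # walk the histogram up to the median rank
        rank, target = 0, (hi - lo - 1) // 2
        for value in range(256):
            rank += hist[value]
            if rank > target:
                out[i] = value
                break
    return out
```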

Journal ArticleDOI
TL;DR: Two new algorithms for automated processing of liquid chromatography/mass spectrometry (LC/MS) data are presented, and median filtering and vectorized peak detection provide increased robustness with respect to variation in the noise and artifact distribution compared to methods based on determining an intensity threshold for the entire dataset.
Abstract: Two new algorithms for automated processing of liquid chromatography/mass spectrometry (LC/MS) data are presented. These algorithms were developed from an analysis of the noise and artifact distribution in such data. The noise distribution was analyzed by preparing histograms of the signal intensity in LC/MS data. These histograms are well fit by a sum of two normal distributions in the log scale. One new algorithm, median filtering, provides increased performance compared to averaging adjacent scans in removing noise that is not normally distributed in the linear scale. Another new algorithm, vectorized peak detection, provides increased robustness with respect to variation in the noise and artifact distribution compared to methods based on determining an intensity threshold for the entire dataset. Vectorized peak detection also permits the incorporation of existing algorithms for peak detection in ion chromatograms and/or mass spectra. The application of these methods to LC/MS spectra of complex biological samples is described.

Journal ArticleDOI
TL;DR: In an active noise control (ANC) system using the filtered-x least mean square (FxLMS) algorithm, an online secondary path modeling method that uses an injected auxiliary noise is often applied, increasing the residual noise of the ANC system.
Abstract: In an active noise control (ANC) system using the filtered-x least mean square (FxLMS) algorithm, an online secondary path modeling method that uses an injected auxiliary noise is often applied. Such a method allows quick and full-band signal-independent modeling. In addition, it is suitable for multi-secondary-path modeling. Normally, the larger the auxiliary noise, the faster an accurate model can be obtained. However, it increases the residual noise of the ANC system. To mitigate this problem, in this letter, a new online secondary path modeling method is proposed. Rather than fixed, the power of the auxiliary noise is varied according to the working status of the ANC system. More specifically, the auxiliary noise is large before the ANC system converges, and becomes small when the system converges. Computer simulations show its effectiveness and robustness.
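The gain-scheduling idea can be reduced to a single scalar rule: scale the injected auxiliary noise by the (smoothed) residual power of the ANC system, so the noise is large before convergence and small after. The scheduling function and all constants below are illustrative assumptions, not the letter's exact rule.

```python
def aux_noise_gain(residual_power, g_max=1.0, g_min=0.05, p_ref=1.0):
    """Variable auxiliary-noise gain sketch: gain rises toward g_max
    when residual power is high (unconverged) and falls toward g_min
    when the system has converged."""
    g = g_max * residual_power / (residual_power + p_ref)
    return max(g, g_min)
```

In a full FxLMS loop this gain would multiply the white modeling noise before it is added to the secondary-source output.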

Journal ArticleDOI
TL;DR: A way to automatically detect the best neighborhood size for a local projective noise reduction filter using concepts from the recurrence quantification analysis in order to adaptively tune the filter along the incoming time series is proposed.
Abstract: We propose a way to automatically detect the best neighborhood size for a local projective noise reduction filter, where a typical problem is the proper identification of the noise level. Here we make use of concepts from the recurrence quantification analysis in order to adaptively tune the filter along the incoming time series. We define an index, to be computed via recurrence plots, whose minimum gives a clear indication of the best size of the neighborhood in the embedding space. Comparison of the local projective noise reduction filter using this optimization scheme with the state of the art is also provided.

Journal ArticleDOI
TL;DR: The technique has been tested on a chest phantom simulating the lungs, heart cavity and the spine, the Rando-Alderson phantom, and whole-body clinical PET studies showing a remarkable improvement in image quality, a clear reduction of noise propagation from transmission into emission data allowing for reduction of transmission scan duration.
Abstract: Segmented attenuation correction is now a widely accepted technique to reduce noise propagation from transmission scanning in positron emission tomography (PET). In this paper, we present a new method for segmenting transmission images in whole-body scanning. This reduces the noise in the correction maps while still correcting for differing attenuation coefficients of specific tissues. Based on the fuzzy C-means (FCM) algorithm, the method segments the PET transmission images into a given number of clusters to extract specific areas of differing attenuation such as air, the lungs and soft tissue, preceded by a median filtering procedure. The reconstructed transmission image voxels are, therefore, segmented into populations of uniform attenuation based on knowledge of the human anatomy. The clustering procedure starts with an overspecified number of clusters followed by a merging process to group clusters with similar properties (redundant clusters) and removal of some undesired substructures using anatomical knowledge. The method is unsupervised, adaptive and allows the classification of both pre- and post-injection transmission images obtained using either coincident 68Ge or single-photon 137Cs sources into main tissue components in terms of attenuation coefficients. A high-quality transmission image of the scanner bed is obtained from a high statistics scan and added to the transmission image. The segmented transmission images are then forward projected to generate attenuation correction factors to be used for the reconstruction of the corresponding emission scan. The technique has been tested on a chest phantom simulating the lungs, heart cavity and spine, on the Rando-Alderson phantom, and on whole-body clinical PET studies, showing a remarkable improvement in image quality and a clear reduction of noise propagation from transmission into emission data, allowing the transmission scan duration to be reduced.
There was very good correlation (R2 = 0.96) between maximum standardized uptake values (SUVs) in lung nodules measured on images reconstructed with measured and segmented attenuation correction, with a statistically significant decrease in SUV (17.03% +/- 8.4%, P < 0.01) on the latter images, whereas no statistically significant difference in the average SUVs was observed. Finally, the potential of the FCM algorithm as a segmentation method and its limitations, as well as other prospective applications of the technique, are discussed.
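The clustering core of the method is standard fuzzy C-means. A minimal one-dimensional version, as it might be applied to attenuation values, is sketched below; the paper's cluster over-specification, merging step, and anatomical post-processing are omitted, and the quantile-based initialization is an assumption for reproducibility.

```python
import numpy as np

def fcm_1d(x, c=3, m=2.0, n_iter=50):
    """Minimal 1-D fuzzy C-means. Returns centroids and the (c, n)
    membership matrix u, where u[i, k] is the degree to which sample k
    belongs to cluster i."""
    x = np.asarray(x, dtype=float)
    centers = np.quantile(x, (np.arange(c) + 0.5) / c)  # spread-out init
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12       # (c, n)
        # u[i,k] = 1 / sum_j (d[i,k]/d[j,k])^(2/(m-1))
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)),
                         axis=1)
        # weighted centroid update
        centers = (u ** m @ x) / np.sum(u ** m, axis=1)
    return centers, u
```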

Journal ArticleDOI
TL;DR: The foundation of the proposed filtering algorithms lies in the definition of the myriad as a tunable estimator of location derived from the theory of robust statistics, and several fundamental properties of this estimator are proved and show its optimality in practical impulsive models.
Abstract: Linear filtering theory has been largely motivated by the characteristics of Gaussian signals. In the same manner, the proposed Myriad Filtering methods are motivated by the need for a flexible filter class with high statistical efficiency in non-Gaussian impulsive environments that can appear in practice. Myriad filters have a solid theoretical basis, are inherently more powerful than median filters, and are very general, subsuming traditional linear FIR filters. The foundation of the proposed filtering algorithms lies in the definition of the myriad as a tunable estimator of location derived from the theory of robust statistics. We prove several fundamental properties of this estimator and show its optimality in practical impulsive models such as the α-stable and generalized-t. We then extend the myriad estimation framework to allow the use of weights. In the same way as linear FIR filters become a powerful generalization of the mean filter, filters based on running myriads reach all of their potential when a weighting scheme is utilized. We derive the "normal" equations for the optimal myriad filter, and introduce a suboptimal methodology for filter tuning and design. The strong potential of myriad filtering and estimation in impulsive environments is illustrated with several examples.
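The sample myriad itself is easy to state: for observations x_i and linearity parameter k, it is the value θ minimizing Σ_i log(k² + (x_i − θ)²); k tunes the estimator between a mode-like behaviour (k → 0) and the sample mean (k → ∞). A brute-force grid-search sketch, for clarity rather than efficiency:

```python
import numpy as np

def sample_myriad(x, k=1.0, grid=None):
    """Sample myriad: argmin over theta of sum(log(k^2 + (x - theta)^2)).
    Grid search over the data range; resolution is an illustrative choice."""
    x = np.asarray(x, dtype=float)
    if grid is None:
        grid = np.linspace(x.min(), x.max(), 2001)
    cost = np.array([np.sum(np.log(k * k + (x - t) ** 2)) for t in grid])
    return grid[np.argmin(cost)]
```

With a small k the estimator ignores impulsive outliers that would drag the mean far away; with a very large k it reproduces the mean, illustrating why myriad filters subsume linear FIR behaviour.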

Journal ArticleDOI
TL;DR: A class of robust weighted median (WM) sharpening algorithms is developed that can prove useful in the enhancement of compressed or noisy images posted on the World Wide Web as well as in other applications where the underlying images are unavoidably acquired with noise.
Abstract: A class of robust weighted median (WM) sharpening algorithms is developed in this paper. Unlike traditional linear sharpening methods, weighted median sharpeners are shown to be less sensitive to background random noise or to image artifacts introduced by JPEG and other compression algorithms. These concepts are extended to include data dependent weights under the framework of permutation weighted medians leading to tunable sharpeners that, in essence, are insensitive to noise and compression artifacts. Permutation WM sharpeners are subsequently generalized to smoother/sharpener structures that can sharpen edges and image details while simultaneously filter out background random noise. A statistical analysis of the various algorithms is presented, theoretically validating the characteristics of the proposed sharpening structures. A number of experiments are shown for the sharpening of JPEG compressed images and sharpening of images with background film-grain noise. These algorithms can prove useful in the enhancement of compressed or noisy images posted on the World Wide Web (WWW) as well as in other applications where the underlying images are unavoidably acquired with noise.
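The building block of all of these structures is the weighted median, which replaces the linear weighted mean with the value minimizing a weighted sum of absolute deviations; it is found by sorting the samples and stopping where the cumulative weight reaches half the total. A minimal sketch:

```python
def weighted_median(values, weights):
    """Weighted median: the sample value minimizing
    sum_i w_i * |x_i - theta| (lower weighted median on ties)."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    acc = 0.0
    for v, w in pairs:
        acc += w
        if acc * 2 >= total:
            return v
```

A large weight on one sample pulls the output toward it, which is how weighting shapes the passband of a running weighted-median filter.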

Journal ArticleDOI
L. Landmann
TL;DR: The effects of median filtering and deconvolution, two image‐processing techniques enhancing the signal‐to‐noise ratio (SNR) on the results of colocalization analysis in confocal data sets of biological specimens are examined.
Abstract: Background and noise impair image quality by affecting resolution and obscuring image detail in the low intensity range. Because background levels in unprocessed confocal images are frequently at about 30% maximum intensity, colocalization analysis, a typical segmentation process, is limited to high intensity signal and prone to noise-induced, false-positive events. This makes suppression or removal of background crucial for this kind of image analysis. This paper examines the effects of median filtering and deconvolution, two image-processing techniques enhancing the signal-to-noise ratio (SNR), on the results of colocalization analysis in confocal data sets of biological specimens. The data show that median filtering can improve the SNR by a factor of 2. The technique eliminates noise-induced colocalization events successfully. However, because filtering recovers voxel values from the local neighbourhood, false-negative ('dissipation' of signal intensity below the threshold value) as well as false-positive ('fusion' of noise with low-intensity signal, resulting in above-threshold intensities) results can be generated. In addition, filtering involves the convolution of an image with a kernel, a procedure that inherently impairs resolution. Image restoration by deconvolution avoids both of these disadvantages. Such routines calculate a model of the object considering various parameters that impair image formation and are able to suppress background down to very low levels (< 10% maximum intensity, resulting in an SNR improved by a factor of 3 as compared to raw images). This makes additional objects in the low intensity but high frequency range available to analysis. In addition, removal of noise and distortions induced by the optical system results in improved resolution, which is of critical importance in cases involving objects of near resolution size. The technique is, however, sensitive to overestimation of the background level.
In conclusion, colocalization analysis will be improved by deconvolution more than by filtering. This applies especially to specimens characterized by small object size and/or low intensities.

Patent
19 Aug 2002
TL;DR: An adaptive noise suppression system includes an input A/D converter, an analyzer, a filter, and an output D/A converter; the analyzer includes both feed-forward and feedback signal paths that allow it to compute a filtering coefficient, which is input to the filter.
Abstract: An adaptive noise suppression system includes an input A/D converter, an analyzer, a filter, and an output D/A converter. The analyzer includes both feed-forward and feedback signal paths that allow it to compute a filtering coefficient, which is input to the filter. In these paths, feed-forward signals are processed by a signal-to-noise ratio estimator, a normalized coherence estimator, and a coherence mask, while feedback signals are processed by an auditory mask estimator. These two signal paths are coupled together via a noise suppression filter estimator. A method according to the present invention includes active signal processing to preserve speech-like signals and suppress incoherent noise signals. After a signal is processed in the feed-forward and feedback paths, the noise suppression filter estimator outputs a filtering coefficient signal to the filter for filtering the noise out of the speech-and-noise digital signal.

01 Jan 2002
TL;DR: A new architecture and optimizations for Median filter implementation with FPGAs are introduced allowing real-time processing and a minimum use of resources.
Abstract: Image processing is a very important field within factory automation and, more specifically, within automated visual inspection. The main challenge is normally the requirement of real-time results. On the other hand, in many of these applications, the existence of impulsive noise in the acquired images is one of the most common problems. The median filter is a robust method to remove impulsive noise from an image, but it is a computationally intensive operation, so it is hard to implement in real time. This paper introduces a new architecture and optimizations for its implementation with FPGAs. The practical results show the effectiveness of our improvements, allowing real-time processing with a minimum use of resources.

Journal ArticleDOI
TL;DR: Experimental results reveal that the use of properly chosen other filters along with the wavelet filter gives better enhancement, and that if the data are preprocessed by median filtering before implementing the wavelet filtering, then a weakly diffusing object requires a higher-kernel filter and a strongly diffusing object requires a lower-kernel one.
Abstract: We have investigated the utility of wavelet filters for enhancing contrast in the digital speckle pattern interferometric fringes obtained from a vibrating loudspeaker diaphragm and from the vibrating bracket of an electric motor. Experimental results reveal that the use of properly chosen other filters along with the wavelet filter gives better enhancement. Our investigations also show that if the data are preprocessed by median filtering before implementing the wavelet filtering, then a weakly diffusing object requires a higher-kernel filter and a strongly diffusing object requires a lower-kernel one. © 2002 Society of Photo-Optical Instrumentation Engineers.

Patent
12 Jun 2002
TL;DR: In this article, an efficient and non-iterative post processing method and system is proposed for mosquito noise reduction in DCT block-based decoded images, which is based on a simple classification that segments a picture in multiple regions such as Edge, Near Edge, Flat, Near Flat and Texture regions.
Abstract: An efficient and non-iterative post-processing method and system is proposed for mosquito noise reduction in DCT block-based decoded images. The post-processing is based on a simple classification that segments a picture into multiple regions such as Edge, Near Edge, Flat, Near Flat and Texture regions. The proposed technique also comprises an efficient and shape-adaptive local power estimation for equivalent additive noise and provides simple noise power weighting for each of the above-cited regions. An MMSE or MMSE-like noise reduction with robust and effective shape-adaptive windowing is utilized for smoothing mosquito and/or random noise over the whole image, particularly in Edge regions. Finally, for chrominance components, the proposed technique also comprises efficient shape-adaptive local noise power estimation and correction.

01 Jan 2002
TL;DR: A new frequency domain filter for periodic and quasi-periodic noise reduction is introduced, which completely eliminates periodic noise and shows quite good results on quasi-periodic noise while completely preserving the image boundaries.
Abstract: Removal of periodic and quasi-periodic patterns from photographs is an important problem. There are many sources of such periodic noise; e.g., the resolution of the scanner used to scan the image affects the high-frequency noise pattern in the acquired image and can produce moire patterns. It is also characteristic of gray-scale images obtained from single-chip video cameras. Periodic and quasi-periodic noise usually produces peaks in the image spectrum amplitude. Considering this, processing in the frequency domain is a much better solution than spatial domain operations (blurring, for example, which can hide the periodic patterns at the cost of reduced edge sharpness). A new frequency domain filter for periodic and quasi-periodic noise reduction is introduced in this paper. This filter analyzes the image spectrum amplitude using a local window, checks every spectral coefficient to determine whether it needs filtering and, if so, replaces it with the median taken from the local window. To detect the peaks in the spectrum amplitude, the ratio of the current amplitude value to the median value is used. It is shown that this ratio is stable for non-corrupted spectral coefficients independently of the frequencies they correspond to, so it is invariant to the position of the peaks in the spectrum amplitude. This kind of filtering completely eliminates periodic noise and shows quite good results on quasi-periodic noise while completely preserving the image boundaries.
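The peak-detection-and-replacement rule can be sketched directly: compare each spectral amplitude with the median of a local spectral window and, where the ratio exceeds a threshold, shrink the coefficient to that median while keeping its phase. Window size and threshold below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def remove_periodic_noise(img, win=5, ratio_thresh=3.0):
    """Frequency-domain median filter sketch: amplitudes standing out
    from the local spectral median are treated as periodic-noise peaks
    and shrunk to that median; phase is preserved."""
    F = np.fft.fft2(img)
    amp = np.abs(F)
    h, w = amp.shape
    r = win // 2
    out = F.copy()
    for y in range(h):
        for x in range(w):
            if y == 0 and x == 0:
                continue  # keep the DC coefficient
            ys = [(y + dy) % h for dy in range(-r, r + 1)]
            xs = [(x + dx) % w for dx in range(-r, r + 1)]
            local = np.median(amp[np.ix_(ys, xs)])
            if local > 0 and amp[y, x] > ratio_thresh * local:
                out[y, x] *= local / amp[y, x]  # shrink amplitude, keep phase
    return np.real(np.fft.ifft2(out))
```

On an image corrupted by a pure sinusoidal pattern, the two spectral peaks stand far above their local medians and are suppressed, while ordinary image and noise coefficients rarely exceed the ratio threshold.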

Journal ArticleDOI
TL;DR: Two optimization techniques of weighted vector median filters, parametrized by a set of N weights, are proposed for colour image processing and evaluated by simulations related to the denoising of textured, or natural, colour images, in the presence of impulsive noise.
Abstract: Weighted vector median filters (WVMF) are a powerful tool for the non-linear processing of multi-component signals. These filters are parametrized by a set of N weights and, in this paper, we propose two techniques for optimizing these weights for colour image processing. The first is an adaptive optimization of the N−1 peripheral weights of the filter mask. The major and more difficult task is to obtain a mathematical expression for the derivative of the WVMF output with respect to its weights; two approximations are proposed to measure this filter output sensitivity. The second optimization technique corresponds to a global optimization of the central weight alone, the value of which is determined, in a noise reduction context, by an analytical expression depending upon the mask size and the probability of occurrence of impulsive noise. Both approaches are evaluated by simulations related to the denoising of textured, or natural, colour images in the presence of impulsive noise. Furthermore, as they are complementary, they are also tested when used together.
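The core WVMF operation is easy to state: the output is the input vector minimizing the weighted sum of distances to all vectors in the mask. A brute-force sketch (O(N²) per window, which is fine for small masks):

```python
import numpy as np

def weighted_vector_median(pixels, weights=None):
    """Weighted vector median of a set of colour vectors: return the
    input vector minimizing sum_j w_j * ||p_i - p_j||_2."""
    pixels = np.asarray(pixels, dtype=float)
    n = len(pixels)
    if weights is None:
        weights = np.ones(n)
    # pairwise Euclidean distances, shape (n, n)
    dists = np.linalg.norm(pixels[:, None, :] - pixels[None, :, :], axis=2)
    costs = dists @ np.asarray(weights, dtype=float)
    return pixels[np.argmin(costs)]
```

Because the output is always one of the input vectors, no false colours are created, which is the key advantage of vector-median filtering over component-wise filtering of colour images.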

Proceedings ArticleDOI
13 May 2002
TL;DR: A noise estimation method based on spectral entropy using histogram of intensity instead of estimation methodbased on median absolute deviation (MAD) and a modified hard thresholding to alleviate time-frequency discontinuities are suggested.
Abstract: We consider non-stationary or colored noise estimation by the wavelet thresholding method. First, we propose node-dependent thresholding for adaptation to colored or non-stationary noise. Next, we suggest a noise estimation method based on spectral entropy using a histogram of intensity, instead of an estimation method based on the median absolute deviation (MAD). We also use modified hard thresholding to alleviate time-frequency discontinuities. The proposed methods are evaluated under various noise conditions: white Gaussian noise, car interior noise, F-16 cockpit noise, pink noise, and speech babble noise. We compare our proposed methods with the conventional method using level-dependent thresholding based on MAD.
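For reference, the conventional MAD-based noise estimate and the universal hard threshold that this work improves upon can be sketched as follows (the paper's modified thresholding itself is not reproduced here):

```python
import numpy as np

def mad_sigma(detail_coeffs):
    """Conventional noise estimate the paper compares against:
    sigma = median(|d|) / 0.6745, taken over the finest-scale
    wavelet detail coefficients."""
    return np.median(np.abs(detail_coeffs)) / 0.6745

def hard_threshold(coeffs, sigma, n):
    """Universal hard threshold t = sigma * sqrt(2 log n): zero every
    coefficient below t, keep the rest unchanged. The abrupt cut at t
    is the time-frequency discontinuity a modified rule smooths."""
    t = sigma * np.sqrt(2.0 * np.log(n))
    out = coeffs.copy()
    out[np.abs(out) < t] = 0.0
    return out
```

For unit-variance Gaussian detail coefficients, `mad_sigma` returns a value close to 1, since the median absolute value of a standard normal sample is about 0.6745.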

Journal ArticleDOI
TL;DR: In this article, a comparison of traditional linear filters popular in the jet engine industry is made with the median filter and the subfilter weighted FIR median hybrid (SWFMH) filter results with simulated data with implanted faults.
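The FIR median hybrid family behind the SWFMH filter takes the median of FIR subfilter outputs together with the current sample. A basic unweighted sketch (window size is an assumption; the paper's subfilter weighting is not reproduced):

```python
import numpy as np

def fmh_filter(x, win=5):
    """Basic FIR median hybrid filter, a simpler relative of the
    SWFMH filter: at each sample, take the median of a backward
    average, the current sample, and a forward average. Edge samples
    are passed through unchanged."""
    half = win // 2
    x = np.asarray(x, dtype=float)
    out = x.copy()
    for i in range(half, len(x) - half):
        back = np.mean(x[i - half:i])        # backward FIR subfilter
        fwd = np.mean(x[i + 1:i + 1 + half])  # forward FIR subfilter
        out[i] = np.median([back, x[i], fwd])
    return out
```

The median over the two averages and the sample rejects isolated impulses (useful for sensor fault detection) while passing monotone trends, since on a ramp the current sample lies between the backward and forward averages.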

Proceedings ArticleDOI
04 Aug 2002
TL;DR: Simulation results indicate that some of these fuzzy filters show improvement over the standard median and moving average filters in reducing these three noises.
Abstract: In this paper, four fuzzy filters for filtering images contaminated with random, impulse, and sum of random and impulse noises are introduced. In each of these four fuzzy filters, the output pixel of a filtered image at the center of a moving window area is defined as a normalized sum of weighted input pixels within the window. Simulation results indicate that some of these fuzzy filters show improvement over the standard median and moving average filters in reducing these three types of noise.
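One possible form of such a fuzzy weighting — this membership function is an illustrative assumption, not one of the paper's four filters — weights each pixel by a Gaussian of its distance from the window median and outputs the normalised weighted sum:

```python
import numpy as np

def fuzzy_window_filter(window, spread=20.0):
    """Normalized sum of weighted input pixels over one window.

    Pixels close to the window median get weight near 1; outliers
    (impulses) get weight near 0, so the output is a weighted average
    dominated by the uncorrupted pixels.
    """
    w = np.asarray(window, dtype=float).ravel()
    ref = np.median(w)
    weights = np.exp(-((w - ref) / spread) ** 2)  # fuzzy membership
    return float(np.sum(weights * w) / np.sum(weights))
```

Unlike a plain moving average, an impulse of 255 in a window of values near 100 receives negligible weight and barely shifts the output.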

Proceedings ArticleDOI
07 Nov 2002
TL;DR: The presented results demonstrate the robustness of the method against some common image processing attacks such as compression, scaling, uniform or gaussian noise addition, median filtering, cropping and multiple watermarking.
Abstract: A novel scheme for the watermarking of colour images is presented in this communication. The first objective is to find the most suitable alternative to the RGB colour space, which is highly correlated. Colour spaces with a linear relation to the RGB colour space and with uncorrelated components are found to be most suitable for watermarking applications. The second objective is to make the scheme adaptive to each colour image. This is achieved by keeping the PSNR in a predefined quality range while tuning the watermark strength parameter. Watermark detection is fast and blind, i.e. only the watermark generation and coefficient randomization keys are needed, and not the original image. The presented results demonstrate the robustness of the method against common image processing attacks such as compression, scaling, uniform or Gaussian noise addition, median filtering, cropping and multiple watermarking.
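The PSNR-constrained strength tuning can be illustrated with a simple sketch. Additive spatial embedding and the step schedule below are assumptions for illustration only; the paper embeds in a decorrelated colour space:

```python
import numpy as np

def psnr(original, marked, peak=255.0):
    """Peak signal-to-noise ratio between two images, in dB."""
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def tune_strength(image, watermark, target_psnr=40.0, alpha=10.0, step=0.5):
    """Reduce the watermark strength `alpha` until the marked image
    meets a PSNR floor, keeping the distortion in a quality range."""
    while alpha > step:
        marked = image + alpha * watermark
        if psnr(image, marked) >= target_psnr:
            break
        alpha -= step
    return alpha
```

For a +/-1 watermark pattern, the MSE at strength `alpha` is exactly `alpha**2`, so the loop stops at the largest strength on the step grid that still satisfies the PSNR target.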

Journal ArticleDOI
TL;DR: Simulation results show that the proposed filtering schemes provide improved performance over the standard recursive median filter, succeeding in preserving small details and fine textures.
Abstract: The median filter is a special case of nonlinear filters used for smoothing signals. Since the output of the median filter is always one of the input samples, it is conceivable that certain signals could pass through the median filter unaltered. These signals define the signature of the filter and are referred to as root signals. Median filters are known to possess the convergence property, meaning that repeated median filtering will reach a root signal, starting from any input signal. By associating the nonlinear operation of median filtering with a two-term cost function, an optimization process that minimizes that function is obtained. Cost functions of the same type are associated with different recursive median filtering schemes by replacing the actual values inside the filter's window with the original signal. The convergence behavior of these filters and their smoothness are studied. By changing the positions of the replacements during filtering, a tuning effect on the smoothness is obtained. Simulation results show that the proposed filtering schemes provide improved performance over the standard recursive median filter, succeeding in preserving small details and fine textures.
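The convergence property described above — repeated median filtering reaches a root signal that further filtering leaves unchanged — can be demonstrated with a short sketch. The window size and iteration cap are assumptions for illustration:

```python
import numpy as np

def median_filter_1d(x, win=3):
    """Plain 1-D median filter with edge replication."""
    half = win // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + win]) for i in range(len(x))])

def find_root_signal(x, win=3, max_iter=100):
    """Iterate the (non-recursive) median filter until the signal
    stops changing, i.e. until a root signal is reached."""
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        y = median_filter_1d(x, win)
        if np.array_equal(y, x):
            return x  # x is a root: filtering no longer changes it
        x = y
    return x
```

Starting from a noisy step signal, the iteration removes the impulses and converges to a clean step edge, which is a root of the window-3 median filter.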