Topic

Median filter

About: Median filter is a research topic. Over the lifetime, 12,479 publications have been published within this topic, receiving 178,253 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, a new method for removing impulse noises from images is proposed, which is based on replacing the central pixel value by the generalized mean value of all pixels inside a sliding window.
Abstract: A new method for removing impulse noises from images is proposed. The filtering scheme is based on replacing the central pixel value by the generalized mean value of all pixels inside a sliding window. The concepts of thresholding and complementation, which are shown to improve the performance of the generalized mean filter, are introduced. The threshold is derived using a statistical theory. The actual performance of the proposed filter is compared with that of the commonly used median filter by filtering noise-corrupted real images. The hardware complexities of the two types of filters are also compared, indicating the advantages of the generalized mean filter.

113 citations
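
As a rough illustration of the filtering scheme described above, here is a minimal Python/NumPy sketch of a generalized (power) mean filter. The window size, exponent p, and eps offset are illustrative choices, not the paper's parameters, and the thresholding and complementation steps from the paper are omitted.

```python
import numpy as np

def generalized_mean_filter(image, window=3, p=-1.0, eps=1e-6):
    """Replace each pixel by the generalized (power) mean of its window.

    p = 1 gives the arithmetic mean, p = -1 the harmonic mean; eps keeps
    negative powers finite on zero-valued pixels.  All parameters here are
    illustrative defaults, not the values used in the paper.
    """
    pad = window // 2
    padded = np.pad(image.astype(float), pad, mode="reflect")
    out = np.empty(image.shape, dtype=float)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            patch = padded[i:i + window, j:j + window] + eps
            out[i, j] = np.mean(patch ** p) ** (1.0 / p)
    return out

# Usage sketch: denoised = generalized_mean_filter(noisy_gray_image, window=3, p=-1.0)
```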

Journal ArticleDOI
TL;DR: A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated; simulations indicate that the new filter outperforms the vector median filter (VMF) as well as other techniques currently used to eliminate impulsive noise in color images.
Abstract: In this paper, we address the problem of impulsive noise reduction in multichannel images. A new class of filters for noise attenuation is introduced and its relationship with commonly used filtering techniques is investigated. The computational complexity of the new filter is lower than that of the vector median filter (VMF). Extensive simulation experiments indicate that the new filter outperforms the VMF, as well as other techniques currently used to eliminate impulsive noise in color images.

112 citations
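
For reference, the baseline that this paper compares against, the vector median filter (VMF), can be sketched as below. This is a plain reference implementation of the VMF (output pixel = window vector minimizing the summed distance to all other window vectors), not the lower-complexity filter the paper proposes; the window size is an illustrative choice.

```python
import numpy as np

def vector_median_filter(image, window=3):
    """Vector median filter (VMF) for an H x W x 3 color image: each output
    pixel is the window vector with the smallest summed Euclidean distance
    to all other vectors in the window."""
    pad = window // 2
    padded = np.pad(image.astype(float), ((pad, pad), (pad, pad), (0, 0)),
                    mode="reflect")
    out = np.empty_like(image, dtype=float)
    rows, cols = image.shape[:2]
    for i in range(rows):
        for j in range(cols):
            vecs = padded[i:i + window, j:j + window].reshape(-1, 3)
            # Pairwise Euclidean distances, summed per candidate vector.
            dists = np.linalg.norm(vecs[:, None, :] - vecs[None, :, :], axis=2).sum(axis=1)
            out[i, j] = vecs[np.argmin(dists)]
    return out
```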

Proceedings ArticleDOI
20 Jun 2011
TL;DR: This paper proposes an improved per-pixel confidence measure using a Random Forest regressor trained with real-world data and argues that an improved confidence measure leads to superior reconstructions in subsequent steps of traditional scan processing pipelines.
Abstract: Time-of-Flight cameras provide high-frame-rate depth measurements within a limited range of distances. These readings can be extremely noisy and display unique errors, for instance, where scenes contain depth discontinuities or materials with low infrared reflectivity. Previous works have treated the amplitude of each Time-of-Flight sample as a measure of confidence. In this paper, we demonstrate the shortcomings of this common lone heuristic, and propose an improved per-pixel confidence measure using a Random Forest regressor trained with real-world data. Using an industrial laser scanner for ground truth acquisition, we evaluate our technique on data from two different Time-of-Flight cameras1. We argue that an improved confidence measure leads to superior reconstructions in subsequent steps of traditional scan processing pipelines. At the same time, data with confidence reduces the need for point cloud smoothing and median filtering.

111 citations
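
A hedged sketch of the general training setup described above, assuming scikit-learn's RandomForestRegressor and an illustrative per-pixel feature set (amplitude, depth, local depth variance); the paper's actual features, camera data, and ground-truth protocol are not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestRegressor

def per_pixel_features(depth, amplitude):
    """Illustrative per-pixel features (assumption, not the paper's feature set):
    amplitude, depth, and local depth variance over a 3x3 neighborhood."""
    depth = depth.astype(float)
    local_var = uniform_filter(depth ** 2, size=3) - uniform_filter(depth, size=3) ** 2
    return np.stack([amplitude.ravel(), depth.ravel(), local_var.ravel()], axis=1)

def fit_confidence_model(depth, amplitude, gt_error, n_estimators=100):
    """Fit a Random Forest that predicts per-pixel depth error against ground
    truth; a low predicted error is then read as high confidence."""
    X = per_pixel_features(depth, amplitude)
    return RandomForestRegressor(n_estimators=n_estimators).fit(X, gt_error.ravel())

# Usage sketch:
#   model = fit_confidence_model(depth, amplitude, gt_error)
#   confidence = -model.predict(per_pixel_features(depth, amplitude)).reshape(depth.shape)
```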

Journal ArticleDOI
TL;DR: The findings indicate that the developed modification routines provide a good means of simulating the resolution and noise characteristics of digital radiographic systems for optimization or processing purposes.
Abstract: A new computer simulation approach is presented that is capable of modeling several varieties of digital radiographic systems by their image quality characteristics. In this approach, the resolution and noise characteristics of ideal supersampled input images are modified according to input modulation transfer functions (MTFs) and noise power spectra (NPS). The modification process is separated into two routines: one for modification of the resolution and another for modification of the noise characteristics of the input image. The resolution modification routine blurs the input image by applying a frequency filter described by the input MTF. The resulting blurred image is then reduced to its final size to account for the sampling process of the digital system. The noise modification routine creates colored noise by filtering the frequency components of a white noise spectrum according to the input noise power. This noise is then applied to the image by a moving region of interest to account for variations in noise due to differences in attenuation. In order to evaluate the efficacy of the modification routines, additional routines were developed to assess the resolution and noise of digital images. The MTFs measured from the output images of the resolution modification routine were within 3% of the input MTF. The NPS measured from the output images of the noise modification routine were within 2% of the input NPS. The findings indicate that the developed modification routines provide a good means of simulating the resolution and noise characteristics of digital radiographic systems for optimization or processing purposes.

111 citations
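
A rough NumPy sketch of the two modification routines described above, assuming the MTF and NPS are already available as 2-D arrays sampled on the image's (fftshifted) frequency grid; the supersampling/downsampling step, NPS normalization constants, and the moving-ROI noise scaling are omitted for brevity.

```python
import numpy as np

def apply_mtf(image, mtf_2d):
    """Resolution routine: blur an image by multiplying its spectrum with a
    2-D MTF sampled on the image's fftshifted frequency grid."""
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * np.fft.ifftshift(mtf_2d)))

def colored_noise(nps_2d, rng=None):
    """Noise routine: shape white Gaussian noise so its power spectrum
    approximately follows a 2-D NPS (normalization constants ignored)."""
    rng = np.random.default_rng() if rng is None else rng
    white = rng.standard_normal(nps_2d.shape)
    shaped = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(np.fft.ifftshift(nps_2d)))
    return np.real(shaped)

# Usage sketch: simulated = apply_mtf(ideal_image, mtf_2d) + colored_noise(nps_2d)
```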

Journal ArticleDOI
TL;DR: In this paper, a technique for online nonlinear filtering based on wavelet thresholding is presented: OLMS rectification applies wavelet thresholding to data in a moving window of dyadic length to remove random errors, and combines it with multiscale median filtering to remove gross errors.
Abstract: Data rectification by univariate filtering is popular for processes lacking an accurate model. Linear filters are the most popular for online filtering; however, they are single-scale and thus best suited for rectifying data whose features and noise lie at the same resolution in time and frequency. Consequently, for multiscale data, linear filters are forced to trade off the extent of noise removal against the accuracy of the features retained. In contrast, nonlinear filtering methods, such as FMH and wavelet thresholding, are multiscale, but they cannot be used for online rectification. A technique is presented for online nonlinear filtering based on wavelet thresholding. Online multiscale (OLMS) rectification applies wavelet thresholding to data in a moving window of dyadic length to remove random errors. Gross errors are removed by combining wavelet thresholding with multiscale median filtering. Theoretical analysis shows that OLMS rectification using Haar wavelets subsumes mean filters of dyadic length, while rectification with smoother boundary-corrected wavelets is analogous to adaptive exponential smoothing. If the rectified measurements are not needed online, the quality of rectification can be further improved by averaging the rectified signals in each window, overcoming the boundary effects encountered in translation-invariant (TI) rectification. Synthetic and industrial data show the benefits of the online multiscale and boundary-corrected translation-invariant rectification methods.

111 citations
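
A simplified sketch of the moving-window idea using PyWavelets: a plain median prefilter stands in for the paper's multiscale median filtering of gross errors, ordinary (not boundary-corrected) wavelets are used, and the window length, wavelet, and universal threshold are illustrative assumptions rather than the paper's choices.

```python
import numpy as np
import pywt
from scipy.signal import medfilt

def online_multiscale_rectify(signal, window=64, wavelet="haar", median_k=5):
    """Sketch: for each time step, median-filter the most recent window
    (gross-error suppression), wavelet-threshold it (random-error removal),
    and keep the last reconstructed sample as the online rectified value."""
    raw = np.asarray(signal, dtype=float)
    rectified = np.empty_like(raw)
    for t in range(len(raw)):
        seg = raw[max(0, t + 1 - window): t + 1]
        if len(seg) < 8:                              # warm-up: too few samples to decompose
            rectified[t] = seg[-1]
            continue
        seg = medfilt(seg, kernel_size=median_k)      # simple stand-in for multiscale median filtering
        coeffs = pywt.wavedec(seg, wavelet)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate from finest details
        thr = sigma * np.sqrt(2.0 * np.log(len(seg)))    # universal threshold (illustrative choice)
        denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        rec = pywt.waverec(denoised, wavelet)
        rectified[t] = rec[:len(seg)][-1]
    return rectified
```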


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations, 92% related
Image processing: 229.9K papers, 3.5M citations, 91% related
Convolutional neural network: 74.7K papers, 2M citations, 87% related
Artificial neural network: 207K papers, 4.5M citations, 86% related
Deep learning: 79.8K papers, 2.1M citations, 85% related
Performance
Metrics
No. of papers in the topic in previous years

Year    Papers
2023    72
2022    186
2021    276
2020    387
2019    478
2018    538