scispace - formally typeset
Topic

Noise reduction

About: Noise reduction is a research topic. Over its lifetime, 25,121 publications have been published within this topic, receiving 300,815 citations. The topic is also known as: denoising & noise removal.


Papers
Journal ArticleDOI
TL;DR: Extensive computations are presented that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image F: the largest α for which F∈Bqα(Lq(I)), 1/q = α/2 + 1/2, and the norm |F|Bqα(Lq(I)).
Abstract: This paper examines the relationship between wavelet-based image processing algorithms and variational problems. Algorithms are derived as exact or approximate minimizers of variational problems; in particular, we show that wavelet shrinkage can be considered the exact minimizer of the following problem: given an image F defined on a square I, minimize over all g in the Besov space B₁¹(L₁(I)) the functional |F−g|²L₂(I) + λ|g|B₁¹(L₁(I)). We use the theory of nonlinear wavelet image compression in L₂(I) to derive accurate error bounds for noise removal through wavelet shrinkage applied to images corrupted with i.i.d., mean-zero Gaussian noise. A new signal-to-noise ratio (SNR), which we claim more accurately reflects the visual perception of noise in images, arises in this derivation. We present extensive computations that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image F: the largest α for which F∈Bqα(Lq(I)), 1/q = α/2 + 1/2, and the norm |F|Bqα(Lq(I)). Both theoretical and experimental results indicate that our choice of shrinkage parameters yields uniformly better results than Donoho and Johnstone's VisuShrink procedure; an example suggests, however, that Donoho and Johnstone's (1994, 1995, 1996) SureShrink method, which uses a different shrinkage parameter for each dyadic level, achieves a lower error than our procedure.

810 citations
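In the simplest orthonormal setting, the shrinkage analyzed above amounts to soft-thresholding the wavelet detail coefficients. A minimal one-level Haar sketch in Python (the single decomposition level and the threshold lam are illustrative choices, not the paper's actual setup or shrinkage-parameter rule):

```python
import math

def haar_forward(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_forward."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

def soft_threshold(coeffs, lam):
    """Soft shrinkage: pull every coefficient toward zero by lam."""
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

def shrinkage_denoise(x, lam):
    """Shrink only the detail (high-frequency) band, keep the approximation."""
    approx, detail = haar_forward(x)
    return haar_inverse(approx, soft_threshold(detail, lam))
```

In practice the transform is applied over several dyadic levels, and the choice of lam (per level or global) is exactly what distinguishes VisuShrink, SureShrink, and the paper's Besov-smoothness-based rule.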

Proceedings ArticleDOI
07 May 1996
TL;DR: A new approach is developed which achieves a trade-off between effective noise reduction and low computational load for real-time operation; the subjective and objective results are shown to be much better than those of existing methods.
Abstract: This paper addresses the problem of single-microphone frequency-domain speech enhancement in noisy environments. The main characteristics of available frequency-domain noise reduction algorithms are presented. We have confirmed that a priori SNR estimation leads to the best subjective results. Based on these conclusions, a new approach is developed which achieves a trade-off between effective noise reduction and low computational load for real-time operation. The obtained solution yields subjective and objective results much better than those of existing methods.

794 citations
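The a priori SNR estimation the abstract credits is classically computed with the decision-directed recursion. A toy single-bin sketch (the smoothing factor alpha, the Wiener gain rule, and the spectral values below are illustrative assumptions, not taken from the paper):

```python
def decision_directed_xi(prev_amp_sq, noise_psd, post_snr, alpha=0.98):
    """Decision-directed a priori SNR: blend the previous frame's clean-speech
    power estimate with the current frame's (floored) instantaneous SNR."""
    return alpha * (prev_amp_sq / noise_psd) + (1.0 - alpha) * max(post_snr - 1.0, 0.0)

def wiener_gain(xi):
    """Wiener suppression gain derived from the a priori SNR; always in [0, 1)."""
    return xi / (1.0 + xi)

# One frequency bin across a few frames (toy numbers).
noise_psd = 1.0     # assumed known/estimated noise power in this bin
prev_amp_sq = 0.0   # clean-speech power estimate from the previous frame
for obs_power in [1.2, 4.0, 9.0, 1.1]:
    post_snr = obs_power / noise_psd          # a posteriori SNR
    xi = decision_directed_xi(prev_amp_sq, noise_psd, post_snr)
    gain = wiener_gain(xi)
    prev_amp_sq = (gain ** 2) * obs_power     # feeds the next frame's estimate
```

The recursion is cheap (one multiply-add per bin per frame), which is consistent with the low-computational-load goal stated in the abstract.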

Journal ArticleDOI
TL;DR: A class of fourth-order partial differential equations (PDEs) is proposed to optimize the trade-off between noise removal and edge preservation; speckles are more visible in images processed by the proposed PDEs, because piecewise planar images are less likely to mask speckles.
Abstract: A class of fourth-order partial differential equations (PDEs) is proposed to optimize the trade-off between noise removal and edge preservation. The time evolution of these PDEs seeks to minimize a cost functional which is an increasing function of the absolute value of the Laplacian of the image intensity function. Since the Laplacian of an image at a pixel is zero if the image is planar in its neighborhood, these PDEs attempt to remove noise and preserve edges by approximating an observed image with a piecewise planar image. Piecewise planar images look more natural than the step images which anisotropic diffusion (second-order PDEs) uses to approximate an observed image. The proposed PDEs are therefore able to avoid the blocky effects widely seen in images processed by anisotropic diffusion, while achieving a degree of noise removal and edge preservation comparable to anisotropic diffusion. Although both approaches seem comparable in removing speckles in the observed images, speckles are more visible in images processed by the proposed PDEs, because piecewise planar images are less likely to mask speckles than step images, and anisotropic diffusion tends to generate multiple false edges. Speckles can be easily removed by simple algorithms such as the one presented in this paper.

772 citations
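The time evolution described above can be sketched as an explicit descent step driven by the Laplacian. A 1-D toy version, assuming a diffusivity c(s) = 1/(1 + (s/k)²) and an illustrative step size (the paper's actual 2-D discretization and functional may differ):

```python
def laplacian(u):
    """Discrete 1-D Laplacian with reflecting boundaries."""
    n = len(u)
    return [u[min(i + 1, n - 1)] - 2.0 * u[i] + u[max(i - 1, 0)] for i in range(n)]

def fourth_order_step(u, dt=0.1, k=1.0):
    """One explicit step of u_t = -lap( c(lap u) * lap u ),
    with diffusivity c(s) = 1 / (1 + (s/k)^2)."""
    lu = laplacian(u)
    g = [l / (1.0 + (l / k) ** 2) for l in lu]   # c(|lap u|) * lap u
    return [ui - dt * gi for ui, gi in zip(u, laplacian(g))]
```

Because the flow is driven by the Laplacian rather than the gradient, regions where the image is locally planar (zero Laplacian) are left untouched, which is the mechanism behind the "piecewise planar" steady states discussed in the abstract.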

Journal ArticleDOI
TL;DR: A unified theory of neighborhood filters and reliable criteria to compare them to other filter classes are presented and it will be demonstrated that computing trajectories and restricting the neighborhood to them is harmful for denoising purposes and that space-time NL-means preserves more movie details.
Abstract: Neighborhood filters are nonlocal image and movie filters which reduce the noise by averaging similar pixels. The first object of the paper is to present a unified theory of these filters and reliable criteria to compare them to other filter classes. A CCD noise model will be presented justifying the involvement of neighborhood filters. A classification of neighborhood filters will be proposed, including classical image and movie denoising methods and further discussing a recently introduced neighborhood filter, NL-means. In order to compare denoising methods, three principles will be discussed. The first principle, "method noise", specifies that only noise must be removed from an image. A second principle will be introduced, "noise to noise", according to which a denoising method must transform a white noise into a white noise. Contrary to "method noise", this principle, which characterizes artifact-free methods, eliminates any subjectivity and can be checked by mathematical arguments and Fourier analysis. "Noise to noise" will be proven to rule out most denoising methods, with the exception of neighborhood filters. This is why a third and new comparison principle, "statistical optimality", is needed and will be introduced to compare the performance of all neighborhood filters. The three principles will be applied to compare ten different image and movie denoising methods. It will first be shown that only wavelet thresholding methods and NL-means give an acceptable method noise. Second, that neighborhood filters are the only ones to satisfy the "noise to noise" principle. Third, that among them NL-means is closest to statistical optimality. Particular attention will be paid to the application of the statistical optimality criterion for movie denoising methods. It will be pointed out that current movie denoising methods are motion-compensated neighborhood filters. This amounts to saying that they are neighborhood filters and that the ideal neighborhood of a pixel is its trajectory. Unfortunately, the aperture problem makes it impossible to estimate ground-truth trajectories. It will be demonstrated that computing trajectories and restricting the neighborhood to them is harmful for denoising purposes, and that space-time NL-means preserves more movie details.

763 citations
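The patch-similarity averaging behind NL-means can be sketched in a few lines. A 1-D toy version (patch radius, search radius, and the filtering parameter h are illustrative defaults, not the paper's tuning):

```python
import math

def nl_means_1d(x, patch=1, search=3, h=0.5):
    """NL-means sketch: each output sample is a weighted average of nearby
    samples, weighted by the similarity of the patches around them."""
    n = len(x)
    out = []
    for i in range(n):
        wsum, acc = 0.0, 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            # Squared distance between the patches centered at i and j
            # (indices clamped at the borders).
            d = 0.0
            for t in range(-patch, patch + 1):
                a = x[min(max(i + t, 0), n - 1)]
                b = x[min(max(j + t, 0), n - 1)]
                d += (a - b) ** 2
            w = math.exp(-d / (h * h))   # similar patches get weight near 1
            wsum += w
            acc += w * x[j]
        out.append(acc / wsum)
    return out
```

Restricting j to a search window is the usual practical compromise; the "nonlocal" ideal would let every pixel in the image contribute.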

Journal ArticleDOI
TL;DR: This paper presents the derivation of these narrow correlator spacing improvements, verified by simulated and tested performance.
Abstract: Historically, conventional GPS receivers have used 1.0-chip early-late correlator spacing in the implementation of delay lock loops (DLLs). However, there are distinct advantages to narrowing this spacing, especially in C/A-code tracking applications. These advantages are the reduction of tracking errors in the presence of both noise and multipath. The primary disadvantage is that a wider precorrelation bandwidth is required, coupled with higher sample rates and higher digital signal processing rates. However, with current CMOS technology, this is easily achievable and well worth the price. Noise reduction is achieved with narrower spacing because the noise components of the early and late signals are correlated and tend to cancel, provided that early and late processing are simultaneous (not dithered). Multipath effects are reduced because the DLL discriminator is less distorted by the delayed multipath signal. This paper presents the derivation of these narrow correlator spacing improvements, verified by simulated and tested performance.

749 citations
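The multipath benefit of narrow spacing can be illustrated with the ideal triangular C/A-code autocorrelation: the early-late lock point shifts less when the correlators sit closer together. A toy model in Python (the multipath amplitude of 0.5 and delay of 0.3 chips are assumed values for illustration, not figures from the paper):

```python
def tri(tau):
    """Ideal C/A-code autocorrelation: unit triangle of half-width one chip."""
    return max(0.0, 1.0 - abs(tau))

def composite(tau, mp_amp, mp_delay):
    """Direct path plus one delayed in-phase multipath replica (toy channel)."""
    return tri(tau) + mp_amp * tri(tau - mp_delay)

def lock_point(spacing, mp_amp=0.5, mp_delay=0.3):
    """Code-tracking bias: the offset (in chips) where the coherent
    early-late discriminator E - L crosses zero, found by a grid scan."""
    best_tau, best_err = 0.0, float("inf")
    tau = -0.2
    while tau <= 0.3:
        early = composite(tau - spacing / 2.0, mp_amp, mp_delay)
        late = composite(tau + spacing / 2.0, mp_amp, mp_delay)
        if abs(early - late) < best_err:
            best_tau, best_err = tau, abs(early - late)
        tau += 0.001
    return best_tau

bias_wide = lock_point(1.0)    # conventional 1.0-chip spacing
bias_narrow = lock_point(0.1)  # narrow correlator
```

Under this toy channel the 1.0-chip spacing locks several hundredths of a chip away from the true delay, while the 0.1-chip spacing reduces that bias by roughly a factor of the spacing ratio, which is the qualitative effect the paper derives rigorously.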


Network Information
Related Topics (5)
Image processing
229.9K papers, 3.5M citations
90% related
Feature extraction
111.8K papers, 2.1M citations
89% related
Image segmentation
79.6K papers, 1.8M citations
88% related
Convolutional neural network
74.7K papers, 2M citations
88% related
Support vector machine
73.6K papers, 1.7M citations
88% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    1,511
2022    2,974
2021    1,123
2020    1,488
2019    1,702
2018    1,631