Proceedings ArticleDOI

Performance Evaluation of Iterative Denoising Algorithm Based on Variance Stabilizing Transform and Wavelet Thresholding

TL;DR: An iterative denoising algorithm based on the Variance Stabilizing Transform is combined with the conventional Wavelet Thresholding technique to recover an image from Poisson noisy observations.
Abstract: The restoration of images degraded by noise is one of the most important tasks in image processing. This paper deals with the recovery of an image from Poisson noisy observations. More precisely, we combine an iterative denoising algorithm based on the Variance Stabilizing Transform (VST) with the conventional Wavelet Thresholding technique. At each iteration, a combination of the Poisson observations with the denoised estimate from the previous iteration is treated as scaled Poisson data and filtered through a VST scheme and wavelet thresholding. Experimental results show the effectiveness of the proposed method for denoising images corrupted by Poisson noise. Performance assessment is provided.
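The per-iteration pipeline sketched in the abstract (combine the observations with the previous estimate, stabilize the variance, threshold in the wavelet domain, invert) can be illustrated as follows. This is a minimal sketch, assuming a one-level Haar transform, a simple 1/i blending schedule, and an algebraic inverse VST; it is not the authors' implementation.

```python
import numpy as np

def anscombe(y):
    # Forward Anscombe VST: Poisson noise becomes approximately
    # additive Gaussian with unit variance.
    return 2.0 * np.sqrt(y + 3.0 / 8.0)

def inverse_anscombe(d):
    # Simple algebraic inverse; the paper uses an optimal
    # (exact unbiased) inverse instead [7].
    return (d / 2.0) ** 2 - 3.0 / 8.0

def haar_soft_threshold(x, thr):
    # One-level Haar wavelet soft thresholding on a 1-D signal
    # (even length), as a stand-in for the conventional wavelet
    # thresholding denoiser.
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # shrink details
    out = np.empty_like(x)
    out[0::2] = (a + d) / np.sqrt(2.0)
    out[1::2] = (a - d) / np.sqrt(2.0)
    return out

def iterative_vst_denoise(y, n_iter=5, thr=1.0):
    # Each iteration blends the raw Poisson data with the previous
    # estimate, stabilizes the variance, thresholds in the wavelet
    # domain, and maps the result back to the intensity range.
    x = y.astype(float)
    for i in range(1, n_iter + 1):
        lam = 1.0 / i                  # blending schedule (assumption)
        z = lam * y + (1.0 - lam) * x  # combine data and estimate
        d = haar_soft_threshold(anscombe(z), thr)
        x = np.maximum(inverse_anscombe(d), 0.0)
    return x
```

On smooth signals the scheme reduces the mean squared error well below that of the raw Poisson observations, even with this crude wavelet stage.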
References
Journal ArticleDOI
TL;DR: The authors prove two results about this type of estimator that are unprecedented in several ways: with high probability f̂*_n is at least as smooth as f, in any of a wide variety of smoothness measures.
Abstract: Donoho and Johnstone (1994) proposed a method for reconstructing an unknown function f on [0,1] from noisy data d_i = f(t_i) + σ z_i, i = 0, ..., n-1, t_i = i/n, where the z_i are independent and identically distributed standard Gaussian random variables. The reconstruction f̂*_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d toward 0 by an amount σ·√(2 log(n)/n). The authors prove two results about this type of estimator. [Smooth]: with high probability f̂*_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: the estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. The present proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
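The shrinkage rule described above is soft thresholding with the universal amount σ·√(2 log(n)/n). As a standalone rule, independent of any particular wavelet transform, it can be sketched as:

```python
import numpy as np

def soft_threshold_universal(coeffs, sigma, n):
    # Translate every empirical wavelet coefficient toward zero by
    # sigma * sqrt(2 * log(n) / n); coefficients whose magnitude is
    # below that amount are set exactly to zero.
    t = sigma * np.sqrt(2.0 * np.log(n) / n)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```

The 1/n factor reflects the normalized grid of n samples used in this formulation; in the common discrete formulation the same threshold appears as σ√(2 log n) applied to unnormalized coefficients.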

9,359 citations


"Performance Evaluation of Iterative..." refers methods in this paper

  • ...Second, the noise is removed using a conventional wavelet thresholding [5-6]....


  • ...The novelty of the proposed algorithm lies in its combination of a variance stabilizing transform (VST) [4] with wavelet thresholding based on the works of Donoho and Johnstone [5-6]....


Journal ArticleDOI
TL;DR: In this article, the authors developed a spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients, and achieved a performance within a factor log² n of the ideal performance of piecewise polynomial and variable-knot spline methods.
Abstract: With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory which we call the oracle inequality shows that attained performance differs from ideal performance by at most a factor of approximately 2 log n, where n is the sample size. Moreover no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor log² n of the performance of piecewise polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
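A small numerical illustration of the oracle comparison made above: an oracle that keeps a noisy coefficient only where it knows the true coefficient exceeds the noise level, versus a data-only soft threshold at σ√(2 log n), in the spirit of (but not identical to) RiskShrink's minimax threshold. The sparse test vector and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 1024, 1.0
theta = np.zeros(n)
theta[:32] = 5.0                           # a sparse vector of true means
y = theta + sigma * rng.standard_normal(n)

# Ideal diagonal projection: the oracle keeps y_i only where it knows
# |theta_i| exceeds the noise level.
oracle = np.where(np.abs(theta) > sigma, y, 0.0)
ideal_risk = np.mean((oracle - theta) ** 2)

# Data-only soft thresholding at sigma * sqrt(2 log n): no oracle,
# just the observed coefficients.
t = sigma * np.sqrt(2.0 * np.log(n))
est = np.sign(y) * np.maximum(np.abs(y) - t, 0.0)
risk = np.mean((est - theta) ** 2)
# risk exceeds ideal_risk, but only by a logarithmic factor,
# mirroring the oracle inequality.
```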

8,153 citations


"Performance Evaluation of Iterative..." refers background or methods in this paper

  • ...Step 2: Apply the WThD denoiser [5] to the transformed noisy data....


  • ...Second, the noise is removed using a conventional wavelet thresholding [5-6]....


  • ...where WThD denotes the wavelet thresholding process proposed in [5]....


  • ...The novelty of the proposed algorithm lies in its combination of a variance stabilizing transform (VST) [4] with wavelet thresholding based on the works of Donoho and Johnstone [5-6]....


  • ...After applying AT, we apply a wavelet thresholding denoiser proposed by Donoho [5] to enhance the observed images in terms of visual quality and PSNR....


Journal ArticleDOI
TL;DR: This work introduces optimal inverses for the Anscombe transformation, in particular the exact unbiased inverse, a maximum likelihood (ML) inverse, and a more sophisticated minimum mean square error (MMSE) inverse.
Abstract: The removal of Poisson noise is often performed through the following three-step procedure. First, the noise variance is stabilized by applying the Anscombe root transformation to the data, producing a signal in which the noise can be treated as additive Gaussian with unitary variance. Second, the noise is removed using a conventional denoising algorithm for additive white Gaussian noise. Third, an inverse transformation is applied to the denoised signal, obtaining the estimate of the signal of interest. The choice of the proper inverse transformation is crucial in order to minimize the bias error which arises when the nonlinear forward transformation is applied. We introduce optimal inverses for the Anscombe transformation, in particular the exact unbiased inverse, a maximum likelihood (ML) inverse, and a more sophisticated minimum mean square error (MMSE) inverse. We then present an experimental analysis using a few state-of-the-art denoising algorithms and show that the estimation can be consistently improved by applying the exact unbiased inverse, particularly at the low-count regime. This results in a very efficient filtering solution that is competitive with some of the best existing methods for Poisson image denoising.
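The three-step procedure and the role of the inverse can be sketched as follows. The two closed-form inverses below are the naive algebraic one and the standard asymptotically unbiased one; the exact unbiased, ML, and MMSE inverses introduced in the paper have no elementary closed form and are evaluated numerically.

```python
import numpy as np

def anscombe_forward(y):
    # Step 1: stabilize the Poisson variance; the noise in the output
    # is approximately additive Gaussian with unit variance.
    return 2.0 * np.sqrt(y + 3.0 / 8.0)

def inverse_algebraic(d):
    # Naive algebraic inverse of the forward transformation; biased,
    # especially in the low-count regime.
    return (d / 2.0) ** 2 - 3.0 / 8.0

def inverse_asymptotic(d):
    # Asymptotically unbiased closed-form inverse; a better choice
    # than the algebraic inverse, though still an approximation to
    # the exact unbiased inverse of the paper.
    return (d / 2.0) ** 2 - 1.0 / 8.0
```

Step 2 (denoising the stabilized signal with any Gaussian denoiser) sits between the forward transform and the chosen inverse.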

341 citations


"Performance Evaluation of Iterative..." refers background in this paper

  • ...Step 3: Apply the optimal inverse AT to return the denoised image to the original range of y [7]....


Journal ArticleDOI
TL;DR: With a computational cost at worst twice that of the noniterative scheme, the proposed algorithm provides significantly better quality, particularly at low signal-to-noise ratio, outperforming much costlier state-of-the-art alternatives.
Abstract: We denoise Poisson images with an iterative algorithm that progressively improves the effectiveness of variance-stabilizing transformations (VST) for Gaussian denoising filters. At each iteration, a combination of the Poisson observations with the denoised estimate from the previous iteration is treated as scaled Poisson data and filtered through a VST scheme. Due to the slight mismatch between a true scaled Poisson distribution and this combination, a special exact unbiased inverse is designed. We present an implementation of this approach based on the BM3D Gaussian denoising filter. With a computational cost at worst twice that of the noniterative scheme, the proposed algorithm provides significantly better quality, particularly at low signal-to-noise ratio, outperforming much costlier state-of-the-art alternatives.
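The combination step at the heart of this iteration can be sketched as below. The blending weight lam and the simplified scaled-Poisson treatment are illustrative assumptions; the paper itself designs a special exact unbiased inverse precisely because the combination only approximately follows a scaled Poisson distribution.

```python
import numpy as np

def combine_and_stabilize(y, x_prev, lam):
    # Convex combination of the raw Poisson observations y with the
    # previous denoised estimate x_prev.  Dividing by lam rescales the
    # combination so it can be treated as (approximately) Poisson data,
    # to which the Anscombe VST is then applied.
    z = lam * y + (1.0 - lam) * x_prev
    return 2.0 * np.sqrt(z / lam + 3.0 / 8.0)
```

The stabilized output is what gets passed to the Gaussian denoising filter (BM3D in the paper) before the inverse is applied.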

126 citations


"Performance Evaluation of Iterative..." refers background or methods in this paper

  • ...The so-called exact unbiased inverse of a [4]:...


  • ...This transformation step normalizes the image noise [4] and yields an image a(y):...


  • ...The Anscombe transform (AT) converts a Poisson noise to Gaussian noise with variance 1 [4] so, from a mathematical viewpoint, our model after applying VST, is...


  • ...In order to enhance the performance of our proposed denoiser, we follow the same steps as in the paper of Lucio Azzari and Alessandro Foi [4]....


  • ...The novelty of the proposed algorithm lies in its combination of a variance stabilizing transform (VST) [4] with wavelet thresholding based on the works of Donoho and Johnstone [5-6]....


Journal ArticleDOI
TL;DR: The new family of Bessel K forms (BKF) densities are shown to fit very well to the observed histograms and demonstrate a high degree of match between observed and estimated prior densities using the BKF model.
Abstract: A novel Bayesian nonparametric estimator in the wavelet domain is presented. In this approach, a prior model is imposed on the wavelet coefficients designed to capture the sparseness of the wavelet expansion. Seeking probability models for the marginal densities of the wavelet coefficients, the new family of Bessel K forms (BKF) densities are shown to fit very well to the observed histograms. Exploiting this prior, we designed a Bayesian nonlinear denoiser and we derived a closed form for its expression. We then compared it to other priors that have been introduced in the literature, such as the generalized Gaussian density (GGD) or the α-stable models, where no analytical form is available for the corresponding Bayesian denoisers. Specifically, the BKF model turns out to be a good compromise between these two extreme cases (hyperbolic tails for the α-stable and exponential tails for the GGD). Moreover, we demonstrate a high degree of match between observed and estimated prior densities using the BKF model. Finally, a comparative study is carried out to show the effectiveness of our denoiser which clearly outperforms the classical shrinkage or thresholding wavelet-based techniques.

123 citations


"Performance Evaluation of Iterative..." refers background in this paper

  • ...Introduction: Image denoising in transform domains, such as the wavelet transform domain, is a vivid research problem in image processing because of its fundamental role in many applications [1-3]....
