Adaptive wavelet thresholding for image denoising and compression
Summary
II. WAVELET THRESHOLDING AND THRESHOLD SELECTION
- Let the signal be f = {f_ij}, an N × N image.
- It has been corrupted by additive noise, and one observes g_ij = f_ij + ε_ij (1), where the ε_ij are independent and identically distributed (iid) zero-mean normal, N(0, σ²), and independent of f_ij.
- Note that since the transform is orthogonal, the noise coefficients in the wavelet domain (writing Y_ij = X_ij + V_ij for the transformed observation) are also iid N(0, σ²).
- The wavelet-thresholding denoising method filters each coefficient Y_ij from the detail subbands with a threshold function (to be explained shortly) to obtain the estimate X̂_ij.
- The soft-thresholding rule is chosen over hard-thresholding for several reasons: it yields smoother reconstructions (hard-thresholding tends to produce abrupt artifacts), and it is more tractable mathematically.
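The two rules can be sketched as follows (a minimal NumPy illustration, not the paper's code; the threshold value 1.0 is an arbitrary choice for demonstration):

```python
import numpy as np

def soft_threshold(y, t):
    """Soft-thresholding: zero coefficients with |y| <= t and
    shrink the survivors toward zero by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def hard_threshold(y, t):
    """Hard-thresholding: zero coefficients with |y| <= t and
    keep the survivors unchanged."""
    return np.where(np.abs(y) > t, y, 0.0)

coeffs = np.array([-3.0, -0.4, 0.2, 1.5, 4.0])
print(soft_threshold(coeffs, 1.0))  # survivors shrunk by 1.0
print(hard_threshold(coeffs, 1.0))  # survivors kept as-is
```

Note that the soft rule is continuous in the input coefficient, which is one reason its estimates look smoother than those of the discontinuous hard rule.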
A. Adaptive Threshold for BayesShrink
- In the generalized Gaussian distribution (GGD) model for the detail coefficients, the parameter σ_X is the standard deviation and β is the shape parameter.
- This suggests that the threshold T_B(σ_X) = σ²/σ_X also works well in the Laplacian case.
- Fig. 5(a) compares the optimal hard- and soft-thresholds with the approximation T_B(σ_X) = σ²/σ_X.
- Fig. 5(b) plots the corresponding optimal risks against the risk of the approximation. Since T_B(σ_X) depends only on the standard deviation σ_X and not on the shape parameter β, it may not yield a good approximation for values of β outside the range tested here, and the threshold may then need to be modified to incorporate β.
B. Parameter Estimation for Data-Driven Adaptive Threshold
- This section focuses on the estimation of the GGD parameters, σ_X and β, which in turn yields a data-driven estimate of T_B(σ_X) that is adaptive to different subband characteristics.
- The noise variance σ² needs to be estimated first; it is obtained from the finest-scale diagonal (HH) subband via the robust median estimator σ̂ = Median(|Y_ij|)/0.6745.
- In some situations, it may be possible to measure σ² based on information other than the corrupted image.
- Thus σ̂_X = sqrt(max(σ̂_Y² − σ̂², 0)) (19), where σ̂_Y² = (1/n²) Σ_ij Y_ij² (20) is the sample variance of the subband coefficients. When σ̂² ≥ σ̂_Y², σ̂_X is taken to be zero and all coefficients in the subband are set to zero; this happens at times when σ is large (for example, for a grayscale image).
- To summarize, the authors refer to their method as BayesShrink, which performs soft-thresholding with the data-driven, subband-dependent threshold T̂_B(σ̂_X) = σ̂²/σ̂_X.
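Putting the steps together, here is a minimal sketch of the BayesShrink recipe on a single synthetic subband (NumPy only; the wavelet transform itself is omitted, and a noise-dominated HH1 subband is simulated directly for the noise estimate — both are simplifications, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0  # true noise std (unknown to the estimator)

# Synthetic detail subband: Laplacian-like signal coefficients plus noise.
x = rng.laplace(scale=0.5, size=(64, 64))
y = x + rng.normal(scale=sigma, size=x.shape)

# Step 1: robust noise estimate from the (simulated) finest diagonal subband.
hh1 = rng.normal(scale=sigma, size=(64, 64))
sigma_hat = np.median(np.abs(hh1)) / 0.6745

# Step 2: signal standard deviation, as in (19)-(20).
sigma_y2 = np.mean(y ** 2)
sigma_x_hat = np.sqrt(max(sigma_y2 - sigma_hat ** 2, 0.0))

# Step 3: subband-dependent threshold; a degenerate estimate zeroes the subband.
t_b = sigma_hat ** 2 / sigma_x_hat if sigma_x_hat > 0 else np.inf

# Step 4: soft-threshold the subband coefficients.
if np.isfinite(t_b):
    x_hat = np.sign(y) * np.maximum(np.abs(y) - t_b, 0.0)
else:
    x_hat = np.zeros_like(y)

print("noisy MSE:   ", np.mean((y - x) ** 2))
print("denoised MSE:", np.mean((x_hat - x) ** 2))
```

With these settings the noise dominates the weak signal, so the estimated threshold is large and the thresholded subband has a visibly lower MSE than the noisy one.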
III. MDL PRINCIPLE FOR COMPRESSION-BASED DENOISING: THE MDLQ CRITERION
- Recall their hypothesis is that compression achieves denoising because the zero-zone in the quantization step (typical in compression methods) corresponds to thresholding in denoising.
- The choice of these parameters is discussed next.
- This description can be accomplished by a two-part code: one part to describe the model and the other the description of the data using the model.
- The MDL principle chooses the model θ̂ which minimizes the two-part code-length L(Y, θ) = L(Y|θ) + L(θ) (21), where L(Y|θ) is the code-length for the data Y based on the model θ, and L(θ) is the code-length for θ itself.
- The approach of [28] does not address the quantization step necessary in a practical compression setting.
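A toy illustration of the two-part principle (my own construction, not the paper's MDLQ criterion): choose the number of uniform quantizer bins q for a coefficient vector by trading the model cost (first-order entropy of the bin indices) against the data cost (Gaussian negative log-likelihood of the residual, in bits):

```python
import numpy as np

def two_part_length(y, q, sigma):
    """Toy two-part code length in bits: L(model) + L(data | model)."""
    edges = np.linspace(y.min(), y.max(), q + 1)
    idx = np.clip(np.digitize(y, edges) - 1, 0, q - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    x_hat = centers[idx]                        # quantized "model" of y
    # model cost: empirical entropy of the bin indices
    p = np.bincount(idx, minlength=q) / y.size
    p = p[p > 0]
    l_model = y.size * -np.sum(p * np.log2(p))
    # data cost: Gaussian negative log-likelihood of the residual, in bits
    l_data = np.sum((y - x_hat) ** 2) / (2.0 * sigma ** 2 * np.log(2.0))
    return l_model + l_data

rng = np.random.default_rng(1)
y = rng.laplace(scale=3.0, size=4096) + rng.normal(scale=1.0, size=4096)
best_q = min(range(2, 65), key=lambda q: two_part_length(y, q, 1.0))
print("MDL-selected number of bins:", best_q)
```

Too few bins leave a large residual (high data cost); too many are expensive to describe (high model cost); the minimum sits in between, which is the essence of the two-part trade-off.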
A. Derivation of the MDLQ Criterion
- Since the noisy wavelet transform coefficients are Y_ij = X_ij + V_ij, where the V_ij are iid N(0, σ²), the conditional distribution of Y_ij given X_ij is N(X_ij, σ²).
- For noiseless observations, σ_X is estimated from the sample standard deviation (26), and β is solved from (27) by matching the kurtosis of the GGD, κ(β) = Γ(1/β)Γ(5/β)/Γ(3/β)², to the sample kurtosis. The parameter values listed in Fig. 3 are estimated this way.
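The kurtosis match can be inverted numerically since κ(β) is monotonically decreasing in β. A standard-library sketch (the bracket [0.3, 5] and the tolerance are arbitrary choices, not values from the paper):

```python
import math

def ggd_kurtosis(beta):
    """Kurtosis of a generalized Gaussian with shape parameter beta:
    Gamma(1/b)*Gamma(5/b)/Gamma(3/b)^2 (3 for Gaussian, 6 for Laplacian)."""
    g = math.gamma
    return g(1.0 / beta) * g(5.0 / beta) / g(3.0 / beta) ** 2

def solve_beta(sample_kurtosis, lo=0.3, hi=5.0, tol=1e-9):
    """Bisection on the monotone map beta -> kurtosis."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if ggd_kurtosis(mid) > sample_kurtosis:
            lo = mid  # kurtosis too high -> need a larger beta
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(solve_beta(6.0))  # ~1.0, the Laplacian case
print(solve_beta(3.0))  # ~2.0, the Gaussian case
```

The two sanity checks recover the known special cases: a Laplacian (kurtosis 6) is a GGD with β = 1, and a Gaussian (kurtosis 3) is a GGD with β = 2.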
- The authors then state the model selection criterion, MDLQ, which selects the threshold and the quantizer by minimizing the two-part code-length (21).
- The coarsest subband is quantized differently: it is not thresholded, and its quantization in (32) assumes a uniform distribution.
- Natarajan [25] and Liu and Moulin [21] both proposed to use compression for denoising.
IV. EXPERIMENTAL RESULTS AND DISCUSSION
- The grayscale images "goldhill," "lena," "barbara," and "baboon" are used as test images with different noise levels σ.
- To further justify the choice of soft-thresholding over hard-thresholding, another benchmark, OracleThresh, is also computed.
- It is not surprising that the SURE threshold and the BayesShrink threshold yield similar performances.
- With more sophisticated coding methods (e.g., predictive coding, pixel classification), the same bitrate could yield a higher number of quantization levels, thus resulting in a lower MSE and enhancing the performance of the MDLQ-based compression-denoising.
- Interested readers can obtain a better view of the images at the website, http://www-wavelet.eecs.berkeley.edu/~grchang/compressDenoise/.
V. CONCLUSION
- Two main issues regarding image denoising were addressed in this paper.
- Thus it would be interesting to jointly select these two values and analyze their dependencies on each other.
- Furthermore, a more sophisticated coder is likely to produce better compressed images than the current scheme, which uses the first-order entropy to code the bin indices.
- Lastly, the model family could be expanded.
- In their other work [7], it was demonstrated that spatially adaptive thresholds greatly improve denoising performance over uniform thresholds.
Frequently Asked Questions (2)
Q2. What are the future works in "Adaptive wavelet thresholding for image denoising and compression" ?
The threshold selection uses the context-modeling idea prevalent in coding methods; thus it would be interesting to extend this spatially adaptive threshold to the compression framework without incurring too much overhead.