Author

S.G. Chang

Other affiliations: Hewlett-Packard
Bio: S.G. Chang is an academic researcher from the University of California, Berkeley. The author has contributed to research on wavelet transforms and thresholding, has an h-index of 10, and has co-authored 10 publications receiving 4,664 citations. Previous affiliations of S.G. Chang include Hewlett-Packard.

Papers
Journal ArticleDOI
TL;DR: Proposes an adaptive, data-driven threshold for image denoising via wavelet soft-thresholding, derived in a Bayesian framework with a generalized Gaussian prior on the wavelet coefficients, a prior widely used in image processing applications.
Abstract: The first part of this paper proposes an adaptive, data-driven threshold for image denoising via wavelet soft-thresholding. The threshold is derived in a Bayesian framework, and the prior used on the wavelet coefficients is the generalized Gaussian distribution (GGD) widely used in image processing applications. The proposed threshold is simple and closed-form, and it is adaptive to each subband because it depends on data-driven estimates of the parameters. Experimental results show that the proposed method, called BayesShrink, is typically within 5% of the MSE of the best soft-thresholding benchmark with the image assumed known. It also outperforms SureShrink (Donoho and Johnstone 1994, 1995; Donoho 1995) most of the time. The second part of the paper further validates the claim that lossy compression can be used for denoising. The BayesShrink threshold can aid in the parameter selection of a coder designed with denoising in mind, thus achieving simultaneous denoising and compression. Specifically, the zero-zone in the quantization step of compression is analogous to the threshold value in the thresholding function. The remaining coder design parameters are chosen based on a criterion derived from Rissanen's minimum description length (MDL) principle. Experiments show that this compression method does indeed remove noise significantly, especially for large noise power. However, it introduces quantization noise and should be used only when bitrate is a concern in addition to denoising.
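As a rough illustration of the rule described above, here is a minimal sketch of BayesShrink in Python, assuming the PyWavelets package and additive white Gaussian noise; the function name and parameter choices are ours, not the authors'. The noise level is estimated from the finest diagonal subband by the standard median-absolute-deviation rule, and each detail subband is soft-thresholded with T = sigma^2 / sigma_x.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    def bayes_shrink(noisy, wavelet="db4", level=3):
        # Multilevel 2-D DWT: [cA_n, (cH_n, cV_n, cD_n), ..., (cH_1, cV_1, cD_1)]
        coeffs = pywt.wavedec2(noisy, wavelet, level=level)
        # Robust noise estimate from the finest diagonal subband (MAD rule).
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        out = [coeffs[0]]  # leave the approximation subband untouched
        for detail in coeffs[1:]:
            shrunk = []
            for band in detail:
                # sigma_x^2 = max(var(Y) - sigma^2, 0); threshold T = sigma^2 / sigma_x
                sigma_x = np.sqrt(max(band.var() - sigma**2, 0.0))
                T = sigma**2 / sigma_x if sigma_x > 0 else np.abs(band).max()
                shrunk.append(pywt.threshold(band, T, mode="soft"))
            out.append(tuple(shrunk))
        return pywt.waverec2(out, wavelet)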

2,917 citations

Journal ArticleDOI
TL;DR: A spatially adaptive wavelet thresholding method based on context modeling, a technique commonly used in image compression to adapt the coder to changing image characteristics; the method yields significantly superior image quality and lower MSE than the best uniform threshold chosen with the original image assumed known.
Abstract: The method of wavelet thresholding for removing noise, or denoising, has been researched extensively due to its effectiveness and simplicity. Much of the literature has focused on developing the best uniform threshold or best basis selection. However, not much has been done to make the threshold values adaptive to the spatially changing statistics of images. Such adaptivity can improve the wavelet thresholding performance because it allows additional local information of the image (such as the identification of smooth or edge regions) to be incorporated into the algorithm. This work proposes a spatially adaptive wavelet thresholding method based on context modeling, a common technique used in image compression to adapt the coder to changing image characteristics. Each wavelet coefficient is modeled as a random variable of a generalized Gaussian distribution with an unknown parameter. Context modeling is used to estimate the parameter for each coefficient, which is then used to adapt the thresholding strategy. This spatially adaptive thresholding is extended to the overcomplete wavelet expansion, which yields better results than the orthogonal transform. Experimental results show that spatially adaptive wavelet thresholding yields significantly superior image quality and lower MSE than the best uniform thresholding with the original image assumed known.
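The per-coefficient adaptation can be sketched as follows, assuming NumPy and SciPy; the moving-window variance below is a simple stand-in for the paper's context model, which estimates each coefficient's parameter from a weighted neighborhood of coefficients.

    import numpy as np
    from scipy.ndimage import uniform_filter  # SciPy, assumed available

    def adaptive_soft_threshold(band, sigma, win=5):
        # Local second moment of the noisy coefficients over a win x win window
        # (a simple surrogate for the paper's weighted-context estimate).
        local_power = uniform_filter(band**2, size=win)
        # Per-coefficient signal std: sigma_x(i)^2 = max(E[Y^2](i) - sigma^2, 0).
        sigma_x = np.sqrt(np.maximum(local_power - sigma**2, 0.0))
        # Per-coefficient threshold T(i) = sigma^2 / sigma_x(i); coefficients
        # judged to be pure noise (sigma_x = 0) get an infinite threshold.
        with np.errstate(divide="ignore"):
            T = np.where(sigma_x > 0, sigma**2 / sigma_x, np.inf)
        # Elementwise soft-thresholding.
        return np.sign(band) * np.maximum(np.abs(band) - T, 0.0)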

875 citations

Proceedings ArticleDOI
04 Oct 1998
TL;DR: Experimental results show that spatially adaptive wavelet thresholding yields significantly superior image quality and lower MSE than optimal uniform thresholding.
Abstract: The method of wavelet thresholding for removing noise, or denoising, has been researched extensively due to its effectiveness and simplicity. Much of the work has concentrated on finding the best uniform threshold or best basis. However, not much has been done to make this method adaptive to spatially changing statistics, which are typical of a large class of images. This work proposes a spatially adaptive wavelet thresholding method based on context modeling, a common technique used in image compression to adapt the coder to the non-stationarity of images. We model each coefficient as a random variable with a generalized Gaussian prior with unknown parameters. Context modeling is used to estimate the parameters for each coefficient, which are then used to adapt the thresholding strategy. Experimental results show that spatially adaptive wavelet thresholding yields significantly superior image quality and lower MSE than optimal uniform thresholding.

635 citations

Proceedings ArticleDOI
09 May 1995
TL;DR: A wavelet-based method that estimates the higher-resolution information needed to sharpen the image and enhances the reconstructed image through alternating projections onto the sets defined by three constraints.

Abstract: Image interpolation concerns magnifying a small image without loss of clarity. We propose a wavelet-based method that estimates the higher-resolution information needed to sharpen the image. The method extrapolates the wavelet transform of the higher resolution based on the evolution of the wavelet transform extrema across scales. By identifying three constraints that the higher-resolution information must obey, we enhance the reconstructed image through alternating projections onto the sets defined by these constraints.
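The reconstruction step can be summarized as a plain projection-onto-convex-sets (POCS) iteration. The sketch below is generic Python: the three projections are left as placeholders, since the paper's constraint sets on the extrapolated fine-scale coefficients are specific to its extrema model.

    def pocs(x0, projections, iters=20):
        # Alternating projections: cycle through the constraint sets so the
        # iterate is driven toward their intersection.
        x = x0
        for _ in range(iters):
            for project in projections:
                x = project(x)
        return x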

121 citations

Journal ArticleDOI
TL;DR: This paper combines the two operations of averaging and thresholding and finds that the optimal ordering depends on the number of available copies and on the signal-to-noise ratio.
Abstract: This correspondence addresses the recovery of an image from its multiple noisy copies. The standard method is to compute the weighted average of these copies. Since the wavelet thresholding technique has been shown to effectively denoise a single noisy copy, we consider in this paper combining the two operations of averaging and thresholding. Because thresholding is a nonlinear technique, averaging then thresholding or thresholding then averaging produce different estimators. By modeling the signal wavelet coefficients as Laplacian distributed and the noise as Gaussian, our investigation finds the optimal ordering to depend on the number of available copies and on the signal-to-noise ratio. We then propose thresholds that are nearly optimal under the assumed model for each ordering. With the optimal and near-optimal thresholds, the two methods yield similar performance, and both show considerable improvement over merely averaging.
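The two orderings are easy to state in code. A sketch, assuming NumPy and any single-copy wavelet denoiser (for instance the bayes_shrink sketch earlier; the paper's own near-optimal thresholds under the Laplacian model differ):

    import numpy as np

    def average_then_threshold(copies, denoise):
        # Average the noisy copies first (uniform weights), denoise once.
        return denoise(np.mean(copies, axis=0))

    def threshold_then_average(copies, denoise):
        # Denoise each copy separately, then average the estimates.
        return np.mean([denoise(c) for c in copies], axis=0)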

87 citations


Cited by

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field and explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient to copyright protection. It provides a solution to illegal copying of digital material and has many other useful applications such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.

2,849 citations

Journal ArticleDOI
TL;DR: The performance of this method for removing noise from digital images substantially surpasses that of previously published methods, both visually and in terms of mean squared error.
Abstract: We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each coefficient reduces to a weighted average of the local linear estimates over all possible values of the hidden multiplier variable. We demonstrate through simulations with images contaminated by additive white Gaussian noise that the performance of this method substantially surpasses that of previously published methods, both visually and in terms of mean squared error.
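In symbols (our transcription of the standard Gaussian scale mixture result, with notation assumed): modeling a neighborhood of coefficients as $y = \sqrt{z}\,u + w$, the Bayes least squares estimate of the reference coefficient $x_c$ is $\hat{x}_c = E[x_c \mid y] = \int_0^{\infty} p(z \mid y)\, E[x_c \mid y, z]\, dz$, where $E[x \mid y, z] = z\,C_u (z\,C_u + C_w)^{-1} y$ is the local Wiener (linear) estimate for a fixed multiplier $z$, $C_u$ is the covariance of the Gaussian vector, and $C_w$ that of the noise.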

2,439 citations

Journal ArticleDOI
TL;DR: The aim of this paper is to introduce a few key notions and applications connected to sparsity, targeting newcomers interested in either the mathematical aspects of this area or its applications.
Abstract: A full-rank matrix ${\bf A}\in \mathbb{R}^{n\times m}$ with $n<m$ generates an underdetermined system of linear equations ${\bf A}{\bf x}={\bf b}$ having infinitely many solutions.
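For concreteness, the sparsest-solution problem this survey studies is commonly attacked through its $\ell_1$ relaxation (basis pursuit), which is a linear program. A minimal sketch assuming SciPy; the function name is ours:

    import numpy as np
    from scipy.optimize import linprog  # SciPy, assumed available

    def basis_pursuit(A, b):
        # min ||x||_1 s.t. Ax = b, posed as an LP via x = u - v with u, v >= 0.
        n, m = A.shape
        c = np.ones(2 * m)                # objective: sum(u) + sum(v)
        A_eq = np.hstack([A, -A])         # A(u - v) = b
        res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * m))
        uv = res.x
        return uv[:m] - uv[m:]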

2,372 citations

Journal ArticleDOI
TL;DR: This paper proposes a design framework based on the mapping approach that allows for a fast implementation via a lifting or ladder structure and, in some cases, uses only one-dimensional filtering.

Abstract: In this paper, we develop the nonsubsampled contourlet transform (NSCT) and study its applications. The construction proposed in this paper is based on a nonsubsampled pyramid structure and nonsubsampled directional filter banks. The result is a flexible multiscale, multidirection, and shift-invariant image decomposition that can be efficiently implemented via the à trous algorithm. At the core of the proposed scheme is the nonseparable two-channel nonsubsampled filter bank (NSFB). We exploit the less stringent design conditions of the NSFB to design filters that lead to an NSCT with better frequency selectivity and regularity than the contourlet transform. We propose a design framework based on the mapping approach that allows for a fast implementation via a lifting or ladder structure and, in some cases, uses only one-dimensional filtering. In addition, our design ensures that the corresponding frame elements are regular and symmetric, and that the frame is close to a tight one. We assess the performance of the NSCT in image denoising and enhancement applications; in both, the NSCT compares favorably to other existing methods in the literature.
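The à trous ("with holes") mechanism the abstract relies on is easy to illustrate for a plain undecimated wavelet decomposition. The sketch below, assuming SciPy, is a simpler stand-in for the NSCT (it omits the nonsubsampled directional filter banks) but shows how upsampling the kernel with zeros replaces subsampling the signal, which is what makes the decomposition shift-invariant.

    import numpy as np
    from scipy.ndimage import convolve1d  # SciPy, assumed available

    def a_trous(img, levels=3):
        # Undecimated multiscale decomposition: at each level the smoothing
        # kernel is upsampled by inserting zeros ("holes") instead of the
        # signal being subsampled.
        h = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0  # B3-spline kernel
        planes, smooth = [], img.astype(float)
        for j in range(levels):
            k = np.zeros(len(h) + (len(h) - 1) * (2**j - 1))
            k[:: 2**j] = h  # 2^j - 1 zeros between taps
            nxt = convolve1d(convolve1d(smooth, k, axis=0), k, axis=1)
            planes.append(smooth - nxt)  # detail plane at scale j
            smooth = nxt
        planes.append(smooth)  # coarse residual; the planes sum back to img
        return planes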

1,900 citations