
Bilateral Filter Approach and Fast Discrete Curvelet Transform for Poisson Noise Removal from Images

TL;DR: Two methods of removing Poisson noise from images are analysed, one using a bilateral filter and the other the Fast Discrete Curvelet Transform (FDCT); the results show that FDCT is more effective at preserving image features, while the bilateral filter is much faster and simpler to implement.
Abstract: We analyse two methods of removing Poisson noise from images: one using a bilateral filter and one using the Fast Discrete Curvelet Transform (FDCT). The variance stabilizing transform (VST) is the key step in the noise removal, as it maps the Poisson-distributed data into the Gaussian domain, which makes the denoising relatively simple. Once a Gaussian distribution is obtained, the bilateral filter (BF) can be used to remove the noise. Alternatively, the FDCT can be used in place of the bilateral filter, since it is capable of sparsely representing the intrinsic features of an image. We implement both methods separately, compare them, and present simulations to assess their effectiveness in Poisson noise removal. The results show that FDCT is more effective at preserving image features, while the bilateral filter is much faster and simpler to implement.
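A minimal sketch of the VST-plus-bilateral-filter pipeline described above, using NumPy and OpenCV. This is not the authors' implementation; the filter parameters and the simple algebraic inverse VST are illustrative assumptions.

```python
import numpy as np
import cv2  # OpenCV is assumed to be available; it is not part of the paper's code


def anscombe(x):
    """Anscombe VST: maps Poisson counts to approximately unit-variance Gaussian data."""
    return 2.0 * np.sqrt(x + 3.0 / 8.0)


def inverse_anscombe(y):
    """Simple algebraic inverse; the exact unbiased inverse differs slightly."""
    return (y / 2.0) ** 2 - 3.0 / 8.0


def denoise_poisson_bf(counts, d=5, sigma_color=2.0, sigma_space=2.0):
    """VST -> bilateral filter -> inverse VST. Parameter values are illustrative."""
    stabilized = anscombe(counts.astype(np.float32))
    # After the VST the noise standard deviation is roughly 1, so a sigma_color
    # of a few units in the stabilized domain is a reasonable starting point.
    smoothed = cv2.bilateralFilter(stabilized, d, sigma_color, sigma_space)
    return inverse_anscombe(smoothed)


# Usage (hypothetical): given a clean intensity image `clean` (photon counts),
# noisy = np.random.poisson(clean).astype(np.float32)
# estimate = denoise_poisson_bf(noisy)
```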


Citations
Journal Article
TL;DR: Two techniques are proposed for effectively removing Poisson noise from medical images: one combining the Multi-Scale Variance Stabilizing Transform (MS-VST) and the Fast Discrete Curvelet Transform (FDCT) with thresholding, the other combining MS-VST and FDCT with null-hypothesis testing.
Abstract: Medical images have always been an important factor in the diagnosis of disease, and Poisson noise in these images has always been a problem for image clarity. We propose two techniques that combine the Multi-Scale Variance Stabilizing Transform (MS-VST) and the Fast Discrete Curvelet Transform (FDCT), one with thresholding and the other with null-hypothesis testing, for effectively removing Poisson noise from medical images. The effectiveness of these techniques is analyzed using the Peak Signal-to-Noise Ratio and the Universal Image Quality Index.
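For context, a minimal sketch of the PSNR metric mentioned above (the Universal Image Quality Index of Wang and Bovik additionally compares local means, variances, and covariances and is not shown here):

```python
import numpy as np


def psnr(reference, estimate, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB: 10 * log10(peak^2 / MSE)."""
    diff = reference.astype(np.float64) - estimate.astype(np.float64)
    mse = np.mean(diff ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```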

3 citations


Cites methods from "Bilateral Filter Approach and Fast ..."

  • ...FDCT [1]-[2] is a second-generation curvelet transform which is a multiresolution method....


References
Journal ArticleDOI
TL;DR: This paper describes two digital implementations of a new mathematical transform, namely, the second generation curvelet transform in two and three dimensions, based on unequally spaced fast Fourier transforms, while the second is based on the wrapping of specially selected Fourier samples.
Abstract: This paper describes two digital implementations of a new mathematical transform, namely, the second generation curvelet transform in two and three dimensions. The first digital transformation is based on unequally spaced fast Fourier transforms, while the second is based on the wrapping of specially selected Fourier samples. The two implementations essentially differ by the choice of spatial grid used to translate curvelets at each scale and angle. Both digital transformations return a table of digital curvelet coefficients indexed by a scale parameter, an orientation parameter, and a spatial location parameter. And both implementations are fast in the sense that they run in O(n^2 log n) flops for n by n Cartesian arrays; in addition, they are also invertible, with rapid inversion algorithms of about the same complexity. Our digital transformations improve upon earlier implementations—based upon the first generation of curvelets—in the sense that they are conceptually simpler, faster, and far less redundant. The software CurveLab, which implements both transforms presented in this paper, is available at http://www.curvelet.org.
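CurveLab itself provides MATLAB and C++ implementations; purely as an illustrative sketch, assuming hypothetical Python bindings `fdct_wrapping` / `ifdct_wrapping` (these names are assumptions, not CurveLab's actual API) that return the nested coefficient table indexed by scale, angle, and location described above, coefficient-domain hard thresholding could look like this:

```python
import numpy as np
# Hypothetical bindings to a wrapping-based FDCT; the module and function names
# below are assumptions for illustration, not CurveLab's real interface.
from fdct import fdct_wrapping, ifdct_wrapping


def curvelet_hard_threshold(img, thresh):
    """Hard-threshold curvelet coefficients: small coefficients are set to zero."""
    coeffs = fdct_wrapping(img)            # nested list: coeffs[scale][angle] -> 2-D array
    for scale in range(1, len(coeffs)):    # leave the coarsest scale untouched
        for angle in range(len(coeffs[scale])):
            c = coeffs[scale][angle]
            coeffs[scale][angle] = c * (np.abs(c) > thresh)
    return ifdct_wrapping(coeffs)
```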

2,603 citations


"Bilateral Filter Approach and Fast ..." refers methods in this paper

  • ...Fast discrete curvelet transforms [5] are used for optimal sparse representation of objects with discontinuities along C² edges....


Journal ArticleDOI

1,449 citations


"Bilateral Filter Approach and Fast ..." refers background in this paper

  • ...Anscombe transform is an example of variance stabilizing transform (VST) which transforms the Poisson domain into the Gaussian domain [2]....

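For the Anscombe transform mentioned in the excerpt above, the standard closed form is (a well-known result, stated here independently of [2]):

```latex
\mathcal{A}(x) = 2\sqrt{x + \tfrac{3}{8}}, \qquad
X \sim \operatorname{Poisson}(\lambda) \;\Longrightarrow\;
\mathcal{A}(X) \;\approx\; \mathcal{N}\!\left(2\sqrt{\lambda + \tfrac{3}{8}},\; 1\right)
\quad \text{for moderately large } \lambda .
```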

Journal ArticleDOI
TL;DR: A variance stabilizing transform (VST) is applied to a filtered discrete Poisson process, yielding a near-Gaussian process with asymptotically constant variance and leading to multiscale VSTs (MS-VSTs) and nonlinear decomposition schemes.
Abstract: In order to denoise Poisson count data, we introduce a variance stabilizing transform (VST) applied on a filtered discrete Poisson process, yielding a near Gaussian process with asymptotic constant variance. This new transform, which can be deemed as an extension of the Anscombe transform to filtered data, is simple, fast, and efficient in (very) low-count situations. We combine this VST with the filter banks of wavelets, ridgelets and curvelets, leading to multiscale VSTs (MS-VSTs) and nonlinear decomposition schemes. By doing so, the noise-contaminated coefficients of these MS-VST-modified transforms are asymptotically normally distributed with known variances. A classical hypothesis-testing framework is adopted to detect the significant coefficients, and a sparsity-driven iterative scheme reconstructs properly the final estimate. A range of examples show the power of this MS-VST approach for recovering important structures of various morphologies in (very) low-count images. These results also demonstrate that the MS-VST approach is competitive relative to many existing denoising methods.
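As background for why a square-root VST stabilizes the variance of lowpass-filtered Poisson data, here is a first-order (delta-method) sketch under the assumption of locally constant intensity; the exact constants and asymptotic statements used in [3] should be taken from that paper rather than from this sketch.

```latex
% Filtered Poisson data: Y = \sum_i h[i]\, X_i with X_i \sim \operatorname{Poisson}(\lambda)
% (locally constant intensity), and filter moments \tau_k = \sum_i h[i]^k, \tau_1 > 0.
\mathbb{E}[Y] = \lambda\,\tau_1, \qquad \operatorname{Var}(Y) = \lambda\,\tau_2 .
% First-order expansion of a square-root VST T(Y) = b\sqrt{Y + c}:
\operatorname{Var}\!\bigl(b\sqrt{Y + c}\bigr)
  \;\approx\; \frac{b^{2}\,\operatorname{Var}(Y)}{4\,(\mathbb{E}[Y] + c)}
  \;\xrightarrow[\lambda \to \infty]{}\; \frac{b^{2}\,\tau_2}{4\,\tau_1},
% so b = 2\sqrt{\tau_1/\tau_2} gives unit asymptotic variance; for the identity filter
% (\tau_1 = \tau_2 = 1) this reduces to the Anscombe-type choice b = 2.
```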

380 citations


"Bilateral Filter Approach and Fast ..." refers methods in this paper

  • ...To overcome this setback, we rely on Multi-scale VSTs (MS-VST) [3]....


  • ...The VST of the image after passing through lowpass filter is given by [3]...


Journal ArticleDOI
TL;DR: This paper proposes a switching bilateral filter with a texture and noise detector for universal noise removal that achieves a high peak signal-to-noise ratio and good image quality by efficiently removing both types of mixed noise, salt-and-pepper with uniform noise and salt-and-pepper with Gaussian noise.
Abstract: In this paper, we propose a switching bilateral filter (SBF) with a texture and noise detector for universal noise removal. Operation was carried out in two stages: detection followed by filtering. For detection, we propose the sorted quadrant median vector (SQMV) scheme, which includes important features such as edge or texture information. This information is utilized to allocate a reference median from SQMV, which is in turn compared with the current pixel to classify it as impulse noise, Gaussian noise, or noise-free. The SBF removes both Gaussian and impulse noise without adding another weighting function. The range filter inside the bilateral filter switches between the Gaussian and impulse modes depending upon the noise classification result. Simulation results show that our noise detector has a high noise detection rate as well as a high classification rate for salt-and-pepper, uniform impulse noise and mixed impulse noise. Unlike most other impulse noise filters, the proposed SBF achieves high peak signal-to-noise ratio and great image quality by efficiently removing both types of mixed noise, salt-and-pepper with uniform noise and salt-and-pepper with Gaussian noise. In addition, the computational complexity of SBF is significantly less than that of other mixed noise filters.
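As a rough illustration of the detect-then-switch idea only, the toy sketch below replaces the SQMV detector and the switching range kernel of the SBF with a simple local-median impulse test; it is not the method of this paper, and it assumes a single-channel image with values in roughly [0, 255].

```python
import numpy as np
import cv2
from scipy.ndimage import median_filter


def toy_switching_denoise(img, impulse_thresh=40.0):
    """Toy detect-then-switch denoiser (NOT the SQMV-based SBF of the paper).

    Pixels deviating strongly from their 3x3 median are treated as impulse
    noise and replaced by that median; all other pixels are smoothed with a
    plain bilateral filter (the 'Gaussian mode')."""
    img = img.astype(np.float32)
    med = median_filter(img, size=3)                    # local 3x3 median
    impulse_mask = np.abs(img - med) > impulse_thresh   # crude impulse detector
    smooth = cv2.bilateralFilter(img, 5, 25.0, 5.0)     # Gaussian-noise branch
    return np.where(impulse_mask, med, smooth)
```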

141 citations


"Bilateral Filter Approach and Fast ..." refers background in this paper

  • ...The switching bilateral filter [4] can be used for removal of universal noise like impulse noise, salt-and-pepper noise and Gaussian noise, but is unable to deal with Poisson noise....


01 Nov 2007
TL;DR: Several multiscale approaches to photon-limited image reconstruction are reviewed, including wavelets combined with variance stabilizing transforms, corrected Haar wavelet transforms, multiplicative multiscale innovations, platelets, and the à trous wavelet transform.
Abstract: Many astronomical studies rely upon the accurate reconstruction of spatially distributed phenomena from photon-limited data. These measurements are inherently "noisy" due to low photon counts. In addition, the behavior of the underlying photon intensity functions can be very rich and complex, and consequently difficult to model a priori. Nonparametric multiscale reconstruction methods overcome these challenges and facilitate characterization of fundamental performance limits. In this paper, we review several multiscale approaches to photon-limited image reconstruction, including wavelets combined with variance stabilizing transforms, corrected Haar wavelet transforms, multiplicative multiscale innovations, platelets, and the à trous wavelet transform. We discuss the performance of these methods in simulation studies, and detail statistical analyses of their performances.

1. Photon-limited astronomy. Many imaging modalities involve the detection of light or higher-energy photons, and often the random nature of photon emission and detection is the dominant source of noise in imaging systems. Such cases are referred to as photon-limited imaging applications, since the relatively small number of detected photons is the factor limiting the signal-to-noise ratio. In many cases, the intensity underlying the photon-limited observations may be distorted by the point spread function or other physical properties of the imaging system. Using these inherently noisy and distorted observations to perform quantitative inference on the underlying astrophysical phenomenon is a challenging problem affecting many researchers in the statistical and astronomical communities. The data collected by these imaging systems are usually assumed to obey a spatial Poisson distribution involving a two-dimensional intensity image that describes the probability of photon emissions at different locations in space. The mean and variance of a Poisson process are equal to the intensity. The intensity/mean is the "signal" of interest, and the variability of the data about the mean can be interpreted as "noise". Thus, as the intensity varies spatially as a function of astrophysical structure and function, so does the signal-to-noise ratio. In this sense it could be said that the noise in photon-limited imaging is signal-dependent. Many investigators have considered the use of wavelet representations for image denoising, deblurring, and other forms of image reconstruction because of the theoretical near-optimality and practical efficacy of wavelets in a variety of image processing contexts; for examples, see (Mallat 1998; Starck et al. 1998; Aldroubi and Unser 1996). The procedure for classic wavelet denoising via hard thresholding is the following: compute the wavelet transform of the noisy image, set wavelet coefficients with magnitude less than some threshold to zero, and compute the inverse wavelet transform. Wavelet denoising via soft thresholding is very similar, except each coefficient is either set to zero or shrunk depending upon its magnitude. The basic idea behind these methods is that wavelet bases form a very parsimonious representation of many images of interest, so that the bulk of the image's energy is concentrated in just a few wavelet coefficients, which as a result have very high magnitudes. Most noise, however, does not share this property, and has its energy distributed relatively evenly among all the wavelet coefficients. Thus, by …
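The hard-thresholding recipe described above (transform, zero out small coefficients, invert) can be sketched with PyWavelets; the wavelet choice, decomposition level, and threshold value below are illustrative assumptions.

```python
import pywt


def wavelet_hard_threshold(img, wavelet="db2", level=3, thresh=0.1):
    """Classic wavelet denoising by hard thresholding:
    forward transform -> zero small detail coefficients -> inverse transform."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    new_coeffs = [coeffs[0]]  # keep the coarse approximation untouched
    for detail in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(d, thresh, mode="hard")
                                for d in detail))
    return pywt.waverec2(new_coeffs, wavelet)
```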

40 citations


"Bilateral Filter Approach and Fast ..." refers background in this paper

  • ...Usually, images are affected by Poisson noise when photonic detection is used for image acquisition [1]....
