Journal ArticleDOI

Adaptive Thresholding Method for Speckle Reduction of Echocardiographic Images

04 Jul 2019-Iete Journal of Research (Taylor & Francis)-pp 1-9
TL;DR: Speckle noise is an inherent property of echocardiographic images, and one of the most powerful and efficient tools for despeckling them is the wavelet transform.
Abstract: Speckle noise is a granular disturbance, which is an inherent property in echocardiographic images. One of the most powerful and efficient tools for despeckling of images is wavelet transform. A li...
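The paper's own adaptive threshold is not reproduced here, but the general pipeline such methods build on (log-transform the image so multiplicative speckle becomes approximately additive, shrink the wavelet detail coefficients, invert) can be sketched as follows. This is a minimal illustration assuming NumPy and PyWavelets; the wavelet name, decomposition level, and universal-threshold rule are generic choices, not the authors' method.

# Generic wavelet soft-thresholding sketch for speckle reduction.
# NOTE: illustrative only -- not the adaptive thresholding method of the paper.
import numpy as np
import pywt

def despeckle_wavelet(img, wavelet="db4", level=3):
    """Reduce multiplicative speckle by shrinking wavelet detail coefficients."""
    log_img = np.log1p(img.astype(np.float64))           # speckle -> ~additive noise
    coeffs = pywt.wavedec2(log_img, wavelet, level=level)
    # Robust noise estimate from the finest diagonal subband (MAD rule).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(log_img.size))    # universal threshold
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    denoised_log = pywt.waverec2(shrunk, wavelet)
    return np.expm1(denoised_log)[: img.shape[0], : img.shape[1]]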
Citations
Journal ArticleDOI
TL;DR: The proposed optimisation-based algorithm, which uses wavelet decomposition to segment echocardiography images, is observed to perform better and achieves 94% accuracy compared with ground-truth labels.
Abstract: We propose an optimisation-based algorithm using wavelet decomposition for segmentation of echocardiography images. The objective of the proposed method is to find an optimal threshold value for segmentation. This threshold value is used for separating the left ventricle by segmenting the wavelet-decomposed coefficients. The proposed method evaluates the optimal threshold using a nonlinear, derivative-free optimisation algorithm whose objective function is the contrast property of the grey-level co-occurrence matrix (GLCM). The contrast is highest at the correct position of the boundary between image regions, so the optimisation task is formulated as a maximisation problem. The proposed method is an iterative process for finding the optimum threshold, and segmentation is carried out using this optimised threshold value. Further, the exact left-ventricle contour is extracted using morphological operations and a user-provided location input. The proposed method is observed to perform better and achieves 94% accuracy compared with ground-truth labels.
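A hedged sketch of the idea in this abstract is given below: search for the threshold that maximizes the GLCM contrast of the resulting binary segmentation, using a derivative-free scalar optimiser. The objective, bounds, and optimiser choice are illustrative assumptions (SciPy's bounded Brent search and scikit-image's GLCM routines), not the authors' exact formulation, and the morphological contour-extraction step is omitted.

# Illustrative threshold search maximizing GLCM contrast of the segmentation.
import numpy as np
from scipy.optimize import minimize_scalar
from skimage.feature import graycomatrix, graycoprops

def glcm_contrast_of_mask(image, thr):
    """Contrast of the GLCM computed on the thresholded (binary) image."""
    mask = (image >= thr).astype(np.uint8)               # 0/1 segmentation
    glcm = graycomatrix(mask, distances=[1], angles=[0],
                        levels=2, symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

def optimal_threshold(image):
    """Derivative-free search for the contrast-maximizing threshold."""
    lo, hi = float(image.min()), float(image.max())
    # Minimize negative contrast, i.e. maximize contrast, over [lo, hi].
    res = minimize_scalar(lambda t: -glcm_contrast_of_mask(image, t),
                          bounds=(lo, hi), method="bounded")
    return res.x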

2 citations


Cites methods from "Adaptive Thresholding Method for Sp..."

  • ...Echo images are first denoised using a method proposed by Kulkarni and Madathil [21]....


References
Journal ArticleDOI
TL;DR: The authors prove two results about this type of estimator that are unprecedented in several ways: with high probability $\hat{f}^*_n$ is at least as smooth as $f$, in any of a wide variety of smoothness measures.
Abstract: Donoho and Johnstone (1994) proposed a method for reconstructing an unknown function $f$ on $[0,1]$ from noisy data $d_i = f(t_i) + \sigma z_i$, $i = 0, \ldots, n-1$, $t_i = i/n$, where the $z_i$ are independent and identically distributed standard Gaussian random variables. The reconstruction $\hat{f}^*_n$ is defined in the wavelet domain by translating all the empirical wavelet coefficients of $d$ toward 0 by an amount $\sigma\sqrt{2\log(n)/n}$. The authors prove two results about this type of estimator. [Smooth]: with high probability $\hat{f}^*_n$ is at least as smooth as $f$, in any of a wide variety of smoothness measures. [Adapt]: the estimator comes nearly as close in mean square to $f$ as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. The present proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
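The shrinkage rule described here is easy to state in code. The 1-D sketch below shrinks every empirical wavelet coefficient toward 0 by a universal threshold; the wavelet choice and the MAD noise estimate are assumptions added for a self-contained example. Under the common convention that the discrete wavelet transform is orthonormal and the per-sample noise standard deviation is $\sigma$, the per-coefficient threshold corresponding to the abstract's $\sigma\sqrt{2\log(n)/n}$ is $\sigma\sqrt{2\log n}$, which is what the sketch uses.

# VisuShrink-style universal soft thresholding, 1-D illustration.
import numpy as np
import pywt

def visushrink_1d(noisy, wavelet="sym8", sigma=None):
    coeffs = pywt.wavedec(noisy, wavelet)
    if sigma is None:
        # Robust noise estimate from the finest-scale detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(noisy)))       # universal threshold
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)[: len(noisy)]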

9,359 citations

Proceedings ArticleDOI
04 Jan 1998
TL;DR: In contrast with filters that operate on the three bands of a color image separately, a bilateral filter can enforce the perceptual metric underlying the CIE-Lab color space, and smooth colors and preserve edges in a way that is tuned to human perception.
Abstract: Bilateral filtering smooths images while preserving edges, by means of a nonlinear combination of nearby image values. The method is noniterative, local, and simple. It combines gray levels or colors based on both their geometric closeness and their photometric similarity, and prefers near values to distant values in both domain and range. In contrast with filters that operate on the three bands of a color image separately, a bilateral filter can enforce the perceptual metric underlying the CIE-Lab color space, and smooth colors and preserve edges in a way that is tuned to human perception. Also, in contrast with standard filtering, bilateral filtering produces no phantom colors along edges in color images, and reduces phantom colors where they appear in the original image.
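The bilateral filter's core rule can be shown in a few lines: each output pixel is a weighted average of its neighbors, with weights that decay with both spatial distance (domain) and intensity difference (range). The brute-force grayscale sketch below is for illustration only; the radius and the two sigmas are arbitrary example values, and practical libraries (e.g. OpenCV's cv2.bilateralFilter) provide optimized implementations.

# Brute-force grayscale bilateral filter (domain x range Gaussian weights).
import numpy as np

def bilateral_filter(img, radius=3, sigma_space=2.0, sigma_range=0.1):
    img = img.astype(np.float64)
    pad = np.pad(img, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_space**2))   # domain weights
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_range**2))  # range weights
            w = spatial * rng
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out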

8,738 citations

Journal ArticleDOI
TL;DR: In this article, the authors developed a spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients and achieves performance within a factor $\log^2 n$ of the ideal performance of piecewise-polynomial and variable-knot spline methods.
Abstract: With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable-knot spline, or variable-bandwidth kernel, to the unknown function. Estimation with the aid of an oracle offers dramatic advantages over traditional linear estimation by nonadaptive kernels; however, it is a priori unclear whether such performance can be obtained by a procedure relying on the data alone. We describe a new principle for spatially adaptive estimation: selective wavelet reconstruction. We show that variable-knot spline fits and piecewise-polynomial fits, when equipped with an oracle to select the knots, are not dramatically more powerful than selective wavelet reconstruction with an oracle. We develop a practical spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients. RiskShrink mimics the performance of an oracle for selective wavelet reconstruction as well as it is possible to do so. A new inequality in multivariate normal decision theory, which we call the oracle inequality, shows that attained performance differs from ideal performance by at most a factor of approximately $2\log n$, where $n$ is the sample size. Moreover, no estimator can give a better guarantee than this. Within the class of spatially adaptive procedures, RiskShrink is essentially optimal. Relying only on the data, it comes within a factor of $\log^2 n$ of the performance of piecewise-polynomial and variable-knot spline methods equipped with an oracle. In contrast, it is unknown how or if piecewise-polynomial methods could be made to function this well when denied access to an oracle and forced to rely on data alone.
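A small numerical illustration of the oracle comparison described above, under stated assumptions: a sparse mean vector, unit-variance Gaussian noise, and hard thresholding at $\sqrt{2\log n}$. It only reports the squared-error loss of one random draw next to the ideal keep-or-kill risk and the $2\log n + 1$ factor from the bound; it is not a reproduction of the paper's experiments.

# Toy comparison: ideal (oracle) risk vs. hard thresholding at sqrt(2*log n).
import numpy as np

rng = np.random.default_rng(0)
n = 4096
mu = np.zeros(n)
mu[:32] = 8.0                                   # a few large coefficients
y = mu + rng.standard_normal(n)                 # noisy observations, sigma = 1

ideal_risk = np.minimum(mu**2, 1.0).sum()       # oracle keep-or-kill risk
thr = np.sqrt(2.0 * np.log(n))
est = np.where(np.abs(y) > thr, y, 0.0)         # hard thresholding
loss = np.sum((est - mu) ** 2)                  # squared-error loss of this draw

print(f"ideal risk = {ideal_risk:.1f}, thresholding loss = {loss:.1f}, "
      f"bound factor 2*log(n)+1 = {2 * np.log(n) + 1:.1f}")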

8,153 citations

Journal ArticleDOI
TL;DR: In this article, the authors proposed a smoothness-adaptive thresholding procedure, SureShrink, which assigns a threshold to each resolution level by minimizing the Stein unbiased estimate of risk (SURE) and is near-minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet.
Abstract: We attempt to recover a function of unknown smoothness from noisy sampled data. We introduce a procedure, SureShrink, that suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: A threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein unbiased estimate of risk (Sure) for threshold estimates. The computational effort of the overall procedure is order N · log(N) as a function of the sample size N. SureShrink is smoothness adaptive: If the unknown function contains jumps, then the reconstruction (essentially) does also; if the unknown function has a smooth piece, then the reconstruction is (essentially) as smooth as the mother wavelet will allow. The procedure is in a sense optimally smoothness adaptive: It is near minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet. We know from a previous paper by the authors that traditional smoot...
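The level-by-level threshold choice can be illustrated with the closed form of Stein's unbiased risk estimate for soft thresholding. The sketch below picks, for one vector of unit-variance empirical wavelet coefficients, the threshold minimizing SURE over the observed coefficient magnitudes; the hybrid rule that SureShrink applies at very sparse levels is deliberately omitted, so this is a partial sketch rather than the full procedure.

# SURE-minimizing soft threshold for one level of unit-variance coefficients.
import numpy as np

def sure_threshold(coeffs):
    """Return the soft-threshold value minimizing SURE for unit-variance coeffs."""
    a = np.sort(np.abs(np.asarray(coeffs, dtype=float)))
    n = a.size
    cum = np.cumsum(a**2)
    ks = np.arange(n)
    # For candidate t = a[k]:
    #   SURE(t) = n - 2*#{|x_i| <= t} + sum_i min(|x_i|, t)^2
    #           = n - 2*(k+1) + cum[k] + (n - k - 1) * a[k]**2
    sure = n - 2.0 * (ks + 1) + cum + (n - ks - 1) * a**2
    return a[np.argmin(sure)]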

4,699 citations

Journal ArticleDOI
TL;DR: The authors define a general mathematical and experimental methodology for comparing and classifying classical image denoising algorithms, and propose a nonlocal means (NL-means) algorithm addressing the preservation of structure in a digital image.
Abstract: The search for efficient image denoising methods is still a valid challenge at the crossing of functional analysis and statistics. In spite of the sophistication of the recently proposed methods, m...
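The NL-means rule referenced here replaces each pixel by a weighted average of all pixels whose surrounding patches look similar, so repeated structure is preserved rather than blurred. The brute-force grayscale sketch below (every pixel compared against every other) is illustrative only; the patch radius and filtering parameter h are example values, and practical implementations restrict the search window and vectorize.

# Brute-force nonlocal means (NL-means) for a small grayscale image.
import numpy as np

def nl_means(img, patch_radius=1, h=0.1):
    img = img.astype(np.float64)
    p = patch_radius
    pad = np.pad(img, p, mode="reflect")
    H, W = img.shape
    # Flatten every (2p+1)x(2p+1) patch into a feature vector.
    patches = np.array([
        pad[i:i + 2 * p + 1, j:j + 2 * p + 1].ravel()
        for i in range(H) for j in range(W)
    ])
    flat = img.ravel()
    out = np.empty(H * W)
    for k in range(H * W):
        d2 = np.mean((patches - patches[k]) ** 2, axis=1)   # patch distances
        w = np.exp(-d2 / (h * h))                            # similarity weights
        out[k] = np.sum(w * flat) / np.sum(w)
    return out.reshape(H, W)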

4,153 citations