
Showing papers on "Thresholding published in 2011"


Journal ArticleDOI
TL;DR: In this paper, a thresholding procedure that is adaptive to the variability of individual entries is proposed, which achieves the optimal rate of convergence over a large class of sparse covariance matrices under the spectral norm.
Abstract: In this article we consider estimation of sparse covariance matrices and propose a thresholding procedure that is adaptive to the variability of individual entries. The estimators are fully data-driven and demonstrate excellent performance both theoretically and numerically. It is shown that the estimators adaptively achieve the optimal rate of convergence over a large class of sparse covariance matrices under the spectral norm. In contrast, the commonly used universal thresholding estimators are shown to be suboptimal over the same parameter spaces. Support recovery is discussed as well. The adaptive thresholding estimators are easy to implement. The numerical performance of the estimators is studied using both simulated and real data. Simulation results demonstrate that the adaptive thresholding estimators uniformly outperform the universal thresholding estimators. The method is also illustrated in an analysis on a dataset from a small round blue-cell tumor microarray experiment. A supplement to this ar...

583 citations
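
To make the idea concrete, here is a minimal numpy sketch of entrywise adaptive thresholding: each covariance entry is compared against its own threshold, scaled by an estimate of that entry's variability. The soft-thresholding rule and the constant `delta` are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np

def adaptive_threshold_cov(X, delta=2.0):
    """Entrywise adaptive thresholding of the sample covariance matrix.

    Sketch only: each off-diagonal entry is soft-thresholded at a level
    proportional to an estimate of its own variability (theta_ij), in the
    spirit of the adaptive procedure described above.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                                  # sample covariance
    # theta_ij estimates Var((X_i - mu_i)(X_j - mu_j)), entry by entry
    theta = np.einsum('ki,kj->ij', Xc**2, Xc**2) / n - S**2
    lam = delta * np.sqrt(np.maximum(theta, 0) * np.log(p) / n)
    out = np.sign(S) * np.maximum(np.abs(S) - lam, 0)  # soft threshold
    np.fill_diagonal(out, np.diag(S))                  # keep the diagonal
    return out
```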


Journal ArticleDOI
Simon Foucart
TL;DR: A new iterative algorithm to find sparse solutions of underdetermined linear systems is introduced and it is shown that, under a certain condition on the restricted isometry constant of the matrix of the linear system, the Hard Thresholding Pursuit algorithm indeed finds all $s$-sparse solutions.
Abstract: We introduce a new iterative algorithm to find sparse solutions of underdetermined linear systems. The algorithm, a simple combination of the Iterative Hard Thresholding algorithm and the Compressive Sampling Matching Pursuit algorithm, is called Hard Thresholding Pursuit. We study its general convergence and notice in particular that only a finite number of iterations are required. We then show that, under a certain condition on the restricted isometry constant of the matrix of the linear system, the Hard Thresholding Pursuit algorithm indeed finds all $s$-sparse solutions. This condition, which reads $\delta_{3 s} < 1/\sqrt{3}$, is heuristically better than the sufficient conditions currently available for other compressive sensing algorithms. It applies to fast versions of the algorithm, too, including the Iterative Hard Thresholding algorithm. Stability with respect to sparsity defect and robustness with respect to measurement error are also guaranteed under the condition $\delta_{3 s} < 1/\sqrt{3}$. We conclude with some numerical experiments to demonstrate the good empirical performance and the low complexity of the Hard Thresholding Pursuit algorithm.

513 citations
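
The algorithm itself is compact enough to state in a few lines. Below is a numpy sketch under common simplifications (no step size, a fixed iteration cap, support stabilization as the stopping rule):

```python
import numpy as np

def hard_thresholding_pursuit(A, y, s, n_iter=50):
    """Hard Thresholding Pursuit sketch: alternate a gradient step with
    hard thresholding to pick a support, then least-squares on that support.
    """
    m, N = A.shape
    x = np.zeros(N)
    support = np.array([], dtype=int)
    for _ in range(n_iter):
        g = x + A.T @ (y - A @ x)                 # gradient step
        new_support = np.argsort(np.abs(g))[-s:]  # keep s largest entries
        if set(new_support) == set(support):      # support stabilized
            break
        support = new_support
        x = np.zeros(N)
        # least-squares fit restricted to the chosen support
        x[support], *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    return x
```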


Posted Content
TL;DR: In this article, the Principal Orthogonal complEment Thresholding (POET) method was introduced to explore an approximate factor structure with sparsity; the POET estimator includes the sample covariance matrix, the factor-based covariance matrix, the thresholding estimator and the adaptive thresholding estimator as special cases.
Abstract: This paper deals with the estimation of a high-dimensional covariance with a conditional sparsity structure and fast-diverging eigenvalues. By assuming sparse error covariance matrix in an approximate factor model, we allow for the presence of some cross-sectional correlation even after taking out common but unobservable factors. We introduce the Principal Orthogonal complEment Thresholding (POET) method to explore such an approximate factor structure with sparsity. The POET estimator includes the sample covariance matrix, the factor-based covariance matrix (Fan, Fan, and Lv, 2008), the thresholding estimator (Bickel and Levina, 2008) and the adaptive thresholding estimator (Cai and Liu, 2011) as specific examples. We provide mathematical insights when the factor analysis is approximately the same as the principal component analysis for high-dimensional data. The rates of convergence of the sparse residual covariance matrix and the conditional sparse covariance matrix are studied under various norms. It is shown that the impact of estimating the unknown factors vanishes as the dimensionality increases. The uniform rates of convergence for the unobserved factors and their factor loadings are derived. The asymptotic results are also verified by extensive simulation studies. Finally, a real data application on portfolio allocation is presented.

465 citations
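
A stripped-down sketch of the POET-style construction, assuming the number of factors K and the threshold tau are given (the paper derives data-driven choices): keep the top-K principal components as the factor part, then threshold the principal orthogonal complement.

```python
import numpy as np

def poet_cov(X, K, tau=0.1):
    """POET-style covariance estimator sketch.

    Factor part = top-K principal components of the sample covariance;
    residual part = soft-thresholded principal orthogonal complement.
    K and tau are user-chosen here, purely for illustration.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n
    vals, vecs = np.linalg.eigh(S)                    # ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]            # descending order
    low_rank = (vecs[:, :K] * vals[:K]) @ vecs[:, :K].T
    R = S - low_rank                                  # orthogonal complement
    R_thr = np.sign(R) * np.maximum(np.abs(R) - tau, 0)
    np.fill_diagonal(R_thr, np.diag(R))               # keep residual diagonal
    return low_rank + R_thr
```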


Posted Content
TL;DR: In this paper, the authors consider the case of 1-bit CS measurements and provide a lower bound on the best achievable reconstruction error, and show that the same class of matrices that provide almost optimal noiseless performance also enable a robust mapping.
Abstract: The Compressive Sensing (CS) framework aims to ease the burden on analog-to-digital converters (ADCs) by reducing the sampling rate required to acquire and stably recover sparse signals. Practical ADCs not only sample but also quantize each measurement to a finite number of bits; moreover, there is an inverse relationship between the achievable sampling rate and the bit depth. In this paper, we investigate an alternative CS approach that shifts the emphasis from the sampling rate to the number of bits per measurement. In particular, we explore the extreme case of 1-bit CS measurements, which capture just their sign. Our results come in two flavors. First, we consider ideal reconstruction from noiseless 1-bit measurements and provide a lower bound on the best achievable reconstruction error. We also demonstrate that i.i.d. random Gaussian matrices describe measurement mappings achieving, with overwhelming probability, nearly optimal error decay. Next, we consider reconstruction robustness to measurement errors and noise and introduce the Binary $\epsilon$-Stable Embedding (B$\epsilon$SE) property, which characterizes the robustness of the measurement process to sign changes. We show that the same class of matrices that provide almost optimal noiseless performance also enable such a robust mapping. On the practical side, we introduce the Binary Iterative Hard Thresholding (BIHT) algorithm for signal reconstruction from 1-bit measurements that offers state-of-the-art performance.

461 citations
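
A minimal numpy sketch of a BIHT-style iteration, assuming measurements y_sign in {-1, +1} and a known sparsity s. The step size and iteration count are illustrative, and the final normalization reflects that 1-bit measurements cannot recover the signal's scale.

```python
import numpy as np

def biht(A, y_sign, s, n_iter=100, step=None):
    """Binary Iterative Hard Thresholding sketch: a gradient-like step on
    sign consistency, followed by hard thresholding to the s largest entries.
    """
    m, N = A.shape
    step = step if step is not None else 1.0 / m
    x = np.zeros(N)
    for _ in range(n_iter):
        g = x + step * (A.T @ (y_sign - np.sign(A @ x)))
        x = np.zeros(N)
        idx = np.argsort(np.abs(g))[-s:]        # keep s largest magnitudes
        x[idx] = g[idx]
    return x / (np.linalg.norm(x) + 1e-12)      # unit norm: scale is lost
```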


Journal ArticleDOI
TL;DR: The denoising process is expressed as a linear expansion of thresholds (LET) that is optimized by relying on a purely data-adaptive unbiased estimate of the mean-squared error (MSE) derived in a non-Bayesian framework (PURE: Poisson-Gaussian unbiased risk estimate).
Abstract: We propose a general methodology (PURE-LET) to design and optimize a wide class of transform-domain thresholding algorithms for denoising images corrupted by mixed Poisson-Gaussian noise. We express the denoising process as a linear expansion of thresholds (LET) that we optimize by relying on a purely data-adaptive unbiased estimate of the mean-squared error (MSE), derived in a non-Bayesian framework (PURE: Poisson-Gaussian unbiased risk estimate). We provide a practical approximation of this theoretical MSE estimate for the tractable optimization of arbitrary transform-domain thresholding. We then propose a pointwise estimator for undecimated filterbank transforms, which consists of subband-adaptive thresholding functions with signal-dependent thresholds that are globally optimized in the image domain. We finally demonstrate the potential of the proposed approach through extensive comparisons with state-of-the-art techniques that are specifically tailored to the estimation of Poisson intensities. We also present denoising results obtained on real images of low-count fluorescence microscopy.

434 citations


Proceedings ArticleDOI
20 Jun 2011
TL;DR: This work uses an augmented Lagrangian optimization framework, which requires combining the proposed polynomial thresholding operator with the more traditional shrinkage-thresholding operator, to solve the problem of fitting one or more subspaces to a collection of data points drawn from the subspaces and corrupted by noise/outliers.
Abstract: We consider the problem of fitting one or more subspaces to a collection of data points drawn from the subspaces and corrupted by noise/outliers. We pose this problem as a rank minimization problem, where the goal is to decompose the corrupted data matrix as the sum of a clean, self-expressive, low-rank dictionary plus a matrix of noise/outliers. Our key contribution is to show that, for noisy data, this non-convex problem can be solved very efficiently and in closed form from the SVD of the noisy data matrix. Remarkably, this is true for both one or more subspaces. An important difference with respect to existing methods is that our framework results in a polynomial thresholding of the singular values with minimal shrinkage. Indeed, a particular case of our framework in the case of a single subspace leads to classical PCA, which requires no shrinkage. In the case of multiple subspaces, our framework provides an affinity matrix that can be used to cluster the data according to the sub-spaces. In the case of data corrupted by outliers, a closed-form solution appears elusive. We thus use an augmented Lagrangian optimization framework, which requires a combination of our proposed polynomial thresholding operator with the more traditional shrinkage-thresholding operator.

350 citations
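
The contrast the authors draw, minimal shrinkage versus the traditional shrinkage-thresholding operator, is easiest to see on the singular values. The sketch below shows only the two endpoints (hard truncation with no shrinkage, as in classical PCA, and soft shrinkage); the paper's polynomial operator is a different interpolation and is not reproduced here.

```python
import numpy as np

def sv_threshold(D, tau, mode="hard"):
    """Singular-value thresholding sketch.

    "hard" keeps retained singular values unshrunk (the minimal-shrinkage
    flavor; PCA-style truncation), while "soft" is the traditional
    shrinkage-thresholding operator.
    """
    U, sv, Vt = np.linalg.svd(D, full_matrices=False)
    if mode == "hard":
        sv_new = np.where(sv > tau, sv, 0.0)   # no shrinkage of survivors
    else:
        sv_new = np.maximum(sv - tau, 0.0)     # soft shrinkage
    return U @ np.diag(sv_new) @ Vt
```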


Journal ArticleDOI
TL;DR: Experimental results on a known database, achieving more than 94% accuracy in about 50 s for blood vessel detection, prove that blood vessels can be effectively detected by applying this method to retinal images.
Abstract: Retinal images can be used in several applications, such as ocular fundus operations as well as human recognition. Also, they play important roles in detection of some diseases in early stages, such as diabetes, which can be performed by comparison of the states of retinal blood vessels. Intrinsic characteristics of retinal images make the blood vessel detection process difficult. Here, we propose a new algorithm to detect the retinal blood vessels effectively. Due to the high ability of the curvelet transform in representing edges, modification of curvelet transform coefficients to enhance the retinal image edges better prepares the image for the segmentation stage. The directionality feature of the multistructure elements method makes it an effective tool in edge detection. Hence, morphology operators using multistructure elements are applied to the enhanced image in order to find the retinal image ridges. Afterward, morphological operators by reconstruction eliminate the ridges not belonging to the vessel tree while trying to preserve the thin vessels unchanged. In order to increase the efficiency of the morphological operators by reconstruction, they were applied using multistructure elements. A simple thresholding method along with connected components analysis (CCA) indicates the remaining ridges belonging to vessels. In order to utilize CCA more efficiently, we locally applied the CCA and length filtering instead of considering the whole image. Experimental results on a known database, DRIVE, achieving more than 94% accuracy in about 50 s for blood vessel detection, prove that the blood vessels can be effectively detected by applying our method to retinal images.

315 citations
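
The final cleanup stage, thresholding followed by connected components analysis, looks roughly like the sketch below. scipy's `label` stands in for the CCA, and a simple component-size cutoff stands in for length filtering; the paper applies both locally rather than to the whole image.

```python
import numpy as np
from scipy.ndimage import label

def threshold_and_cca(enhanced, thr, min_size=30):
    """Threshold an enhanced ridge image, then keep only connected
    components large enough to plausibly be vessel ridges (sketch only;
    min_size is an illustrative stand-in for length filtering).
    """
    binary = enhanced > thr
    labels, n = label(binary)                 # 4-connectivity by default
    sizes = np.bincount(labels.ravel())
    keep = np.zeros_like(sizes, dtype=bool)
    keep[1:] = sizes[1:] >= min_size          # drop small components
    return keep[labels]                       # boolean vessel mask
```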


Journal ArticleDOI
TL;DR: An encouraging result indicates that the proposed method may indeed outperform all manual approaches if no training data are available and the parameters associated with these methods are determined in a non-optimal way.
Abstract: This paper aims at contributing to the elaboration of new concepts for an efficient and standardized Synthetic Aperture Radar (SAR) based monitoring of floods. Algorithms that enable an automatic delineation of flooded areas are an essential component of any SAR-based monitoring service but are to date quasi non-existent. Here we propose a hybrid methodology, which combines radiometric thresholding and region growing as an approach enabling the automatic, objective and reliable flood extent extraction from SAR images. The method relies on the calibration of a statistical distribution of ‘open water’ backscatter values inferred from SAR images of floods. A radiometric thresholding provides the seed region for a subsequent region growing process. Change detection is included as an additional step that limits over-detection of inundated areas. Two variants of the proposed flood extraction algorithm (with and without integration of reference images) are tested against four state-of-the-art benchmark methods. The methods are evaluated through two case studies: the July 2007 flood of the Severn river (UK) and the February 1997 flood of the Red river (US). Our trial cases show that considering a reference pre- or post-flood image gives the same performance as optimized manual approaches. This encouraging result indicates that the proposed method may indeed outperform all manual approaches if no training data are available and the parameters associated with these methods are determined in a non-optimal way. The results further demonstrate the algorithm’s potential for accurately processing data from different SAR sensors.

313 citations
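
The core thresholding-plus-region-growing step can be sketched as a seeded flood fill: pixels below a strict seed threshold start the region, which then grows into neighbors below a more tolerant threshold. Both thresholds are assumed given here; in the paper they come from the calibrated 'open water' backscatter distribution.

```python
import numpy as np
from collections import deque

def threshold_region_grow(img, seed_thr, grow_thr):
    """Two-stage flood-mapping sketch: dark pixels (img <= seed_thr) seed
    a region that grows into 4-connected neighbors with img <= grow_thr.
    """
    mask = img <= seed_thr                    # radiometric seed threshold
    grown = mask.copy()
    queue = deque(zip(*np.nonzero(mask)))
    h, w = img.shape
    while queue:
        i, j = queue.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and not grown[ni, nj] \
                    and img[ni, nj] <= grow_thr:
                grown[ni, nj] = True          # grow into tolerant neighbor
                queue.append((ni, nj))
    return grown
```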


Journal ArticleDOI
TL;DR: The image analysis algorithm permits quantitative HRTEM, here specifically addressing the nanostructure of carbonaceous materials.

276 citations


Proceedings ArticleDOI
24 Mar 2011
TL;DR: A new method for thresholding of photomicrographs of diversely stained cytology smears is proposed, using different color spaces to account for the different stains, and a new local thresholding scheme is developed to solve the problem of nonuniform staining.
Abstract: Accurate cell nucleus segmentation is necessary for automated cytological image analysis. Thresholding is a crucial step in segmentation, and the accuracy of segmentation depends on the accuracy of thresholding. In this paper we propose a new method for thresholding of photomicrographs of diversely stained cytology smears. To account for the different stains, we use different color spaces. A new local thresholding scheme is developed to solve the problem of nonuniform staining. Finally, the results obtained from the new method are compared with those of some existing thresholding methods, clearly showing the improvement achieved.

256 citations
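
As a generic illustration of why a local scheme helps with nonuniform staining, the sketch below thresholds each pixel against the mean of its own neighborhood instead of a single global value. The window size, offset, and the assumption that nuclei are darker than their background are all illustrative; the paper's scheme additionally switches color spaces per stain.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_threshold(gray, window=51, offset=0.02):
    """Local (adaptive) thresholding sketch, assuming intensities in [0, 1]:
    a pixel is foreground if it is darker than its neighborhood mean minus
    a small offset, which tolerates slowly varying (nonuniform) staining.
    """
    local_mean = uniform_filter(gray.astype(float), size=window)
    return gray < local_mean - offset         # nuclei darker than background
```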


Journal ArticleDOI
TL;DR: This paper presents an approach to objectively select parameters for a region-growing segmentation technique to outline landslides as individual segments, and also addresses the scale dependence of landslides and false positives occurring in a natural landscape.
Abstract: To detect landslides by object-based image analysis using criteria based on shape, color, texture, and, in particular, contextual information and process knowledge, candidate segments must be delineated properly. This has proved challenging in the past, since segments are mainly created using spectral and size criteria that are not consistent for landslides. This paper presents an approach to objectively select parameters for a region-growing segmentation technique to outline landslides as individual segments and also addresses the scale dependence of landslides and false positives occurring in a natural landscape. Multiple scale parameters were determined using a plateau objective function derived from the spatial autocorrelation and intrasegment variance analysis, allowing for differently sized features to be identified. While a high-resolution Resourcesat-1 Linear Imaging and Self Scanning Sensor IV (5.8 m) multispectral image was used to create segments for landslide recognition, terrain curvature derived from a digital terrain model based on Cartosat-1 (2.5 m) data was used to create segments for subsequent landslide classification. Here, optimal segments were used in a knowledge-based classification approach with the thresholds of diagnostic parameters derived from K-means cluster analysis, to detect landslides of five different types, with an overall recognition accuracy of 76.9%. The approach, when tested in a geomorphologically dissimilar area, recognized landslides with an overall accuracy of 77.7%, without modification to the methodology. The multiscale classification-based segment optimization procedure was also able to reduce the error of commission significantly in comparison to a single-optimal-scale approach.

Journal ArticleDOI
TL;DR: The experimental results demonstrate that the proposed maximum entropy based artificial bee colony thresholding (MEABCT) algorithm can search for multiple thresholds which are very close to the optimal ones examined by the exhaustive search method.
Abstract: Multilevel thresholding is an important technique for image processing and pattern recognition. The maximum entropy thresholding (MET) has been widely applied in the literature. In this paper, a new multilevel MET algorithm based on the technology of the artificial bee colony (ABC) algorithm is proposed: the maximum entropy based artificial bee colony thresholding (MEABCT) method. Four different methods are compared to this proposed method: the particle swarm optimization (PSO), the hybrid cooperative-comprehensive learning based PSO algorithm (HCOCLPSO), the fast Otsu's method and the honey bee mating optimization (HBMO). The experimental results demonstrate that the proposed MEABCT algorithm can search for multiple thresholds which are very close to the optimal ones examined by the exhaustive search method. Compared to the other four thresholding methods, the MEABCT algorithm achieves segmentation results that are at least comparable, while its computation time is shorter than that of the other four methods.
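
The maximum-entropy criterion being optimized is the sum of Shannon entropies of the histogram classes cut out by the thresholds. The sketch below evaluates it and, for small numbers of thresholds, performs the exhaustive search that the ABC algorithm is designed to avoid.

```python
import numpy as np
from itertools import combinations

def kapur_entropy(hist, thresholds):
    """Kapur's maximum-entropy criterion: sum of Shannon entropies of the
    gray-level classes induced by `thresholds` (bin indices).
    """
    p = hist / hist.sum()
    edges = [0, *sorted(thresholds), len(p)]
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                     # class probability
        if w > 0:
            q = p[lo:hi][p[lo:hi] > 0] / w     # within-class distribution
            total += -(q * np.log(q)).sum()    # class entropy
    return total

def exhaustive_met(hist, k):
    """Exhaustive search over all k-threshold combinations (small k only)."""
    return max(combinations(range(1, len(hist)), k),
               key=lambda t: kapur_entropy(hist, t))
```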

Journal ArticleDOI
TL;DR: In this paper, a thresholding (denoted by Θ) based iterative procedure for outlier detection (Θ-IPOD) is proposed to both identify outliers and estimate regression coefficients; a version based on hard thresholding correctly identifies outliers where the soft-thresholding (L1 penalty) version fails to deliver a robust estimator.
Abstract: This article studies the outlier detection problem from the standpoint of penalized regression. In the regression model, we add one mean shift parameter for each of the n data points. We then apply a regularization favoring a sparse vector of mean shift parameters. The usual L1 penalty yields a convex criterion, but fails to deliver a robust estimator. The L1 penalty corresponds to soft thresholding. We introduce a thresholding (denoted by Θ) based iterative procedure for outlier detection (Θ–IPOD). A version based on hard thresholding correctly identifies outliers on some hard test problems. We describe the connection between Θ–IPOD and M-estimators. Our proposed method has one tuning parameter with which to both identify outliers and estimate regression coefficients. A data-dependent choice can be made based on the Bayes information criterion. The tuned Θ–IPOD shows outstanding performance in identifying outliers in various situations compared with other existing approaches. In addition, Θ–IPOD is much ...
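
The two thresholding operators the abstract contrasts are one-liners: soft thresholding shrinks everything it keeps (the source of the L1 penalty's bias), while hard thresholding keeps surviving entries unshrunk.

```python
import numpy as np

def soft_threshold(z, lam):
    """Soft thresholding: the proximal operator of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard_threshold(z, lam):
    """Hard thresholding: entries above the threshold pass through unshrunk."""
    return np.where(np.abs(z) > lam, z, 0.0)
```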

Journal ArticleDOI
13 Apr 2011 - Entropy
TL;DR: A global multi-level thresholding method for image segmentation using the Tsallis entropy as a general information theory entropy formalism and the artificial bee colony approach, which is more rapid than either genetic algorithm or particle swarm optimization.
Abstract: This paper proposes a global multi-level thresholding method for image segmentation. As a criterion for this, the traditional method uses the Shannon entropy, originated from information theory, considering the gray level image histogram as a probability distribution, while we applied the Tsallis entropy as a general information theory entropy formalism. For the algorithm, we used the artificial bee colony approach since execution of an exhaustive algorithm would be too time-consuming. The experiments demonstrate that: 1) the Tsallis entropy is superior to traditional maximum entropy thresholding, maximum between class variance thresholding, and minimum cross entropy thresholding; 2) the artificial bee colony is more rapid than either genetic algorithm or particle swarm optimization. Therefore, our approach is effective and rapid.

Journal ArticleDOI
TL;DR: Experimental results have demonstrated that the proposed low-complexity HTFCM approach can obtain better cluster quality and segmentation results than other segmentation approaches that employ the ant colony algorithm.

Journal ArticleDOI
01 Mar 2011
TL;DR: The proposed robust automatic crack-detection method for noisy concrete surface images includes two preprocessing steps and two detection steps; probabilistic relaxation is used to detect cracks coarsely while suppressing noise.
Abstract: In maintenance of concrete structures, crack detection is important for the inspection and diagnosis of concrete structures. However, it is difficult to detect cracks automatically. In this paper, we propose a robust automatic crack-detection method for noisy concrete surface images. The proposed method includes two preprocessing steps and two detection steps. The first preprocessing step is a subtraction process using the median filter to remove slight variations such as shading from concrete surface images; only an original image is used in the preprocessing. In the second preprocessing step, a multi-scale line filter with the Hessian matrix is used both to emphasize cracks against blebs or stains and to adapt to the width variation of cracks. After the preprocessing, probabilistic relaxation is used to detect cracks coarsely and to suppress noise. It is unnecessary to optimize any parameters in probabilistic relaxation. Finally, using the results from the relaxation process, a locally adaptive thresholding is performed to detect cracks more finely. We evaluate the robustness and accuracy of the proposed method quantitatively using 60 actual noisy concrete surface images.

Journal ArticleDOI
TL;DR: A synthetic aperture radar (SAR) automatic target recognition approach based on a global scattering center model that is much easier to implement and less sensitive to nonideal factors such as noise and pose estimation error than point-to-point matching is proposed.
Abstract: This paper proposes a synthetic aperture radar (SAR) automatic target recognition approach based on a global scattering center model. The scattering center model is established offline using range profiles at multiple viewing angles, so the original data amount is much less than that required for establishing SAR image templates. Scattering center features at different target poses can be conveniently predicted by this model. Moreover, the model can be modified to predict features for various target configurations. For the SAR image to be classified, regional features in different levels are extracted by thresholding and morphological operations. The regional features will be matched to the predicted scattering center features of different targets to arrive at a decision. This region-to-point matching is much easier to implement and is less sensitive to nonideal factors such as noise and pose estimation error than point-to-point matching. A matching scheme going through from coarse to fine regional features in the inner cycle and going through different pose hypotheses in the outer cycle is designed to improve the efficiency and robustness of the classifier. Experiments using both data predicted by a high-frequency electromagnetic (EM) code and data measured in the MSTAR program verify the validity of the method.

Journal ArticleDOI
TL;DR: An effective cloud detection approach, the Hybrid Thresholding Algorithm (HYTA) that fully exploits the benefits of the combination of fixed and adaptive thresholding methods is put forward.
Abstract: Cloud detection is the precondition for deriving other information (e.g., cloud cover) in ground-based sky imager applications. This paper puts forward an effective cloud detection approach, the Hybrid Thresholding Algorithm (HYTA) that fully exploits the benefits of the combination of fixed and adaptive thresholding methods. First, HYTA transforms an input color cloud image into a normalized blue/red channel ratio image that can keep a distinct contrast, even with noise and outliers. Then, HYTA identifies the ratio image as either unimodal or bimodal according to its standard deviation, and the unimodal and bimodal images are handled by fixed and minimum cross entropy (MCE) thresholding algorithms, respectively. The experimental results demonstrate that HYTA shows an accuracy of 88.53%, which is far higher than those of either fixed or MCE thresholding alone. Moreover, HYTA is also verified to outperform other state-of-the-art cloud detection approaches.
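
A rough sketch of the hybrid logic, with all constants illustrative: build the normalized blue/red ratio image, branch on its standard deviation, and threshold either with a fixed value (unimodal case) or adaptively (bimodal case). Otsu's criterion is used below as a stand-in for the minimum cross entropy threshold used in the paper.

```python
import numpy as np

def hyta_like_cloud_mask(rgb, std_thr=0.03, fixed_thr=0.25):
    """Hybrid thresholding sketch on the normalized blue/red ratio.
    Clouds are whitish (blue roughly equals red), so their ratio is low.
    """
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    ratio = (b - r) / (b + r + 1e-6)          # normalized blue/red ratio
    if ratio.std() < std_thr:                 # unimodal: fixed threshold
        return ratio < fixed_thr
    # bimodal: adaptive threshold (Otsu stand-in for MCE)
    hist, edges = np.histogram(ratio, bins=256)
    p = hist / hist.sum()
    w = np.cumsum(p)                          # class-0 probability
    mu = np.cumsum(p * edges[:-1])            # class-0 cumulative mean mass
    between = (mu[-1] * w - mu) ** 2 / (w * (1 - w) + 1e-12)
    thr = edges[np.argmax(between)]
    return ratio < thr
```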

Journal ArticleDOI
TL;DR: The experimental results show that the proposed FF-based MCET algorithm can efficiently search for multiple thresholds which are very close to the optimal ones examined by the exhaustive search method when the number of thresholds is less than 5.
Abstract: The minimum cross entropy thresholding (MCET) has been widely applied in image thresholding. In this paper, the search mechanism of the firefly algorithm, inspired by the social behavior of firefly swarms and the phenomenon of bioluminescent communication, is used to search for multilevel thresholds for image segmentation. This new multilevel thresholding algorithm is called the firefly-based minimum cross entropy thresholding (FF-based MCET) algorithm. Four different methods, namely the exhaustive search, the particle swarm optimization (PSO), the quantum particle swarm optimization (QPSO) and the honey bee mating optimization (HBMO) methods, are implemented for comparison with the results of the proposed method. The experimental results show that the proposed FF-based MCET algorithm can efficiently search for multiple thresholds which are very close to the optimal ones examined by the exhaustive search method when the number of thresholds is less than 5. The FF-based MCET algorithm requires the least computation time; meanwhile, its results are superior to those of the PSO-based and QPSO-based MCET algorithms and not significantly different from those of the HBMO-based MCET algorithm.
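
For reference, the MCET objective that the firefly search minimizes can be written down directly. The sketch below uses the standard reduced form of Li's cross-entropy criterion on a gray-level histogram (gray levels shifted by one to avoid log 0, a common implementation convention).

```python
import numpy as np

def min_cross_entropy(hist, thresholds):
    """Li's minimum cross entropy criterion (to be minimized): penalizes
    the information loss of replacing each class's gray levels with the
    class mean. `thresholds` are histogram bin indices.
    """
    p = hist / hist.sum()
    g = np.arange(1, len(p) + 1)              # shifted gray levels
    edges = [0, *sorted(thresholds), len(p)]
    cost = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mass = (p[lo:hi] * g[lo:hi]).sum()
            cost -= mass * np.log(mass / w)   # -sum(g*p) * log(class mean)
    return cost
```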

Journal ArticleDOI
TL;DR: An effective traffic surveillance system for detecting and tracking moving vehicles in nighttime traffic scenes using image segmentation and pattern analysis techniques based on automatic multilevel histogram thresholding.
Abstract: This paper presents an effective traffic surveillance system for detecting and tracking moving vehicles in nighttime traffic scenes. The proposed method identifies vehicles by detecting and locating vehicle headlights and taillights using image segmentation and pattern analysis techniques. First, a fast bright-object segmentation process based on automatic multilevel histogram thresholding is applied to effectively extract bright objects of interest. This automatic multilevel thresholding approach provides a robust and adaptable detection system that operates well under various nighttime illumination conditions. The extracted bright objects are then processed by a spatial clustering and tracking procedure that locates and analyzes the spatial and temporal features of vehicle light patterns, and identifies and classifies moving cars and motorbikes in traffic scenes. The proposed real-time vision system has also been implemented and evaluated on a TI DM642 DSP-based embedded platform. The system is set up on elevated platforms to perform traffic surveillance on real highways and urban roads. Experimental results demonstrate that the proposed traffic surveillance approach is feasible and effective for vehicle detection and identification in various nighttime environments.

Journal ArticleDOI
TL;DR: A bacterial foraging (BF) algorithm based multilevel thresholding is presented, which is used to find the optimal threshold values for maximizing the Kapur's and Otsu's objective functions.
Abstract: The conventional multilevel thresholding methods are efficient for bi-level thresholding. However, they are computationally expensive extending to multilevel thresholding since they exhaustively search the optimal thresholds to optimize the objective functions. To overcome this drawback, a bacterial foraging (BF) algorithm based multilevel thresholding is presented in this paper. The BF algorithm is used to find the optimal threshold values for maximizing the Kapur's and Otsu's objective functions. The feasibility of the proposed BF technique has been tested on ten standard test images and benchmarked with particle swarm optimization algorithm (PSO) and genetic algorithm (GA). Experimental results of both qualitative and quantitative comparative studies for several existing methods illustrate the effectiveness and robustness of the proposed algorithm.
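
Of the two objectives named above, Otsu's is the between-class variance of the classes induced by the thresholds. Here is a sketch of evaluating it for a candidate threshold vector, i.e. the quantity the BF search maximizes:

```python
import numpy as np

def otsu_between_class_variance(hist, thresholds):
    """Otsu's multilevel objective: weighted between-class variance of the
    gray-level classes cut out by `thresholds` (bin indices).
    """
    p = hist / hist.sum()
    levels = np.arange(len(p))
    mu_total = (p * levels).sum()
    edges = [0, *sorted(thresholds), len(p)]
    var = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                    # class weight
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2   # between-class contribution
    return var
```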

Journal ArticleDOI
TL;DR: The findings affirm the robustness, fast convergence and proficiency of the proposed MBF over other existing techniques, and show that the Otsu-based optimization method converges more quickly than Kapur's method.

Posted Content
TL;DR: In this article, the authors proposed an algorithm based on band exclusion (BE) and local optimization (LO) to deal with high coherent sensing matrices in discretization of continuum imaging problems such as radar and medical imaging.
Abstract: Highly coherent sensing matrices arise in discretization of continuum imaging problems such as radar and medical imaging when the grid spacing is below the Rayleigh threshold. Algorithms based on techniques of band exclusion (BE) and local optimization (LO) are proposed to deal with such coherent sensing matrices. These techniques are embedded in the existing compressed sensing algorithms such as Orthogonal Matching Pursuit (OMP), Subspace Pursuit (SP), Iterative Hard Thresholding (IHT), Basis Pursuit (BP) and Lasso, and result in the modified algorithms BLOOMP, BLOSP, BLOIHT, BP-BLOT and Lasso-BLOT, respectively. Under appropriate conditions, it is proved that BLOOMP can reconstruct sparse, widely separated objects up to one Rayleigh length in the Bottleneck distance independent of the grid spacing. One of the most distinguishing attributes of BLOOMP is its capability of dealing with large dynamic ranges. The BLO-based algorithms are systematically tested with respect to four performance metrics: dynamic range, noise stability, sparsity and resolution. With respect to dynamic range and noise stability, BLOOMP is the best performer. With respect to sparsity, BLOOMP is the best performer for high dynamic range, while for dynamic range near unity BP-BLOT and Lasso-BLOT with the optimized regularization parameter have the best performance. In the noiseless case, BP-BLOT has the highest resolving power up to a certain dynamic range. The algorithms BLOSP and BLOIHT are good alternatives to BLOOMP and BP/Lasso-BLOT: they are faster than both BLOOMP and BP/Lasso-BLOT and share, to a lesser degree, BLOOMP's favorable behavior with respect to dynamic range. Detailed comparisons with existing algorithms such as Spectral Iterative Hard Thresholding (SIHT) and the frame-adapted BP are given.

Journal ArticleDOI
TL;DR: An automatic border detection method is presented that determines the optimal color channels and performs hybrid thresholding to detect lesion borders; it is shown to be highly competitive with three state-of-the-art border detection methods and potentially faster, since it mainly involves scalar processing as opposed to the vector processing performed in the other methods.

Journal ArticleDOI
TL;DR: An improved image reconstruction method from undersampled k-space data, low-dimensional-structure self-learning and thresholding (LOST), which utilizes structure from the underlying image to improve image reconstruction for accelerated coronary MRI acquisitions, is presented.
Abstract: An improved image reconstruction method from undersampled k-space data, low-dimensional-structure self-learning and thresholding (LOST), which utilizes the structure from the underlying image is presented. A low-resolution image from the fully sampled k-space center is reconstructed to learn image patches of similar anatomical characteristics. These patches are arranged into "similarity clusters," which are subsequently processed for dealiasing and artifact removal, using underlying low-dimensional properties. The efficacy of the proposed method in scan time reduction was assessed in a pilot coronary MRI study. Initially, in a retrospective study on 10 healthy adult subjects, we evaluated retrospective undersampling and reconstruction using LOST, wavelet-based l1-norm minimization, and total variation compressed sensing. Quantitative measures of vessel sharpness and mean square error, and qualitative image scores were used to compare reconstruction for rates of 2, 3, and 4. Subsequently, in a prospective study, coronary MRI data were acquired using these rates, and LOST-reconstructed images were compared with an accelerated data acquisition using uniform undersampling and sensitivity encoding reconstruction. Subjective image quality and sharpness data indicate that LOST outperforms the alternative techniques for all rates. The prospective LOST yields images with superior quality compared with sensitivity encoding or l1-minimization compressed sensing. The proposed LOST technique greatly improves image reconstruction for accelerated coronary MRI acquisitions.

Journal ArticleDOI
TL;DR: An explicit algorithm for the minimization of an l1-penalized least-squares functional, with non-separable l1 term, is proposed and a 1/N convergence rate is derived for the functional.
Abstract: An explicit algorithm for the minimization of an l1-penalized least-squares functional, with non-separable l1 term, is proposed. Each step in the iterative algorithm requires four matrix vector multiplications and a single simple projection on a convex set (or equivalently thresholding). Convergence is proven and a 1/N convergence rate is derived for the functional. In the special case where the matrix in the l1 term is the identity (or orthogonal), the algorithm reduces to the traditional iterative soft-thresholding algorithm. In the special case where the matrix in the quadratic term is the identity (or orthogonal), the algorithm reduces to a gradient projection algorithm for the dual problem. By replacing the projection with a simple proximity operator, other convex non-separable penalties than those based on an l1-norm can be handled as well.
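
The separable special case that the algorithm reduces to is classical ISTA, sketched below for min 0.5||Ax - y||^2 + lam*||x||_1. The paper's contribution is handling a non-separable l1 term ||Bx||_1 with an extra projection step, which this sketch does not include.

```python
import numpy as np

def ista(A, y, lam, n_iter=200):
    """Classical iterative soft-thresholding (ISTA): the B = I special
    case that the algorithm above reduces to.
    """
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of grad
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - (A.T @ (A @ x - y)) / L       # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x
```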

Journal ArticleDOI
TL;DR: A new variant of Particle Swarm Optimization (PSO) for image segmentation using optimal multi-level thresholding and makes a new contribution in adapting 'social' and 'momentum' components of the velocity equation for particle move updates.
Abstract: Research highlights: (1) Our algorithm is a new variant of PSO. (2) It performs better than existing variants of PSO. (3) It performs better than established optimization algorithms, such as Gaussian smoothing. (4) It was successfully applied to the image segmentation problem. In this paper, we present a new variant of Particle Swarm Optimization (PSO) for image segmentation using optimal multi-level thresholding. Some objective functions which are very efficient for bi-level thresholding are not suitable for multi-level thresholding due to the exponential growth of computational complexity. The present paper also proposes an iterative scheme that is practically more suitable for obtaining initial values of candidate multilevel thresholds. This self-iterative scheme is proposed to find the suitable number of thresholds that should be used to segment an image. The iterative scheme is based on the well-known Otsu's method, which shows a linear growth of computational complexity. The thresholds resulting from the iterative scheme are taken as initial thresholds, and the particles are created randomly around these thresholds for the proposed PSO variant. The proposed PSO algorithm makes a new contribution in adapting the 'social' and 'momentum' components of the velocity equation for particle move updates. The proposed segmentation method is employed on four benchmark images and the performances obtained outperform results obtained with well-known methods, such as the Gaussian-smoothing method (Lim, Y. K., & Lee, S. U. (1990). On the color image segmentation algorithm based on the thresholding and the fuzzy c-means techniques. Pattern Recognition, 23, 935-952; Tsai, D. M. (1995). A fast thresholding selection procedure for multimodal and unimodal histograms. Pattern Recognition Letters, 16, 653-666), the symmetry-duality method (Yin, P. Y., & Chen, L. H. (1993). New method for multilevel thresholding using the symmetry and duality of the histogram. Journal of Electronics and Imaging, 2, 337-344), a GA-based algorithm (Yin, P.-Y. (1999). A fast scheme for optimal thresholding using genetic algorithms. Signal Processing, 72, 85-95) and the basic PSO variant employing a linearly decreasing inertia weight factor.
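
A bare-bones PSO over threshold vectors shows where the velocity equation's 'social' and 'momentum' terms live; the paper's variant adapts those terms, which this textbook-form sketch deliberately does not reproduce. The `objective` is assumed to be a multilevel criterion such as Otsu's between-class variance evaluated on a histogram.

```python
import numpy as np

def pso_thresholds(objective, k, n_particles=30, n_iter=100,
                   w=0.7, c1=1.5, c2=1.5, lo=1, hi=255, seed=0):
    """Standard-form PSO maximizing `objective` over k threshold values."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lo, hi, (n_particles, k))
    vel = np.zeros_like(pos)

    def score(p):
        return objective(np.round(np.sort(p)).astype(int))

    pbest = pos.copy()
    pbest_val = np.array([score(p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1 = rng.random(pos.shape)
        r2 = rng.random(pos.shape)
        vel = (w * vel                        # momentum (inertia) term
               + c1 * r1 * (pbest - pos)      # cognitive term
               + c2 * r2 * (gbest - pos))     # social term
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([score(p) for p in pos])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return np.round(np.sort(gbest)).astype(int)
```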

Proceedings Article
01 Aug 2011
TL;DR: Experimental results reveal that the proposed multiscale reconstruction preserves the fast computation associated with block-based compressed sensing while rivaling the reconstruction quality of a popular total-variation algorithm known both for its high-quality reconstruction and for its exceedingly large computational cost.
Abstract: A multiscale variant of the block compressed sensing with smoothed projected Landweber reconstruction algorithm is proposed for the compressed sensing of images. In essence, block-based compressed-sensing sampling is deployed independently within each subband of each decomposition level of a wavelet transform of an image. The corresponding multiscale reconstruction interleaves Landweber steps on the individual blocks with a smoothing filter in the spatial domain of the image as well as thresholding within a sparsity transform. Experimental results reveal that the proposed multiscale reconstruction preserves the fast computation associated with block-based compressed sensing while rivaling the reconstruction quality of a popular total-variation algorithm known both for its high-quality reconstruction and for its exceedingly large computational cost.

Journal ArticleDOI
TL;DR: The experiments that were performed on a bitemporal TerraSAR-X StripMap data set from South West England during and after a large-scale flooding in 2007 confirm the effectiveness of the proposed change detection method and show an increased classification accuracy of the hybrid MRF model in comparison to the sole application of the HMAP estimation.
Abstract: The near real-time provision of precise information about flood dynamics from synthetic aperture radar (SAR) data is an essential task in disaster management. A novel tile-based parametric thresholding approach under the generalized Gaussian assumption is applied on normalized change index data to automatically solve the three-class change detection problem in large-size images with small class a priori probabilities. The thresholding result is used for the initialization of a hybrid Markov model which integrates scale-dependent and spatiocontextual information into the labeling process by combining hierarchical with noncausal Markov image modeling. Hierarchical maximum a posteriori (HMAP) estimation using the Markov chains in scale, originally developed on quadtrees, is adapted to hierarchical irregular graphs. To reduce the computational effort of the iterative optimization process that is related to noncausal Markov models, a Markov random field (MRF) approach is defined, which is applied on a restricted region of the lowest level of the graph, selected according to the HMAP labeling result. The experiments that were performed on a bitemporal TerraSAR-X StripMap data set from South West England during and after a large-scale flooding in 2007 confirm the effectiveness of the proposed change detection method and show an increased classification accuracy of the hybrid MRF model in comparison to the sole application of the HMAP estimation. Additionally, the impact of the graph structure and the chosen model parameters on the labeling result as well as on the performance is discussed.

Journal ArticleDOI
TL;DR: This work presents a simple but effective denoising algorithm using a local DCT thresholding applied separately to each color channel after decorrelation that can be considered as a baseline for comparison and lower bound of performance for newly developed techniques.
Abstract: This work presents a simple but effective denoising algorithm using local DCT thresholding. The thresholding is applied separately to each color channel after decorrelation. Due to its simplicity and excellent performance, this contribution can be considered a baseline for comparison and a lower bound of performance for newly developed techniques.
Source Code: The source code (ANSI C), its documentation, and the online demo are accessible at the IPOL web page of this article.
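
The baseline is simple enough to sketch in a few lines of Python (the published implementation is ANSI C). This version hard-thresholds the DCT coefficients of non-overlapping tiles of a single channel; the actual method uses sliding (overlapping) windows with aggregation and a color decorrelation step, and the 3-sigma threshold multiplier is an illustrative choice.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise_channel(img, sigma, patch=16, thr_mult=3.0):
    """Tile-wise DCT hard-thresholding denoiser sketch for one channel.
    `sigma` is the noise standard deviation in pixel units; with the
    orthonormal DCT, coefficient noise has the same standard deviation.
    Borders not covered by a full tile are left unchanged.
    """
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            block = out[i:i + patch, j:j + patch]
            c = dctn(block, norm='ortho')
            dc = c[0, 0]                              # preserve the mean
            c[np.abs(c) < thr_mult * sigma] = 0.0     # kill noisy coeffs
            c[0, 0] = dc
            out[i:i + patch, j:j + patch] = idctn(c, norm='ortho')
    return out
```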