Proceedings ArticleDOI

Stable sparse approximations via nonconvex optimization

Abstract
We present theoretical results pertaining to the ability of ℓp minimization to recover sparse and compressible signals from incomplete and noisy measurements. In particular, we extend the results of Candes, Romberg and Tao (2005) to the p < 1 case. Our results indicate that depending on the restricted isometry constants (see, e.g., Candes and Tao (2006; 2005)) and the noise level, ℓp minimization with certain values of p < 1 provides better theoretical guarantees in terms of stability and robustness than ℓ1 minimization does. This is especially true when the restricted isometry constants are relatively large.
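Since ℓp minimization with p < 1 is nonconvex, such problems are in practice often attacked with a smoothed iteratively reweighted least squares heuristic. The sketch below is a minimal illustration of that idea, not the algorithm analyzed in this paper; the function name, smoothing schedule, and demo dimensions are assumptions.

```python
import numpy as np

def irls_lp(A, y, p=0.5, n_iter=50, eps=1.0):
    """Heuristic lp (p < 1) recovery via iteratively reweighted
    least squares with an annealed smoothing parameter eps."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # least-squares start
    for _ in range(n_iter):
        # weights w_i = (x_i^2 + eps)^(p/2 - 1) smooth the lp penalty
        w = (x**2 + eps) ** (p / 2.0 - 1.0)
        # weighted minimum-norm solution of A x = y:
        #   x = W^{-1} A^T (A W^{-1} A^T)^{-1} y
        winv = 1.0 / w
        x = winv * (A.T @ np.linalg.solve((A * winv) @ A.T, y))
        eps = max(eps / 10.0, 1e-8)  # anneal the smoothing toward zero
    return x

# tiny demo: 3-sparse signal, 40 Gaussian measurements in dimension 100
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = irls_lp(A, y, p=0.5)
```

Each inner step solves a weighted minimum-norm problem in closed form; annealing eps toward zero steers the smooth surrogate toward the true ℓp objective.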


Citations
Journal ArticleDOI

Enhancing Sparsity by Reweighted ℓ1 Minimization

TL;DR: A novel method for sparse signal recovery that in many situations outperforms ℓ1 minimization in the sense that substantially fewer measurements are needed for exact recovery.
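The reweighting idea summarized above alternates a weighted ℓ1 solve with a weight update that penalizes small entries more heavily. A minimal sketch, assuming SciPy's `linprog` for the inner convex solve (the function name, weight update constant, and demo sizes are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(A, y, n_iter=4, eps=0.1):
    """Each pass solves the weighted l1 problem
    min sum_i w_i |x_i|  s.t.  A x = y, then updates the weights."""
    m, n = A.shape
    w = np.ones(n)
    x = np.zeros(n)
    for _ in range(n_iter):
        # split x = u - v with u, v >= 0 to obtain a linear program
        res = linprog(c=np.concatenate([w, w]),
                      A_eq=np.hstack([A, -A]), b_eq=y,
                      bounds=(0, None), method="highs")
        x = res.x[:n] - res.x[n:]
        # entries near zero get large weights, promoting sparsity
        w = 1.0 / (np.abs(x) + eps)
    return x

# demo: 3-sparse signal, 30 Gaussian measurements in dimension 80
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 80))
x_true = np.zeros(80)
x_true[[3, 20, 55]] = [2.0, -1.0, 0.5]
y = A @ x_true
x_hat = reweighted_l1(A, y)
```

The first pass is plain ℓ1 minimization (all weights equal); subsequent passes mimic the nonconvex ℓp objective while keeping each subproblem convex.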
Journal ArticleDOI

Iteratively reweighted least squares minimization for sparse recovery

TL;DR: In this article, an alternative method of determining x, as the limit of an iteratively reweighted least squares (IRLS) algorithm, is proposed; the main step of this IRLS finds, for a given weight vector w, the element of Φ⁻¹(y) with smallest ℓ2(w)-norm.
Journal ArticleDOI

A Survey of Sparse Representation: Algorithms and Applications

TL;DR: A comprehensive overview of sparse representation is provided, together with an experimental comparison of sparse representation algorithms that reveals their relative strengths and limitations.
Journal ArticleDOI

Restricted isometry properties and nonconvex compressive sensing

TL;DR: This work generalizes an earlier result showing that ℓp minimization with 0 < p < 1 recovers sparse signals from fewer linear measurements than ℓ1 minimization does, to an ℓp variant of the restricted isometry property, and determines how many random Gaussian measurements suffice for the condition to hold with high probability.
Proceedings ArticleDOI

Fast algorithms for nonconvex compressive sensing: MRI reconstruction from very few data

TL;DR: This work extends recent Fourier-based algorithms for convex optimization to the nonconvex setting, obtaining methods that combine the reconstruction abilities of previous nonconvex approaches with the computational speed of state-of-the-art convex methods.
References
Book

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Journal ArticleDOI

Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information

TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that with probability at least 1 − O(N⁻ᴹ), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Journal ArticleDOI

Decoding by linear programming

TL;DR: f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program); numerical experiments suggest that this recovery procedure works unreasonably well, recovering f exactly even when a significant fraction of the output is corrupted.
Journal ArticleDOI

Stable signal recovery from incomplete and inaccurate measurements

TL;DR: In this paper, the authors considered the problem of recovering a vector x ∈ R^m from incomplete and contaminated observations y = Ax + e, where e is an error term.
Journal ArticleDOI

Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?

TL;DR: If the objects of interest are sparse in a fixed basis or compressible, then it is possible to reconstruct f to within very high accuracy from a small number of random measurements by solving a simple linear program.