Open Access · Posted Content

Enhancing Sparsity by Reweighted L1 Minimization

TLDR
In this article, a method for sparse signal recovery is studied that solves a sequence of weighted L1-minimization problems, where the weights used for the next iteration are computed from the value of the current solution; a series of experiments demonstrates the remarkable performance and broad applicability of this algorithm in the areas of sparse signal recovery, statistical estimation, error correction and image processing.
Abstract
It is now well understood that (1) it is possible to reconstruct sparse signals exactly from what appear to be highly incomplete sets of linear measurements and (2) that this can be done by constrained L1 minimization. In this paper, we study a novel method for sparse signal recovery that in many situations outperforms L1 minimization in the sense that substantially fewer measurements are needed for exact recovery. The algorithm consists of solving a sequence of weighted L1-minimization problems where the weights used for the next iteration are computed from the value of the current solution. We present a series of experiments demonstrating the remarkable performance and broad applicability of this algorithm in the areas of sparse signal recovery, statistical estimation, error correction and image processing. Interestingly, superior gains are also achieved when our method is applied to recover signals with assumed near-sparsity in overcomplete representations--not by reweighting the L1 norm of the coefficient sequence as is common, but by reweighting the L1 norm of the transformed object. An immediate consequence is the possibility of highly efficient data acquisition protocols by improving on a technique known as compressed sensing.
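
To make the iteration concrete, the following is a minimal sketch of the reweighted L1 scheme described in the abstract, assuming noiseless measurements y = Ax and using CVXPY as the convex solver; the epsilon value, iteration count, and problem sizes are illustrative choices, not values prescribed by the paper.

import numpy as np
import cvxpy as cp

def reweighted_l1(A, y, n_iter=5, eps=0.1):
    # Iterative reweighted ell-1: solve a weighted ell-1 problem, then recompute
    # the weights from the current solution (large weights on small coefficients)
    # and repeat.
    n = A.shape[1]
    w = np.ones(n)                       # first pass is the plain ell-1 problem
    x_val = np.zeros(n)
    for _ in range(n_iter):
        x = cp.Variable(n)
        objective = cp.Minimize(cp.sum(cp.multiply(w, cp.abs(x))))
        cp.Problem(objective, [A @ x == y]).solve()
        x_val = x.value
        w = 1.0 / (np.abs(x_val) + eps)  # eps keeps the weights finite at zero entries
    return x_val

# Example: recover a 10-sparse signal of length 256 from 100 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 256, 100, 10
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
x_hat = reweighted_l1(A, A @ x_true)
print("max recovery error:", np.max(np.abs(x_hat - x_true)))

The first pass (uniform weights) is ordinary L1 minimization; subsequent passes penalize small coefficients more heavily, which is what allows recovery from fewer measurements in the paper's experiments.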


Citations
Journal Article

Non-parametric seismic data recovery with curvelet frames

TL;DR: A non-parametric transform-based recovery method is presented that exploits the compression of seismic data volumes by recently developed curvelet frames and performs well on synthetic as well as real data by virtue of the sparsifying property of curvelets.
Proceedings Article

An efficient algorithm for compressed MR imaging using total variation and wavelets

TL;DR: This work proposes an efficient algorithm that jointly minimizes the $\ell_1$ norm, total variation, and a least-squares measure, one of the most powerful models for compressive MR imaging, based upon an iterative operator-splitting framework.
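
For reference, a generic objective of this joint form (symbols chosen here for illustration, not taken from that paper) is

\[
\min_{u} \; \alpha \,\|u\|_{TV} + \beta \,\|\Psi u\|_{1} + \tfrac{1}{2}\,\|F_{p} u - b\|_{2}^{2},
\]

where $u$ is the image, $\Psi$ a sparsifying (e.g. wavelet) transform, $F_{p}$ the partial Fourier sampling operator, and $b$ the measured k-space data.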
Journal Article

Least angle and $\ell_1$ penalized regression: A review

TL;DR: This paper provides an explanation for the similar behavior of LASSO and forward stagewise regression, along with a fast implementation of both.
Journal Article

Sparse Signal Reconstruction via Iterative Support Detection

TL;DR: An efficient implementation of ISD, called threshold-ISD, is introduced for recovering signals with fast-decaying distributions of nonzeros from compressive sensing measurements, and is compared against two state-of-the-art algorithms: the iterative reweighted $\ell_1$ minimization algorithm (IRL1) and the iterative reweighted least-squares algorithm (IRLS).
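
As a hedged sketch of the support-detection idea (the threshold rule below is illustrative): given a current estimate $x^{(k)}$, detect a support set by thresholding and then solve a truncated $\ell_1$ problem that leaves the detected entries unpenalized,

\[
T_k = \{\, i : |x^{(k)}_i| > \tau_k \,\}, \qquad
x^{(k+1)} = \arg\min_{x} \Big\{ \sum_{i \notin T_k} |x_i| \; : \; A x = b \Big\}.
\]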
Journal Article

Dictionary Learning for Sparse Approximations With the Majorization Method

TL;DR: A novel method for dictionary learning is presented that extends the learning problem by introducing different constraints on the dictionary, using the majorization method, an optimization approach that substitutes the original objective function with a surrogate function updated in each optimization step.
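
The majorization (majorization-minimization) step referred to here follows the standard pattern: replace the objective $f$ by a surrogate $g$ that upper-bounds it and touches it at the current iterate, then minimize the surrogate,

\[
g(x \mid x^{(k)}) \ge f(x) \ \text{for all } x, \qquad
g(x^{(k)} \mid x^{(k)}) = f(x^{(k)}), \qquad
x^{(k+1)} = \arg\min_{x} \, g(x \mid x^{(k)}),
\]

which guarantees $f(x^{(k+1)}) \le f(x^{(k)})$.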
References
Journal Article

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
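
Concretely, the constrained least-squares problem described here takes the form

\[
(\hat{\alpha}, \hat{\beta}) = \arg\min_{\alpha,\,\beta} \sum_{i=1}^{n} \Big( y_i - \alpha - \sum_{j} \beta_j x_{ij} \Big)^{2}
\quad \text{subject to} \quad \sum_{j} |\beta_j| \le t .
\]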
Book

Convex Optimization

TL;DR: A comprehensive introduction to convex optimization, with an emphasis on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (called Basis Pursuit in signal processing).
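
Basis Pursuit here refers to the standard $\ell_1$ recovery problem, which can be recast as a linear program by splitting the absolute values (a standard reformulation, shown for illustration):

\[
\min_{x} \|x\|_{1} \ \ \text{s.t.} \ A x = y
\qquad \Longleftrightarrow \qquad
\min_{x,\,u} \mathbf{1}^{\top} u \ \ \text{s.t.} \ -u \le x \le u, \ A x = y .
\]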
Journal Article

Nonlinear total variation based noise removal algorithms

TL;DR: In this article, a constrained optimization type of numerical algorithm for removing noise from images is presented, where the total variation of the image is minimized subject to constraints involving the statistics of the noise.
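
The constrained formulation referred to (the Rudin-Osher-Fatemi model) minimizes total variation subject to fidelity constraints matching the known noise statistics; a standard statement is

\[
\min_{u} \int |\nabla u| \, dx\,dy
\quad \text{s.t.} \quad
\int (u - u_0) \, dx\,dy = 0, \qquad \int (u - u_0)^{2} \, dx\,dy = \sigma^{2},
\]

where $u_0$ is the noisy image and $\sigma^{2}$ the noise variance.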
Journal Article

Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information

TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that with probability at least $1 - O(N^{-M})$, f can be reconstructed exactly as the solution to the $\ell_1$ minimization problem.
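
The minimization problem in question recovers f from its Fourier coefficients observed on a set $\Omega$; in the discrete setting it reads (notation illustrative)

\[
\min_{g} \; \|g\|_{\ell_1} = \sum_{t} |g(t)|
\quad \text{s.t.} \quad
\hat{g}(\omega) = \hat{f}(\omega) \ \text{for all } \omega \in \Omega .
\]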