Proceedings ArticleDOI
A simpler approach to weighted ℓ1 minimization
Anilesh K. Krishnaswamy, Samet Oymak, Babak Hassibi +2 more
pp. 3621–3624
TL;DR: The performance of weighted ℓ1 minimization over a non-uniform sparse signal model is analyzed by extending the "Gaussian width" analysis proposed in [1], and a heuristic for estimating the optimal weights is provided.
Abstract: In this paper, we analyze the performance of weighted ℓ1 minimization over a non-uniform sparse signal model by extending the "Gaussian width" analysis proposed in [1]. Our results are consistent with those of [7], which are currently the best known. However, our methods are less computationally intensive and can easily be extended to signals with more than two sparsity classes. Finally, we also provide a heuristic for estimating the optimal weights, building on a more general model presented in [11]. Our results reinforce the fact that weighted ℓ1 minimization is substantially better than regular ℓ1 minimization, and they provide an easy way to calculate the optimal weights.
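The weighted ℓ1 program discussed in the abstract, min Σᵢ wᵢ|xᵢ| subject to Ax = y, can be sketched with the standard linear-programming reformulation (auxiliary variables t with −t ≤ x ≤ t). This is a generic illustration of the optimization problem being analyzed, not the paper's method; the helper name `weighted_l1_min` is hypothetical.

```python
import numpy as np
from scipy.optimize import linprog


def weighted_l1_min(A, y, w):
    """Solve min_x sum_i w_i |x_i| subject to A x = y.

    Uses the standard LP reformulation with variables (x, t):
    minimize w^T t subject to -t <= x <= t and A x = y.
    (Illustrative sketch; `weighted_l1_min` is a hypothetical helper.)
    """
    m, n = A.shape
    # Objective: zero cost on x, weight w_i on each slack t_i.
    c = np.concatenate([np.zeros(n), w])
    # Inequalities:  x - t <= 0  and  -x - t <= 0  encode |x_i| <= t_i.
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])
    b_ub = np.zeros(2 * n)
    # Equality constraints act on x only; t is free of them.
    A_eq = np.hstack([A, np.zeros((m, n))])
    bounds = [(None, None)] * n + [(0, None)] * n  # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y, bounds=bounds)
    return res.x[:n]
```

For a non-uniform sparse model with two sparsity classes, one would pass smaller weights on the entries believed more likely to be nonzero and larger weights elsewhere; with w = 1 everywhere this reduces to regular ℓ1 minimization.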
Citations
Posted Content
Weighted $\ell_1$-Minimization for Sparse Recovery under Arbitrary Prior Information
TL;DR: This work studies the recovery conditions and the associated recovery guarantees of weighted $\ell_1$-minimization when arbitrarily many distinct weights are permitted, and includes numerical experiments that demonstrate the benefits of allowing non-uniform weights in the reconstruction procedure.
Proceedings ArticleDOI
Recovery threshold for optimal weight ℓ1 minimization
TL;DR: This paper gives a simple closed-form expression for the minimum number of measurements required for successful recovery when the optimal weights are chosen, and shows that this number is upper bounded by the sum of the minimum numbers of measurements needed had the S1 and S2 components of the signal been measured separately.
Proceedings Article
Weighted Graph Clustering with Non-Uniform Uncertainties
TL;DR: This work proposes a clustering algorithm based on optimizing an appropriately weighted objective, where larger weights are given to observations with lower uncertainty, and derives a provably optimal weighting scheme for any given uncertainty distribution.
Proceedings ArticleDOI
Reweighted L1-minimization for sparse solutions to underdetermined linear systems
Zhengguang Xie, Jianping Hu +1 more
TL;DR: Numerical experimental results demonstrate that the new method outperforms classical ℓ1 minimization and other iteratively reweighted ℓ1 algorithms in success probability.
References
Journal ArticleDOI
Decoding by linear programming
Emmanuel J. Candès, Terence Tao +1 more
TL;DR: It is shown that f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program); numerical experiments suggest that this recovery procedure works unreasonably well, and f is recovered exactly even in situations where a significant fraction of the output is corrupted.
Journal ArticleDOI
The Convex Geometry of Linear Inverse Problems
TL;DR: This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems.
Journal ArticleDOI
On sparse reconstruction from Fourier and Gaussian measurements
Mark Rudelson,Roman Vershynin +1 more
TL;DR: This paper improves upon the best-known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements, by showing that there exists a set of frequencies Ω such that one can exactly reconstruct every r-sparse signal f of length n from its frequencies in Ω, using the convex relaxation.
Book
Sparse nonnegative solution of underdetermined linear equations by linear programming
David L. Donoho, Jared Tanner +1 more
TL;DR: It is shown that outward k-neighborliness is equivalent to the statement that, whenever y = Ax has a nonnegative solution with at most k nonzeros, that solution is the nonnegative solution to y = Ax having minimal sum.
Related Papers (5)
Fast and Accurate Algorithms for Re-Weighted $\ell_1$-Norm Minimization
M. Salman Asif, Justin Romberg +1 more