Open Access Journal Article

Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors

TLDR
To efficiently find the single sparse vector produced by the last reduction step, this paper suggests an empirical boosting strategy that improves the recovery ability of any given suboptimal method for recovering a sparse vector.
Abstract
The rapidly developing area of compressed sensing suggests that a sparse vector lying in a high dimensional space can be accurately and efficiently recovered from only a small set of nonadaptive linear measurements, under appropriate conditions on the measurement matrix. The vector model has been extended both theoretically and practically to a finite set of sparse vectors sharing a common sparsity pattern. In this paper, we treat a broader framework in which the goal is to recover a possibly infinite set of jointly sparse vectors. Extending existing algorithms to this model is difficult due to the infinite structure of the sparse vector set. Instead, we prove that the entire infinite set of sparse vectors can be recovered by solving a single, reduced-size finite-dimensional problem, corresponding to recovery of a finite set of sparse vectors. We then show that the problem can be further reduced to the basic model of a single sparse vector by randomly combining the measurements. Our approach is exact for both countable and uncountable sets, as it does not rely on discretization or heuristic techniques. To efficiently find the single sparse vector produced by the last reduction step, we suggest an empirical boosting strategy that improves the recovery ability of any given suboptimal method for recovering a sparse vector. Numerical experiments on random data demonstrate that, when applied to infinite sets, our strategy outperforms discretization techniques in terms of both run time and empirical recovery rate. In the finite model, our boosting algorithm has a fast run time and a much higher recovery rate than known popular methods.
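As a rough illustration of the reduction described above (a sketch, not the authors' code), the following collapses a multiple-measurement problem Y = A X to a single measurement vector by randomly combining the columns of Y, recovers one sparse vector with any given suboptimal solver, and accepts the recovered support only if it explains all of the original measurements. Here `single_vector_solver` is a placeholder for such a solver, and the acceptance test is an assumed tolerance check.

```python
import numpy as np

def reduce_and_boost(A, Y, k, single_vector_solver, max_draws=20, tol=1e-8):
    """Sketch of the reduce-and-boost idea: repeatedly merge the measurement
    columns with a random vector, recover one k-sparse vector, and keep the
    support if it explains the full measurement matrix."""
    m, n = A.shape
    for _ in range(max_draws):
        a = np.random.randn(Y.shape[1])           # random combination weights
        y = Y @ a                                  # reduced single measurement
        x = single_vector_solver(A, y, k)          # any suboptimal sparse solver
        support = np.argsort(np.abs(x))[-k:]       # indices of the k largest entries
        # accept the support only if it explains all original measurements
        X_hat, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        if np.linalg.norm(A[:, support] @ X_hat - Y) < tol * np.linalg.norm(Y):
            X = np.zeros((n, Y.shape[1]))
            X[support] = X_hat
            return support, X
    return None, None                              # no support accepted
```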


Citations
Journal Article

Block-Sparse Signals: Uncertainty Relations and Efficient Recovery

TL;DR: Making explicit use of block sparsity provably yields better reconstruction properties than treating the signal as sparse in the conventional sense, which ignores the additional structure in the problem.
Journal Article

From Theory to Practice: Sub-Nyquist Sampling of Sparse Wideband Analog Signals

TL;DR: This paper considers the challenging problem of blind sub-Nyquist sampling of multiband signals, whose unknown frequency support occupies only a small portion of a wide spectrum, and proposes a system, named the modulated wideband converter, which first multiplies the analog signal by a bank of periodic waveforms, then lowpass filters the products and samples them uniformly at a low rate.
Journal Article

Structured Compressed Sensing: From Theory to Applications

TL;DR: The prime focus is bridging theory and practice, that is, pinpointing the potential of structured CS strategies to make their way from the mathematics to the hardware of compressive sensing.
Journal Article

Robust Recovery of Signals From a Structured Union of Subspaces

TL;DR: This paper develops a general framework for robust and efficient recovery of nonlinear but structured signal models in which the signal x lies in a union of subspaces, and presents an equivalence condition under which the proposed convex algorithm is guaranteed to recover the original signal.
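The convex program referenced in that line of work is a mixed l2/l1 norm minimization over known blocks. Below is a minimal cvxpy sketch under the assumption of equal-length, known blocks; it is an illustration of the general technique, not the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp

def block_sparse_recover(A, y, block_len):
    """Sketch of mixed l2/l1 recovery: minimize the sum of block l2 norms
    subject to consistency with the measurements A x = y."""
    n = A.shape[1]
    assert n % block_len == 0, "columns of A must split into equal-length blocks"
    x = cp.Variable(n)
    block_norms = [cp.norm(x[i:i + block_len], 2) for i in range(0, n, block_len)]
    prob = cp.Problem(cp.Minimize(sum(block_norms)), [A @ x == y])
    prob.solve()
    return x.value
```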
Posted Content

Robust Recovery of Signals From a Structured Union of Subspaces

TL;DR: In this article, a general framework for robust and efficient recovery of signals lying in a structured union of subspaces from a given set of samples is developed.
References
Book

Compressed sensing

TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Journal Article

Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information

TL;DR: In this paper, the authors considered the model problem of reconstructing an object from incomplete frequency samples and showed that with probability at least 1 - O(N^(-M)), f can be reconstructed exactly as the solution to the ℓ1 minimization problem.
Journal Article

Atomic Decomposition by Basis Pursuit

TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
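The ℓ1 objective described above can be posed as a linear program by splitting the coefficients into nonnegative parts. The sketch below uses SciPy's linprog as an illustration; it is not the paper's original solver.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1 subject to A x = y, posed as an LP with x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                  # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])           # A u - A v = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```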
Journal Article

Matching pursuits with time-frequency dictionaries

TL;DR: The authors introduce an algorithm, called matching pursuit, that decomposes any signal into a linear expansion of waveforms that are selected from a redundant dictionary of functions, chosen in order to best match the signal structures.
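A compact sketch of the greedy selection step described above, assuming a generic dictionary matrix D with unit-norm columns (illustrative only):

```python
import numpy as np

def matching_pursuit(D, signal, n_iter=50, tol=1e-6):
    """Greedy matching pursuit: at each step pick the dictionary atom most
    correlated with the residual and subtract its contribution."""
    residual = signal.astype(float)
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        correlations = D.T @ residual          # inner products with all atoms
        k = np.argmax(np.abs(correlations))    # best-matching atom
        coeffs[k] += correlations[k]
        residual = residual - correlations[k] * D[:, k]
        if np.linalg.norm(residual) < tol:
            break
    return coeffs, residual
```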
Book

Probability and Measure

TL;DR: This book develops measure-theoretic probability, covering random variables and expected values, conditional probability, and the convergence of distributions.