Open Access
Signal Recovery from Random Measurements Via Orthogonal Matching Pursuit: The Gaussian Case
Joel A. Tropp, Anna C. Gilbert +1 more
TLDR
In this paper, a greedy algorithm called Orthogonal Matching Pursuit (OMP) is proposed to recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
Abstract:
This report demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m²) measurements. The new results for OMP are comparable with recent results for another approach called Basis Pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.
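The greedy procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: at each of m iterations, OMP selects the measurement-matrix column most correlated with the current residual, then orthogonally projects the data onto the span of all selected columns via least squares.

```python
import numpy as np

def omp(Phi, y, m):
    """Recover an m-sparse signal x from measurements y = Phi @ x.

    Phi is the N x d random measurement matrix; m is the sparsity level.
    """
    residual = y.copy()
    support = []
    for _ in range(m):
        # Greedy step: pick the column most correlated with the residual.
        correlations = np.abs(Phi.T @ residual)
        correlations[support] = 0.0  # never re-select a chosen column
        support.append(int(np.argmax(correlations)))
        # Orthogonal step: least-squares fit over the selected columns.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coeffs
    return x_hat
```

With a Gaussian Phi and N on the order of m ln d rows (e.g. N = 32 for m = 3, d = 64), this sketch recovers the true support with high probability, matching the scaling claimed in the abstract.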
Citations
Journal Article
An Introduction To Compressive Sampling
TL;DR: The theory of compressive sampling, also known as compressed sensing or CS, is surveyed, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition.
Journal Article
Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
Joel A. Tropp, Anna C. Gilbert +1 more
TL;DR: It is demonstrated theoretically and empirically that a greedy algorithm called orthogonal matching pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
Journal Article
CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
Deanna Needell, Joel A. Tropp +1 more
TL;DR: A new iterative recovery algorithm called CoSaMP is described that delivers the same guarantees as the best optimization-based approaches and offers rigorous bounds on computational cost and storage.
Journal Article
Spatially Sparse Precoding in Millimeter Wave MIMO Systems
TL;DR: This paper considers transmit precoding and receiver combining in mmWave systems with large antenna arrays and develops algorithms that accurately approximate optimal unconstrained precoders and combiners such that they can be implemented in low-cost RF hardware.
Journal Article
CoSaMP: iterative signal recovery from incomplete and inaccurate samples
Deanna Needell, Joel A. Tropp +1 more
TL;DR: This extended abstract describes a recent algorithm, called CoSaMP, that accomplishes the data recovery task and was the first known method to offer near-optimal guarantees on resource usage.
References
Journal Article
Atomic Decomposition by Basis Pursuit
TL;DR: This work gives examples exhibiting several advantages over MOF, MP, and BOB, including better sparsity and superresolution, and obtains reasonable success with a primal-dual logarithmic barrier method and conjugate-gradient solver.
Journal Article
An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint
TL;DR: It is proved that replacing the usual quadratic regularizing penalties by weighted ℓ^p penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem.
Journal Article
Greed is good: algorithmic results for sparse approximation
TL;DR: This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
Book
Interior-Point Polynomial Algorithms in Convex Programming
TL;DR: This book describes the first unified theory of polynomial-time interior-point methods; several of the new algorithms described, e.g. the projective method, have been implemented, tested on real-world problems, and found to be extremely efficient in practice.
Posted Content
An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
Abstract: We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary pre-assigned orthonormal basis. We prove that replacing the usual quadratic regularizing penalties by weighted ℓ^p penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem. If p < 2, regularized solutions of such ℓ^p-penalized problems will have sparser expansions with respect to the basis under consideration. To compute the corresponding regularized solutions we propose an iterative algorithm that amounts to a Landweber iteration with thresholding (or nonlinear shrinkage) applied at each iteration step. We prove that this algorithm converges in norm. We also review some potential applications of this method.
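The Landweber-iteration-with-thresholding described in this abstract can be sketched for the p = 1 case, where the shrinkage step is soft thresholding. This is an illustrative sketch of the general scheme, not the paper's own code: each iteration takes a gradient (Landweber) step on the quadratic data term of (1/2)||Ax − y||² + λ||x||₁, then applies soft thresholding to the coefficients.

```python
import numpy as np

def ista(A, y, lam, step, n_iter=200):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    step must satisfy step < 1 / ||A||^2 for convergence in norm.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Landweber step: gradient descent on the quadratic data term.
        x = x + step * A.T @ (y - A @ x)
        # Shrinkage step: soft thresholding, the l^1 proximal map.
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)
    return x
```

When A is an orthonormal basis, as the abstract assumes, the fixed point is simply the soft-thresholded coefficient vector of y, which makes the sparsifying effect of the ℓ¹ penalty explicit.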