Open Access
Signal Recovery from Random Measurements Via Orthogonal Matching Pursuit: The Gaussian Case
Joel A. Tropp, Anna C. Gilbert
TL;DR
In this paper, a greedy algorithm called Orthogonal Matching Pursuit (OMP) was proposed to recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
Abstract:
This report demonstrates theoretically and empirically that a greedy algorithm called
Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension
d given O(m ln d) random linear measurements of that signal. This is a massive improvement
over previous results, which require O(m²) measurements. The new results for OMP are comparable
with recent results for another approach called Basis Pursuit (BP). In some settings, the
OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal
recovery problems.
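The abstract's greedy recovery procedure can be sketched briefly. This is a minimal illustrative implementation, not the authors' code: at each of m iterations, OMP selects the measurement-matrix column most correlated with the current residual, then re-fits all selected columns by least squares. The variable names (Phi, y) and the Gaussian measurement setup below are assumptions for the example.

```python
import numpy as np

def omp(Phi, y, m):
    """Recover an m-sparse signal x from measurements y = Phi @ x (sketch)."""
    d = Phi.shape[1]
    residual = y.copy()
    support = []
    x_hat = np.zeros(d)
    for _ in range(m):
        # Greedy step: pick the column most correlated with the residual.
        correlations = np.abs(Phi.T @ residual)
        support.append(int(np.argmax(correlations)))
        # Orthogonal step: least-squares fit on all selected columns,
        # so the new residual is orthogonal to their span.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

# Example setup: a few times m*ln(d) Gaussian measurements of an m-sparse signal.
rng = np.random.default_rng(0)
d, m, n = 256, 4, 40
Phi = rng.standard_normal((n, d)) / np.sqrt(n)
x = np.zeros(d)
x[rng.choice(d, m, replace=False)] = rng.standard_normal(m)
x_hat = omp(Phi, Phi @ x, m)
```

With n on the order of m ln d, the paper's result says this recovery succeeds with high probability for Gaussian measurement matrices; the particular constants above are chosen only to make the example small.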
Citations
Posted Content
Greedy Signal Recovery Review
TL;DR: In this article, the authors present a stable greedy algorithm, Compressive Sampling Matching Pursuit (CoSaMP), which provides uniform guarantees and improves upon the stability bounds and restricted-isometry-constant (RIC) requirements of ROMP.
Journal Article
Unsupervised Sparse Pattern Diagnostic of Defects With Inductive Thermography Imaging System
TL;DR: The proposed unsupervised method for diagnosing and monitoring defects in an inductive thermography imaging system is fully automated and does not require the user to manually select specific thermal frame images for defect diagnosis.
Journal Article
Convolutional Sparse Coding for High Dynamic Range Imaging
TL;DR: In this article, a convolutional sparse coding (CSC) based method is proposed to recover high-quality HDRI images from a single coded exposure, which achieves higher quality reconstructions than alternative methods.
Journal Article
Deterministic Sensing Matrices Arising From Near Orthogonal Systems
Shuxing Li, Gennian Ge
TL;DR: This paper introduces the concept of near orthogonal systems to characterize matrices with low coherence, which lie at the heart of many applications, and reports extensive numerical experiments showing that some of the constructions are the best possible coherence-based deterministic ones.
Dissertation
Sparse convex optimization methods for machine learning
TL;DR: A convergence proof guaranteeing ε-small error after O(1/ε) iterations is given, along with bounds on the sparsity of approximate solutions for any ℓ1-regularized convex optimization problem (and for optimization over the simplex), expressed as a function of the approximation quality.
References
Book
Compressed sensing
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Journal Article
Atomic Decomposition by Basis Pursuit
TL;DR: Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
Journal Article
Matching pursuits with time-frequency dictionaries
Stéphane Mallat, Zhifeng Zhang
TL;DR: The authors introduce an algorithm, called matching pursuit, that decomposes any signal into a linear expansion of waveforms that are selected from a redundant dictionary of functions, chosen in order to best match the signal structures.
Journal Article
Least angle regression
Bradley Efron, Trevor Hastie, Iain M. Johnstone, Robert Tibshirani, Hemant Ishwaran, Keith Knight, Jean-Michel Loubes, Pascal Massart, David Madigan, Greg Ridgeway, Saharon Rosset, Ji Zhu, Robert A. Stine, Berwin A. Turlach, Sanford Weisberg, et al.
TL;DR: A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.