Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing With Prior Information
TLDR
Simulation results demonstrate that the Lorentzian-based IHT algorithm significantly outperforms commonly employed sparse reconstruction techniques in impulsive environments, while providing comparable performance in less demanding, light-tailed environments.

Abstract
Commonly employed reconstruction algorithms in compressed sensing (CS) use the L2 norm as the metric for the residual error. However, it is well known that least squares (LS) based estimators are highly sensitive to outliers in the measurement vector, leading to poor performance when the noise no longer follows the Gaussian assumption but is instead better characterized by heavier-than-Gaussian-tailed distributions. In this paper, we propose a robust iterative hard thresholding (IHT) algorithm for reconstructing sparse signals in the presence of impulsive noise. To address this problem, we use a Lorentzian cost function instead of the L2 cost function employed by the traditional IHT algorithm. We also modify the algorithm to incorporate prior signal information in the recovery process. Specifically, we study the case of CS with partially known support. The proposed algorithm is a fast method with computational load comparable to the LS-based IHT, whilst having the advantage of robustness against heavy-tailed impulsive noise. Sufficient conditions for stability are studied and a reconstruction error bound is derived. We also derive sufficient conditions for stable sparse signal recovery with partially known support. Theoretical analysis shows that including prior support information relaxes the conditions for successful reconstruction. Simulation results demonstrate that the Lorentzian-based IHT algorithm significantly outperforms commonly employed sparse reconstruction techniques in impulsive environments, while providing comparable performance in less demanding, light-tailed environments. Numerical results also demonstrate that the partially known support inclusion improves the performance of the proposed algorithm, thereby requiring fewer samples to yield an approximate reconstruction.
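The approach described in the abstract can be illustrated with a minimal sketch. This is our own illustration, not the authors' implementation: the parameter names (gamma for the Lorentzian scale, s for the sparsity level, known_support for the prior support set) and the step-size handling are assumptions. The key idea is that the gradient of the Lorentzian cost down-weights large residuals, so impulsive noise samples do not dominate the update as they would under an L2 cost.

```python
import numpy as np

def lorentzian_iht(Phi, y, s, gamma=1.0, mu=1.0, n_iter=100, known_support=None):
    """Hedged sketch of Lorentzian-based iterative hard thresholding.

    Hypothetical parameter names: gamma is the Lorentzian scale, s the
    sparsity level, and known_support an optional index set of the
    partially known support that is always retained after thresholding.
    """
    m, n = Phi.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        r = y - Phi @ x
        # Gradient direction of the Lorentzian cost sum(log(1 + (r_i/gamma)^2)):
        # large (impulsive) residuals are down-weighted instead of squared.
        g = r / (gamma ** 2 + r ** 2)
        x = x + mu * (Phi.T @ g)
        # Hard threshold: keep the s largest-magnitude entries, always
        # retaining the partially known support if one is provided.
        idx = np.argsort(np.abs(x))[::-1][:s]
        if known_support is not None:
            idx = np.union1d(idx, known_support)
        mask = np.zeros(n, dtype=bool)
        mask[idx] = True
        x[~mask] = 0.0
    return x
```

The only structural change from plain IHT is the weighting of the residual before the adjoint is applied; with g = r the update reduces to the standard LS-based IHT iteration.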
Citations
Journal Article
Joint Channel Estimation and Data Rate Maximization for Intelligent Reflecting Surface Assisted Terahertz MIMO Communication Systems
TL;DR: A novel feedforward fully connected structure based deep neural network (DNN) scheme is put forward, which has the ability to learn how to output the optimal phase shift configurations by inputting the features of estimated channel.
Journal Article
Robust Sparse Recovery in Impulsive Noise via ℓp-ℓ1 Optimization
TL;DR: A robust formulation for sparse recovery using the generalized ℓp-norm with 0 ≤ p < 2 as the metric for the residual error is proposed and compared with some state-of-the-art robust algorithms via numerical simulations to show its improved performance in highly impulsive noise.
Journal Article
Efficient and Robust Recovery of Sparse Signal and Image Using Generalized Nonconvex Regularization
TL;DR: In this article, the authors proposed a robust formulation for sparse reconstruction that employs the ℓ1-norm as the loss function for the residual error and utilizes a generalized nonconvex penalty to induce sparsity.
Posted Content
Efficient and Robust Recovery of Sparse Signal and Image Using Generalized Nonconvex Regularization
TL;DR: A robust formulation for sparse reconstruction that employs the ℓ1-norm as the loss function for the residual error and utilizes a generalized nonconvex penalty to induce sparsity, together with a first-order algorithm based on the alternating direction method of multipliers, is proposed.
Journal Article
Robust compressive sensing of sparse signals: a review
TL;DR: Robust nonlinear reconstruction strategies for sparse signals based on replacing the commonly used ℓ2 norm by M-estimators as data fidelity functions are overviewed, offering a robust framework for CS.
References
Journal Article
An Introduction To Compressive Sampling
TL;DR: The theory of compressive sampling, also known as compressed sensing or CS, is surveyed, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition.
Journal Article
Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
Joel A. Tropp, Anna C. Gilbert, et al.
TL;DR: It is demonstrated theoretically and empirically that a greedy algorithm called orthogonal matching pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
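The greedy scheme summarized in this TL;DR can be sketched in a few lines. This is our own minimal illustration of OMP, not code from the cited paper; it omits the stopping criteria a practical implementation would add.

```python
import numpy as np

def omp(Phi, y, s):
    """Minimal orthogonal matching pursuit sketch: greedily select the
    column most correlated with the residual, then refit by least squares."""
    m, n = Phi.shape
    support = []
    r = y.copy()
    for _ in range(s):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ r)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected columns, then update the residual.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x
```

The least-squares refit over the whole selected support (rather than a single coordinate update) is what distinguishes *orthogonal* matching pursuit from plain matching pursuit.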
Signal Recovery from Random Measurements Via Orthogonal Matching Pursuit: The Gaussian Case
Joel A. Tropp, Anna C. Gilbert, et al.
TL;DR: In this paper, a greedy algorithm called orthogonal matching pursuit (OMP) was proposed to recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal.
Journal ArticleDOI
An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint
TL;DR: It is proved that replacing the usual quadratic regularizing penalties by weighted ℓp-penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem.
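For the p = 1 case of the weighted ℓp-penalties mentioned in this TL;DR, the resulting iteration is soft thresholding. The sketch below is our own illustration under assumed names; it assumes the step size mu is smaller than 1/||Phi||² so the iteration is stable.

```python
import numpy as np

def ista(Phi, y, lam=0.1, mu=0.1, n_iter=200):
    """Iterative soft-thresholding sketch for the p = 1 penalty:
    minimize 0.5 * ||y - Phi x||_2^2 + lam * ||x||_1."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        # Gradient step on the quadratic data-fidelity term...
        z = x + mu * (Phi.T @ (y - Phi @ x))
        # ...followed by the soft-thresholding (shrinkage) operator.
        x = np.sign(z) * np.maximum(np.abs(z) - mu * lam, 0.0)
    return x
```

The shrinkage step is the proximal map of the ℓ1 penalty; for hard thresholding (as in IHT) it would be replaced by keeping the s largest entries.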
Journal Article
CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
Deanna Needell, Joel A. Tropp, et al.
TL;DR: A new iterative recovery algorithm called CoSaMP is described that delivers the same guarantees as the best optimization-based approaches and offers rigorous bounds on computational cost and storage.
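The CoSaMP iteration described here (identify, merge, estimate, prune) can be sketched as follows. This is our own minimal illustration, not the authors' reference code; a practical version would add a halting criterion on the residual.

```python
import numpy as np

def cosamp(Phi, y, s, n_iter=20):
    """CoSaMP sketch: form a residual proxy, merge the 2s largest proxy
    entries with the current support, solve least squares on the merged
    set, then prune back to the s largest coefficients."""
    m, n = Phi.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        # Proxy for the residual signal via the adjoint.
        proxy = Phi.T @ (y - Phi @ x)
        # Merge the 2s strongest proxy entries with the current support.
        omega = np.argsort(np.abs(proxy))[::-1][:2 * s]
        T = np.union1d(omega, np.flatnonzero(x))
        # Least-squares estimate restricted to the merged support.
        b, *_ = np.linalg.lstsq(Phi[:, T], y, rcond=None)
        full = np.zeros(n)
        full[T] = b
        # Prune: keep only the s largest coefficients.
        x = np.zeros(n)
        keep = np.argsort(np.abs(full))[::-1][:s]
        x[keep] = full[keep]
    return x
```

Selecting 2s candidates before pruning is what lets CoSaMP correct earlier support mistakes, which plain OMP cannot undo.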