Open Access Journal Article (DOI)

Linear convergence of iterative soft-thresholding

TLDR
A unified approach to iterative soft-thresholding algorithms for the solution of linear operator equations in infinite dimensional Hilbert spaces is presented and it is shown that the constants can be calculated explicitly in special cases.
Abstract
In this article a unified approach to iterative soft-thresholding algorithms for the solution of linear operator equations in infinite-dimensional Hilbert spaces is presented. We formulate the algorithm in the framework of generalized gradient methods and present a new convergence analysis. As the main result, we show that the algorithm converges with linear rate as soon as the underlying operator satisfies the so-called finite basis injectivity property or the minimizer possesses a so-called strict sparsity pattern. Moreover, it is shown that the constants can be calculated explicitly in special cases (i.e. for compact operators). Furthermore, the techniques can also be used to establish linear convergence for related methods such as the iterative thresholding algorithm for joint sparsity and the accelerated gradient projection method.
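As a concrete illustration (a minimal sketch, not code from the paper), the finite-dimensional version of iterative soft-thresholding for the ℓ1-regularized least-squares problem min_x ½‖Ax − b‖² + λ‖x‖₁ alternates a gradient step on the quadratic part with componentwise soft-thresholding; the matrix A, data b, weight lam, and step size below are placeholder assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Componentwise soft-thresholding: the proximal map of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    """Iterative soft-thresholding for min_x 0.5*||A x - b||^2 + lam*||x||_1,
    written as a generalized gradient method: a gradient step on the smooth
    part followed by the thresholding (proximal) step."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # step <= 1/||A||^2 for convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                # gradient of the quadratic part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Hypothetical toy problem: a sparse vector measured by a random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
x_hat = ista(A, A @ x_true, lam=0.1)
```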


Citations
Book

Proximal Algorithms

TL;DR: The many different interpretations of proximal operators and algorithms are discussed, their connections to many other topics in optimization and applied mathematics are described, some popular algorithms are surveyed, and a large number of examples of proximal operators that commonly arise in practice are provided.
Journal Article (DOI)

Message-passing algorithms for compressed sensing

TL;DR: Inspired by belief propagation in graphical models, a simple, costless modification to iterative thresholding is introduced that makes the sparsity–undersampling tradeoff of the new algorithms equivalent to that of the corresponding convex optimization procedures.
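To make the quoted modification concrete: in approximate message passing the residual update carries an extra Onsager correction term, and omitting that term recovers plain iterative soft-thresholding. The sketch below is a rough illustration under the usual assumptions (A with i.i.d. Gaussian-like, suitably normalized entries); the fixed threshold theta is a hypothetical stand-in for the adaptive threshold schedules used in practice.

```python
import numpy as np

def amp(A, y, theta=1.0, n_iter=30):
    """Approximate message passing for y = A x with sparse x.

    Identical to iterative soft-thresholding except for the Onsager
    correction added to the residual z -- the "costless modification".
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        pseudo = x + A.T @ z                          # matched-filter estimate
        x_new = np.sign(pseudo) * np.maximum(np.abs(pseudo) - theta, 0.0)
        # Onsager term: (1/n) * z * (number of active coordinates),
        # i.e. z times the empirical average of the denoiser's derivative.
        onsager = (z / n) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager                   # corrected residual
        x = x_new
    return x
```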
Posted Content

Proximal Splitting Methods in Signal Processing

Abstract: The proximity operator of a convex function is a natural extension of the notion of a projection operator onto a convex set. This tool, which plays a central role in the analysis and the numerical solution of convex optimization problems, has recently been introduced in the arena of signal processing, where it has become increasingly important. In this paper, we review the basic properties of proximity operators which are relevant to signal processing and present optimization methods based on these operators. These proximal splitting methods are shown to capture and extend several well-known algorithms in a unifying framework. Applications of proximal methods in signal recovery and synthesis are discussed.
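A small numerical illustration of the first sentence (hypothetical examples, not from the paper): the proximity operator of the indicator function of a convex set is exactly the projection onto that set, while other convex functions yield shrinkage-type maps.

```python
import numpy as np

def prox_box_indicator(v, lo, hi):
    """prox of the indicator of the box [lo, hi]^n: this is just the
    projection onto the box, i.e. componentwise clipping."""
    return np.clip(v, lo, hi)

def prox_sq_l2(v, lam):
    """prox of f(x) = (lam/2)*||x||^2: minimizing
    (lam/2)*||x||^2 + 0.5*||x - v||^2 gives x = v / (1 + lam)."""
    return v / (1.0 + lam)

v = np.array([-2.0, 0.3, 5.0])
print(prox_box_indicator(v, 0.0, 1.0))   # [0.   0.3  1.  ] -- a projection
print(prox_sq_l2(v, 1.0))                # [-1.   0.15  2.5] -- a shrinkage
```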

Computational Methods for Sparse Solution of Linear Inverse Problems

In many engineering areas, such as signal processing, practical results can be obtained by identifying approaches that yield the greatest quality improvement or by selecting the most suitable computation methods.

TL;DR: In this paper, a survey of the major practical algorithms for sparse approximation is presented, focusing on computational issues, the circumstances in which individual methods tend to perform well, and the theoretical guarantees available.
Journal Article (DOI)

A non-adapted sparse approximation of PDEs with stochastic inputs

TL;DR: The method converges in probability as a consequence of sparsity and a concentration of measure phenomenon on the empirical correlation between samples, and it is shown that the method is well suited for truly high-dimensional problems.
References
Posted Content

Decoding by Linear Programming

TL;DR: In this paper, it was shown that under suitable conditions on the coding matrix, the input vector can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program).
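To illustrate the recast (a sketch with placeholder data, not the paper's construction): min ‖x‖₁ subject to Ax = b becomes a linear program by splitting x = u − v with u, v ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. A x = b as a linear program.

    With x = u - v and u, v >= 0, the objective sum(u) + sum(v)
    equals ||x||_1 at the optimum, and A x = b becomes A u - A v = b.
    """
    n, N = A.shape
    c = np.ones(2 * N)                     # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])              # A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    return res.x[:N] - res.x[N:]

# Hypothetical toy example: exact recovery of a sparse input vector.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [1.0, -2.0, 0.5]
x_rec = basis_pursuit(A, A @ x_true)
```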
Book

Regularization of Inverse Problems

TL;DR: This book develops the theory of regularization for ill-posed inverse problems, covering Tikhonov regularization of linear and nonlinear problems and iterative methods such as the conjugate gradient method for their numerical realization.
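For context, a minimal sketch (not taken from the book) of the simplest linear case: the Tikhonov-regularized solution of Ax ≈ b minimizes ‖Ax − b‖² + α‖x‖² and has the closed form given by the regularized normal equations; the matrix, data, and parameter alpha below are hypothetical.

```python
import numpy as np

def tikhonov(A, b, alpha):
    """Minimize ||A x - b||^2 + alpha * ||x||^2 by solving the
    regularized normal equations (A^T A + alpha * I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# Hypothetical ill-conditioned example: regularization stabilizes the solve.
rng = np.random.default_rng(2)
A = np.vander(np.linspace(0.0, 1.0, 20), 10)   # nearly rank-deficient columns
b = A @ np.ones(10) + 1e-3 * rng.standard_normal(20)
x_alpha = tikhonov(A, b, alpha=1e-6)
```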
Book

Convex analysis and variational problems

TL;DR: This book develops convex analysis and duality theory for variational problems, including minimax theorems and the relaxation of non-convex problems.
Journal Article (DOI)

An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint

TL;DR: It is proved that replacing the usual quadratic regularizing penalties by weighted ℓp penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem.