Open Access Journal ArticleDOI

Convergence rates and source conditions for Tikhonov regularization with sparsity constraints

Dirk A. Lorenz
01 Jan 2008 - Vol. 16, Iss: 5, pp 463-478
TLDR
In this article, regularization by sparsity constraints by means of weighted $\ell^p$ penalties for $0\leq p\leq 2$ is studied, and it is shown that for $p<1$ the regularizing properties depend on the interplay of the operator and the basis of sparsity.
Abstract
This paper addresses regularization by sparsity constraints by means of weighted $\ell^p$ penalties for $0\leq p\leq 2$. For $1\leq p\leq 2$ special attention is paid to convergence rates in norm and to source conditions. As the main result it is proven that one gets a convergence rate in norm of $\sqrt{\delta}$ for $1\leq p\leq 2$ as soon as the unknown solution is sparse. The case $p=1$ needs a special technique where not only Bregman distances but also a so-called Bregman-Taylor distance has to be employed. For $p<1$ only preliminary results are shown. These results indicate that, in contrast to the case $p\geq 1$, the regularizing properties depend on the interplay of the operator and the basis of sparsity. A counterexample for $p=0$ shows that regularization need not happen.
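To fix ideas, here is a sketch of the setting the abstract describes (the symbols $K$, $y^\delta$, $w_k$, $\psi_k$ are illustrative assumptions, not notation taken verbatim from the paper): for a linear operator $K$, noisy data $y^\delta$ with $\|y-y^\delta\|\leq\delta$, and a basis $(\psi_k)$, the regularized solution minimizes a Tikhonov functional with a weighted $\ell^p$ penalty,

```latex
x_\alpha^\delta \in \operatorname*{argmin}_x \;
  \|Kx - y^\delta\|^2 + \alpha \sum_k w_k \,|\langle x, \psi_k\rangle|^p,
  \qquad 0\leq p\leq 2,\quad w_k \geq w_0 > 0,
```

and the main result asserts $\|x_\alpha^\delta - x^\dagger\| = \mathcal{O}(\sqrt{\delta})$ for $1\leq p\leq 2$ under a suitable a priori choice of $\alpha$, provided the exact solution $x^\dagger$ is sparse, i.e. has only finitely many nonzero coefficients with respect to $(\psi_k)$.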


Citations
Journal ArticleDOI

Linear Convergence of Iterative Soft-Thresholding

TL;DR: A unified approach to iterative soft-thresholding algorithms for the solution of linear operator equations in infinite-dimensional Hilbert spaces is presented, together with a new convergence analysis showing that the convergence constants can be calculated explicitly in special cases.
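As an illustration of the algorithm this citation analyzes, here is a minimal finite-dimensional sketch in Python (the matrix K, data y, and step-size choice are assumptions for illustration; the cited work treats operators on infinite-dimensional Hilbert spaces and proves linear convergence only under additional conditions):

```python
import numpy as np

def soft_threshold(u, t):
    """Componentwise soft-shrinkage: sign(u) * max(|u| - t, 0)."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def ista(K, y, alpha, n_iter=500):
    """Iterative soft-thresholding for min_x ||K x - y||^2 + alpha * ||x||_1."""
    # The gradient of ||K x - y||^2 is 2 K^T (K x - y) with Lipschitz
    # constant 2 ||K||^2, so dividing K^T (K x - y) by L = ||K||^2 amounts
    # to a gradient step of size 1/(2L), which guarantees convergence.
    L = np.linalg.norm(K, 2) ** 2
    x = np.zeros(K.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - K.T @ (K @ x - y) / L, alpha / (2.0 * L))
    return x
```

For example, ista(K, y, alpha=0.1) recovers a sparse coefficient vector when K is a random matrix and y is generated from a sparse x plus small noise.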
Journal ArticleDOI

Inverse problems in spaces of measures

TL;DR: In this article, the authors considered the ill-posed problem of solving linear equations in the space of vector-valued finite Radon measures with Hilbert space data and obtained approximate solutions by minimizing the Tikhonov functional with a total variation penalty.
Journal ArticleDOI

Sparse Regularization with ℓq Penalty Term

TL;DR: In this paper, the authors consider the stable approximation of sparse solutions to nonlinear operator equations by means of Tikhonov regularization with a subquadratic penalty term and derive the usual convergence rate of the regularized solutions in dependence of the noise level.
Journal ArticleDOI

NETT: solving inverse problems with deep neural networks

TL;DR: In this paper, the authors establish a complete convergence analysis for the proposed network Tikhonov (NETT) approach to inverse problems, which considers data-consistent solutions having a small value of a regularizer defined by a trained neural network.
References
Journal ArticleDOI

An Iterative Thresholding Algorithm for Linear Inverse Problems with a Sparsity Constraint

TL;DR: It is proved that replacing the usual quadratic regularizing penalties by weighted $\ell^p$-penalties on the coefficients of such expansions, with $1\leq p\leq 2$, still regularizes the problem.
Journal ArticleDOI

Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization

TL;DR: This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Journal ArticleDOI

A convergence rates result for Tikhonov regularization in Banach spaces with non-smooth operators

TL;DR: In this article, the authors observe that violations of the smoothness assumptions on the operator do not necessarily affect the convergence rate negatively; building on this observation, they weaken the smoothness assumptions and prove a novel convergence rate result.
Journal ArticleDOI

Recovery Algorithms for Vector-Valued Data with Joint Sparsity Constraints

TL;DR: In this article, a thresholded Landweber algorithm is proposed to compute solutions of linear inverse problems with joint sparsity regularization constraints; the iteration is interpreted as a double-minimization scheme for a suitable target functional.
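For orientation, one common form of joint-sparsity penalty (the cited paper studies several variants; the weights and the $\ell^2$ coupling below are assumptions for illustration) couples the channels of vector-valued coefficients $x_k = (x_k^{(1)},\dots,x_k^{(m)})$ via

```latex
\Phi(x) = \sum_k w_k \,\bigl\| \bigl(x_k^{(1)},\dots,x_k^{(m)}\bigr) \bigr\|_2,
```

which promotes solutions whose channels share a common sparsity pattern.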
Journal ArticleDOI

Regularization of ill-posed problems in Banach spaces: convergence rates

Elena Resmerita - 01 Aug 2005
Abstract: This paper deals with quantitative aspects of regularization for ill-posed linear equations in Banach spaces, when the regularization is done using a general convex penalty functional. The error estimates shown here by means of Bregman distances yield better convergence rates than those already known for maximum entropy regularization, as well as for total variation regularization.
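Both the abstract above and this reference state error estimates in Bregman distances; for a convex penalty $R$ and a subgradient $\xi \in \partial R(x^\dagger)$, the Bregman distance is

```latex
D_\xi(x, x^\dagger) = R(x) - R(x^\dagger) - \langle \xi,\, x - x^\dagger \rangle,
```

which reduces to $\tfrac12\|x - x^\dagger\|^2$ when $R = \tfrac12\|\cdot\|^2$. The abstract above indicates that for $p=1$ a refined "Bregman-Taylor" distance is needed to pass from such estimates to convergence rates in norm.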