Open Access · Journal Article (DOI)

Quantitative convergence analysis of iterated expansive, set-valued mappings

TL;DR
In this paper, the authors develop a framework for quantitative convergence analysis of Picard iterations of expansive set-valued fixed point mappings, and prove local linear convergence of nonconvex cyclic projections for inconsistent (and consistent) feasibility problems.
Abstract
We develop a framework for quantitative convergence analysis of Picard iterations of expansive set-valued fixed point mappings. There are two key components of the analysis. The first is a natural generalization of single-valued averaged mappings to expansive set-valued mappings that characterizes a type of strong calmness of the fixed point mapping. The second component to this analysis is an extension of the well-established notion of metric subregularity - or inverse calmness - of the mapping at fixed points. Convergence of expansive fixed point iterations is proved using these two properties, and quantitative estimates are a natural by-product of the framework. To demonstrate the application of the theory, we prove, for the first time, a number of results showing local linear convergence of nonconvex cyclic projections for inconsistent (and consistent) feasibility problems, local linear convergence of the forward-backward algorithm for structured optimization without convexity, strong or otherwise, and local linear convergence of the Douglas-Rachford algorithm for structured nonconvex minimization. This theory includes earlier approaches for known results, convex and nonconvex, as special cases.
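The central object of the abstract, the Picard iteration of a fixed point mapping built from projections, can be illustrated on a toy consistent feasibility problem. The sketch below is a hypothetical example, not taken from the paper: the two sets, the starting point, and the `project_onto_line` helper are illustrative assumptions. It alternately projects onto two lines through the origin in R^2 and records the error per full cycle; the ratio of successive errors settles at a constant below 1, which is exactly the local linear convergence the paper quantifies.

```python
import math

def project_onto_line(x, d):
    # Orthogonal projection of the point x onto the line through 0 spanned by d.
    n = math.hypot(*d)
    u = (d[0] / n, d[1] / n)
    t = x[0] * u[0] + x[1] * u[1]
    return (t * u[0], t * u[1])

A = (1.0, 0.0)   # set A: the x-axis
B = (1.0, 1.0)   # set B: the line y = x; the intersection A ∩ B is {0}
x = (3.0, 4.0)   # arbitrary starting point

errors = []
for _ in range(25):
    # one Picard step of the cyclic-projections mapping T = P_A ∘ P_B
    x = project_onto_line(project_onto_line(x, B), A)
    errors.append(math.hypot(*x))     # distance to the intersection {0}

# Linear convergence: successive error ratios settle at a constant < 1.
ratios = [errors[k + 1] / errors[k] for k in range(5, 10)]
```

For two lines meeting at angle θ the contraction factor of one full cycle is cos²θ, here cos²(45°) = 0.5; rates of this kind, tied to the geometry of the sets, are the quantitative estimates that the paper's calmness and metric subregularity conditions produce, and its contribution is extending them beyond this easy convex, consistent case.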



Citations
Book Chapter (DOI)

Set-Valued Analysis

TL;DR: “Multivalued Analysis” is the theory of set-valued maps (also called multifunctions) and has important applications in many different areas; there is no doubt that a modern treatise on “Nonlinear Functional Analysis” cannot afford the luxury of ignoring multivalued analysis.
Journal Article (DOI)

Relaxed averaged alternating reflections for diffraction imaging

TL;DR: A relaxation of averaged alternating reflections is proposed, the fixed-point set of the related operator is determined in the convex case, and the effectiveness of the algorithm compared to the current state of the art is demonstrated.
Journal Article (DOI)

Golden ratio algorithms for variational inequalities

TL;DR: In this article, a fully adaptive algorithm for monotone variational inequalities is presented, which uses two previous iterates for an approximation of the local Lipschitz constant without running a linesearch.
Posted Content

Golden Ratio Algorithms for Variational Inequalities

TL;DR: A fully explicit algorithm for monotone variational inequalities that uses variable stepsizes, computed from two previous iterates as an approximation of the local Lipschitz constant, without running a linesearch.
Journal Article (DOI)

Necessary conditions for linear convergence of iterated expansive, set-valued mappings

TL;DR: In this article, the authors present necessary conditions for monotonicity of fixed point iterations of mappings that may violate the usual nonexpansive property, and specialize these results to the alternating projections iteration where the metric subregularity property takes on a distinct geometric characterization of sets at points of intersection called subtransversality.
References
Journal Article (DOI)

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

TL;DR: A new fast iterative shrinkage-thresholding algorithm (FISTA) is presented which preserves the computational simplicity of ISTA but has a global rate of convergence that is proven to be significantly better, both theoretically and practically.
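As a rough illustration of the FISTA pattern, here is a minimal sketch on a toy problem of my own construction (the diagonal matrix `a`, the data `b`, and the weight `lam` are illustrative assumptions, not from the paper): a proximal gradient step taken at an extrapolated point, followed by the momentum update of the stepsize sequence t_k.

```python
import math

def soft(v, t):
    # Soft-thresholding: the proximal map of t * |.|
    return math.copysign(max(abs(v) - t, 0.0), v)

# Toy problem: minimize 0.5*||A x - b||^2 + lam*||x||_1 with A diagonal,
# so the minimizer is soft(a_i * b_i, lam) / a_i**2 coordinate-wise.
a = (2.0, 1.0)                 # diagonal of A
b = (2.0, 1.0)
lam = 0.5
L = max(ai * ai for ai in a)   # Lipschitz constant of the smooth gradient

x = [0.0, 0.0]                 # current iterate x_k
y = x[:]                       # extrapolated point y_k
t = 1.0                        # momentum parameter t_k
for _ in range(500):
    grad = [a[i] * (a[i] * y[i] - b[i]) for i in range(2)]          # grad f(y_k)
    x_new = [soft(y[i] - grad[i] / L, lam / L) for i in range(2)]   # prox step
    t_new = (1.0 + math.sqrt(1.0 + 4.0 * t * t)) / 2.0              # t_{k+1}
    y = [x_new[i] + (t - 1.0) / t_new * (x_new[i] - x[i]) for i in range(2)]
    x, t = x_new, t_new

# Closed-form minimizer here: (soft(4, 0.5)/4, soft(1, 0.5)/1) = (0.875, 0.5)
```

Dropping the two momentum lines (always taking the prox-gradient step at `x` itself) recovers plain ISTA, whose O(1/k) rate FISTA improves to O(1/k²) at essentially no extra cost per iteration.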
Book

Convex Analysis and Monotone Operator Theory in Hilbert Spaces

TL;DR: This book provides a largely self-contained account of the main results of convex analysis and optimization in Hilbert space, and a concise exposition of related constructive fixed point theory that allows for a wide range of algorithms to construct solutions to problems in optimization, equilibrium theory, monotone inclusions, variational inequalities, and convex feasibility.
Posted Content

An iterative thresholding algorithm for linear inverse problems with a sparsity constraint

Abstract: We consider linear inverse problems where the solution is assumed to have a sparse expansion on an arbitrary pre-assigned orthonormal basis. We prove that replacing the usual quadratic regularizing penalties by weighted l^p-penalties on the coefficients of such expansions, with 1 ≤ p ≤ 2, still regularizes the problem. If p < 2, regularized solutions of such l^p-penalized problems will have sparser expansions, with respect to the basis under consideration. To compute the corresponding regularized solutions we propose an iterative algorithm that amounts to a Landweber iteration with thresholding (or nonlinear shrinkage) applied at each iteration step. We prove that this algorithm converges in norm. We also review some potential applications of this method.
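The iteration described in this abstract, a Landweber step followed by soft shrinkage (the p = 1 case), can be sketched as follows. The 2×2 operator `K`, the data `b`, and the weight `lam` are illustrative assumptions, chosen so that ||K|| < 1 (as the unit-stepsize Landweber iteration requires) and so that the minimizer, (0.96, 0), can be verified by hand from the optimality conditions.

```python
import math

def shrink(v, t):
    # Soft shrinkage with threshold t (the nonlinear shrinkage for p = 1).
    return math.copysign(max(abs(v) - t, 0.0), v)

def apply(M, v):
    # Apply a 2x2 matrix (list of rows) to a 2-vector.
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

K = [[0.5, 0.25],
     [0.0, 0.5]]            # scaled so that ||K|| < 1
b = apply(K, [1.0, 0.0])    # data generated from the sparse point (1, 0)
KT = [[K[0][0], K[1][0]],
      [K[0][1], K[1][1]]]   # transpose of K
lam = 0.01                  # l^1 penalty weight

x = [0.0, 0.0]
for _ in range(2000):
    Kx = apply(K, x)
    g = apply(KT, [b[i] - Kx[i] for i in range(2)])   # Landweber direction K^T(b - Kx)
    x = [shrink(x[i] + g[i], lam) for i in range(2)]  # thresholded Landweber step

# Minimizer of 0.5*||K x - b||^2 + lam*||x||_1 in this toy setup: (0.96, 0) --
# a small shrinkage bias on the first coordinate, an exact zero on the second.
```

The exact zero in the second coordinate, rather than a merely small value, is the sparsity effect of the l^1 penalty that the abstract emphasizes for p < 2.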
Journal Article (DOI)

Mean value methods in iteration

TL;DR: In this article, it is shown that the Schauder fixpoint theorem can play a somewhat analogous role in the theory of divergent iteration processes, and that the same methods can be used to prove that a given problem has a solution.