Journal ArticleDOI

Nonconvex Notions of Regularity and Convergence of Fundamental Algorithms for Feasibility Problems

03 Dec 2013-Siam Journal on Optimization (Society for Industrial and Applied Mathematics)-Vol. 23, Iss: 4, pp 2397-2419
TL;DR: This article introduces, for consistent feasibility problems, a notion of local subfirm nonexpansiveness with respect to the intersection; combined with a coercivity condition that relates to the regularity of the collection of sets at points in the intersection, it yields local linear convergence of AP for a wide class of nonconvex problems.
Abstract: We consider projection algorithms for solving (nonconvex) feasibility problems in Euclidean spaces. Of special interest are the method of alternating projections (AP) and the Douglas–Rachford algorithm (DR). In the case of convex feasibility, firm nonexpansiveness of projection mappings is a global property that yields global convergence of AP and, for consistent problems, of DR. A notion of local subfirm nonexpansiveness with respect to the intersection is introduced for consistent feasibility problems. This, together with a coercivity condition that relates to the regularity of the collection of sets at points in the intersection, yields local linear convergence of AP for a wide class of nonconvex problems and even local linear convergence of nonconvex instances of the DR algorithm.
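As a toy illustration of the two methods named in the abstract (not taken from the paper itself), the sketch below runs AP and DR on a small nonconvex feasibility problem in the plane: finding a point in the intersection of the horizontal axis (convex) and the unit circle (nonconvex). All function names and the starting point are our own illustrative choices.

```python
import math

def proj_line(x):
    """Project onto A = {(t, 0)}: the horizontal axis (convex)."""
    return (x[0], 0.0)

def proj_circle(x):
    """Project onto B = {x : |x| = 1}: the unit circle (nonconvex)."""
    n = math.hypot(x[0], x[1])
    return (x[0] / n, x[1] / n) if n > 0 else (1.0, 0.0)

def alternating_projections(x, iters=100):
    """AP: x_{k+1} = P_A(P_B(x_k))."""
    for _ in range(iters):
        x = proj_line(proj_circle(x))
    return x

def douglas_rachford(x, iters=100):
    """DR: x_{k+1} = x_k + P_A(2 P_B(x_k) - x_k) - P_B(x_k).
    The "shadow" sequence P_B(x_k) approaches the intersection."""
    for _ in range(iters):
        y = proj_circle(x)
        r = (2 * y[0] - x[0], 2 * y[1] - x[1])  # reflect through B's projection
        p = proj_line(r)
        x = (x[0] + p[0] - y[0], x[1] + p[1] - y[1])
    return x
```

Starting from (2, 1), AP lands on the intersection point (1, 0) after two steps; for DR, the shadow sequence P_B(x_k) converges to the same point, consistent with the local linear convergence the paper establishes when the sets intersect transversally.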


Citations
Journal ArticleDOI
TL;DR: This article establishes global linear convergence rate bounds for Douglas–Rachford splitting and ADMM under strong convexity and smoothness assumptions, and shows that these bounds are tight for the class of problems under consideration, for all feasible algorithm parameters.
Abstract: Recently, several convergence rate results for Douglas-Rachford splitting and the alternating direction method of multipliers (ADMM) have been presented in the literature. In this paper, we show global linear convergence rate bounds for Douglas-Rachford splitting and ADMM under strong convexity and smoothness assumptions. We further show that the rate bounds are tight for the class of problems under consideration for all feasible algorithm parameters. For problems that satisfy the assumptions, we show how to select step-size and metric for the algorithm that optimize the derived convergence rate bounds. For problems with a similar structure that do not satisfy the assumptions, we present heuristic step-size and metric selection methods.

206 citations

Journal ArticleDOI
TL;DR: This article shows that the Douglas–Rachford algorithm for two subspaces converges strongly to the projection of the starting point onto the intersection, and that if the sum of the two subspaces is closed, the convergence is linear with rate equal to the cosine of the Friedrichs angle between the subspaces.
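The cosine-of-the-Friedrichs-angle rate can be checked directly in the simplest case: for two lines through the origin in the plane, one Douglas–Rachford step contracts the distance to the intersection {0} by exactly that factor. The following is a small numerical sketch of the cited result; the helper names are ours.

```python
import math

def reflect(x, theta):
    """Reflect a point across the line through the origin at angle theta."""
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return (c * x[0] + s * x[1], s * x[0] - c * x[1])

def dr_step(x, theta_a, theta_b):
    """One Douglas-Rachford step T = (I + R_A R_B)/2 for the two lines,
    where R_A, R_B are the reflections across each line."""
    r = reflect(reflect(x, theta_b), theta_a)
    return ((x[0] + r[0]) / 2, (x[1] + r[1]) / 2)

# For lines at angles 0 and theta, the Friedrichs angle is theta: the product
# of the two reflections is a rotation by 2*theta, so T = cos(theta) times a
# rotation by theta, and each step shrinks norms by exactly cos(theta).
```

For example, with theta = 0.3 the ratio of norms before and after one step equals cos(0.3) up to floating-point rounding.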

119 citations

Journal ArticleDOI
TL;DR: The authors adapt the Douglas–Rachford (DR) splitting method to solve nonconvex feasibility problems by studying it for a class of nonconvex optimization problems, and show that if the step-size parameter is smaller than a computable threshold and the sequence generated has a cluster point, then that cluster point is a stationary point of the optimization problem.
Abstract: We adapt the Douglas–Rachford (DR) splitting method to solve nonconvex feasibility problems by studying this method for a class of nonconvex optimization problems. While the convergence properties of the method for convex problems have been well studied, far less is known in the nonconvex setting. In this paper, for the direct adaptation of the method to minimize the sum of a proper closed function g and a smooth function f with a Lipschitz continuous gradient, we show that if the step-size parameter is smaller than a computable threshold and the sequence generated has a cluster point, then that cluster point is a stationary point of the optimization problem. Convergence of the whole sequence and a local convergence rate are also established under the additional assumption that f and g are semi-algebraic. We also give simple sufficient conditions guaranteeing the boundedness of the sequence generated. We then apply our nonconvex DR splitting method to finding a point in the intersection of a closed convex set C and a general closed set D, by minimizing the squared distance to C subject to membership in D. We show that if either set is bounded and the step-size parameter is smaller than a computable threshold, then the sequence generated from the DR splitting method is actually bounded. Consequently, the sequence generated will have cluster points that are stationary for an optimization problem, and the whole sequence is convergent under the additional assumption that C and D are semi-algebraic. We achieve these results based on a new merit function constructed particularly for the DR splitting method. Our preliminary numerical results indicate that our DR splitting method usually outperforms the alternating projection method in finding a sparse solution of a linear system, in terms of both the solution quality and the number of iterations taken.
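To make the prox-based scheme described above concrete, here is a minimal scalar sketch of the DR splitting iteration for min f(x) + g(x), with f a smooth quadratic (its gradient is 1-Lipschitz) and g the indicator of the nonconvex two-point set {-1, 1}. The specific functions, starting point, and step-size 0.5 are our illustrative choices, not the paper's setup or its threshold.

```python
def prox_f(v, gamma):
    """Prox of gamma*f for f(x) = (x - 2)^2 / 2 (smooth, 1-Lipschitz gradient)."""
    return (v + 2 * gamma) / (1 + gamma)

def prox_g(v):
    """Prox of the indicator of the nonconvex set D = {-1, 1}: nearest point."""
    return 1.0 if v >= 0 else -1.0

def dr_splitting(x=0.0, gamma=0.5, iters=100):
    """Direct DR adaptation to min f + g:
    y = prox_{gamma f}(x), z = prox_g(2y - x), x <- x + z - y.
    z is the candidate solution; cluster points of z are stationary."""
    z = prox_g(x)
    for _ in range(iters):
        y = prox_f(x, gamma)
        z = prox_g(2 * y - x)
        x = x + z - y
    return x, z
```

From x = 0 the candidate z locks onto 1, the minimizer of f over {-1, 1}, while the governing sequence x settles at the fixed point 0.5 (one can check x_{k+1} = x_k/3 + 1/3 along this trajectory).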

111 citations

Journal ArticleDOI
TL;DR: It is shown that under certain regularity conditions the Douglas–Rachford method converges locally with R-linear rate, and that in convex settings the linear convergence is global.
Abstract: In this paper, we investigate the Douglas–Rachford method (DR) for two closed (possibly nonconvex) sets in Euclidean spaces. We show that under certain regularity conditions, the DR converges locally with R-linear rate. In convex settings, we prove that the linear convergence is global. Our study recovers recent results on the same topic.

110 citations

Journal ArticleDOI
TL;DR: This work discusses recent positive experiences applying convex feasibility algorithms of Douglas–Rachford type to highly combinatorial and far from convex problems.
Abstract: We discuss recent positive experiences applying convex feasibility algorithms of Douglas–Rachford type to highly combinatorial and far from convex problems.

95 citations

References
Book
26 Apr 2011
TL;DR: This book provides a largely self-contained account of the main results of convex analysis and optimization in Hilbert space, and a concise exposition of related constructive fixed point theory that allows for a wide range of algorithms to construct solutions to problems in optimization, equilibrium theory, monotone inclusions, variational inequalities, and convex feasibility.
Abstract: This book provides a largely self-contained account of the main results of convex analysis and optimization in Hilbert space. A concise exposition of related constructive fixed point theory is presented, that allows for a wide range of algorithms to construct solutions to problems in optimization, equilibrium theory, monotone inclusions, variational inequalities, best approximation theory, and convex feasibility. The book is accessible to a broad audience, and reaches out in particular to applied scientists and engineers, to whom these tools have become indispensable.

3,905 citations

Journal ArticleDOI
TL;DR: This work studies two splitting algorithms for (stationary and evolution) problems involving the sum of two monotone operators.
Abstract: We study two splitting algorithms for (stationary and evolution) problems involving the sum of two monotone operators. These algorithms are…

1,939 citations

Journal ArticleDOI
TL;DR: A very broad and flexible framework is investigated which allows a systematic discussion of questions on behaviour in general Hilbert spaces and on the quality of convergence in convex feasibility problems.
Abstract: Due to their extraordinary utility and broad applicability in many areas of classical mathematics and modern physical sciences (most notably, computerized tomography), algorithms for solving convex feasibility problems continue to receive great attention. To unify, generalize, and review some of these algorithms, a very broad and flexible framework is investigated. Several crucial new concepts which allow a systematic discussion of questions on behaviour in general Hilbert spaces and on the quality of convergence are brought out. Numerous examples are given.

1,742 citations

Book
28 Sep 1990
TL;DR: This book develops the basic fixed point theorems for nonexpansive mappings, together with the roles of the modulus of convexity, normal structure, smoothness, and sequential approximation techniques.
Abstract: Introduction. 1. Preliminaries. 2. Banach's contraction principle. 3. Nonexpansive mappings: introduction. 4. The basic fixed point theorems for nonexpansive mappings. 5. Scaling the convexity of the unit ball. 6. The modulus of convexity and normal structure. 7. Normal structure and smoothness. 8. Conditions involving compactness. 9. Sequential approximation techniques. 10. Weak sequential approximations. 11. Properties of fixed point sets and minimal sets. 12. Special properties of Hilbert space. 13. Applications to accretivity. 14. Nonstandard methods. 15. Set-valued mappings. 16. Uniformly Lipschitzian mappings. 17. Rotative mappings. 18. The theorems of Brouwer and Schauder. 19. Lipschitzian mappings. 20. Minimal displacement. 21. The retraction problem. References.

1,466 citations