Local Linear Convergence for Alternating and Averaged Nonconvex Projections
Summary (1 min read)
1 Introduction
- The authors' interest here is not in developing practical numerical methods.
- Notwithstanding linear convergence proofs, basic alternating and averaged projection schemes may be slow in practice.
- Rather the authors aim to study the interplay between a simple, popular, fundamental algorithm and a variety of central ideas from variational analysis.
- Whether such an approach can help in the design and analysis of more practical algorithms remains to be seen.
Corollary 4.10 (approximate monotonicity)
- If the authors replace the normal cone N_C in the property described in the result above by its convex hull, the "Clarke normal cone", they obtain a stronger property, called "subsmoothness" in [4].
- Similar proofs to those above show that, like super-regularity, subsmoothness is a consequence of either amenability or prox-regularity.
- Subsmoothness is strictly stronger than super-regularity.
- In a certain sense, however, the distinction between subsmoothness and super-regularity is slight.
- Since super-regularity implies Clarke regularity, the normal cone and Clarke normal cone coincide throughout F ∩ U, and hence F is also subsmooth throughout F ∩ U.
Theorem 5.2 (linear convergence of alternating projections)
- Adding this inequality to the previous inequality then gives the right-hand side of (5.7), as desired.
- The authors can now easily check that the sequence (x_k) is Cauchy and therefore converges.
- Then any alternating projection sequence with initial point sufficiently near x̄ must converge to a point in F ∩ C with R-linear rate √c.
- The authors have shown that c also controls the speed of linear convergence for the method of alternating projections applied to the sets F and C. Inevitably, Theorem 5.16 concerns local convergence: it relies on finding an initial point x0 sufficiently close to a point of linearly regular intersection.
- One example is the case of two manifolds [30] .
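To make the scheme these results analyze concrete, here is a minimal numerical sketch of alternating projections between two convex sets in R² (a line and a disk; the set choices and function names are illustrative, not taken from the paper), in which the gap to the intersection shrinks geometrically, consistent with an R-linear rate:

```python
import numpy as np

def project_line(x):
    # Orthogonal projection onto the subspace {(t, t)}: direction d = (1,1)/sqrt(2).
    d = np.array([1.0, 1.0]) / np.sqrt(2.0)
    return np.dot(x, d) * d

def project_disk(x, center=np.array([2.0, 0.0]), radius=1.5):
    # Exact projection onto a closed Euclidean disk.
    v = x - center
    n = np.linalg.norm(v)
    return x.copy() if n <= radius else center + radius * v / n

def alternating_projections(x0, iters=200):
    # x_{k+1} = P_disk(P_line(x_k)); track the gap from the iterate to the line.
    x = np.asarray(x0, dtype=float)
    gaps = []
    for _ in range(iters):
        x = project_disk(project_line(x))
        gaps.append(np.linalg.norm(x - project_line(x)))
    return x, gaps
```

Running this from an initial point such as `[4.0, -1.0]`, the gap sequence decays by a roughly constant factor per iteration, the behavior the local linear convergence theory predicts at a transversal intersection.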
8 Prox-regularity and averaged projections
- Random examples are interesting as a simple test of averaged projections: the challenging question of checking a priori the linear regularity of the intersection of the three sets is open, but randomness seems to prevent irregular solutions, provided α is not too small.
- So in this situation, the authors would hope that the algorithm will converge locally linearly; this is indeed what the numerical results in Figure 9 suggest.
- The authors observed that the method still appears locally linearly convergent in practice, and again, that the rate is better than for averaged projections.
- This example illustrates how the projection algorithm behaves on random feasibility problems of this type.
- Further study and more complete testing of these questions remain to be done; this is beyond the scope of this paper.
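For orientation, the averaged-projection iteration itself is easy to sketch. The toy sets below (three distinct lines through the origin in R², intersecting only at 0) are illustrative stand-ins, not the subspace/manifold/convex triple from the paper's experiments; with them the distance to the intersection contracts linearly:

```python
import numpy as np

def line_projector(theta):
    # Orthogonal projector onto the line through the origin at angle theta.
    d = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(d, d)

def averaged_projections(x0, thetas, iters=100):
    # x_{k+1} = (1/m) * sum_i P_i(x_k): average the projections onto the m sets.
    projectors = [line_projector(t) for t in thetas]
    x = np.asarray(x0, dtype=float)
    dists = []
    for _ in range(iters):
        x = sum(P @ x for P in projectors) / len(projectors)
        dists.append(np.linalg.norm(x))  # distance to the intersection {0}
    return x, dists
```

Because the averaged map here is a symmetric linear operator with spectral radius below 1 (the lines meet only at the origin), the distances decay geometrically, mirroring the rate-versus-regularity story in the text.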
Frequently Asked Questions (12)
Q2. What is an important theme in computational mathematics?
An important theme in computational mathematics is the relationship between “conditioning” of a problem instance and speed of convergence of iterative solution algorithms on that instance.
Q3. What is the definition of Clarke regularity?
“Clarke regularity” is a basic variational-geometric property of sets, shared in particular by closed convex sets and smooth manifolds.
Q4. What is the main result of the method of averaged projections?
Their main result shows, assuming only linear regularity, that provided the initial point x0 is sufficiently near x̄, any sequence x1, x2, x3, . . . generated by the method of averaged projections converges linearly to a point in the intersection ∩iFi, at a rate governed by the condition modulus.
Q5. What are some examples of nonconvex alternating projection algorithms?
Nonconvex alternating projection algorithms and analogous heuristics are quite popular in practice, in areas such as inverse eigenvalue problems [10,11], pole placement [35,51], information theory [48], low-order control design [23,24,36] and image processing [7, 50].
Q6. What is the sequence of even iterates for the latter method?
If x0, x1, x2, . . . is a possible sequence of iterates for the former method, then a possible sequence of even iterates for the latter method is Ax0, Ax1, Ax2, . . ..
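The product-space trick behind Q6 can be checked numerically: averaged projections on E correspond to alternating projections between the product set F1 × · · · × Fm and the diagonal {(x, . . . , x)} in E^m. The sketch below uses illustrative sets (lines through the origin), not the paper's examples, and verifies that the diagonal components of the product-space iterates reproduce the averaged-projection iterates:

```python
import numpy as np

def proj_line(x, theta):
    # Orthogonal projection onto the line through the origin at angle theta.
    d = np.array([np.cos(theta), np.sin(theta)])
    return np.dot(x, d) * d

thetas = [0.3, 1.2, 2.1]

def averaged_step(x):
    # One averaged-projection step on E.
    return sum(proj_line(x, t) for t in thetas) / len(thetas)

def product_double_step(z):
    # One alternating-projection round in E^m: project componentwise onto the
    # product set, then onto the diagonal (replace every component by the average).
    w = [proj_line(zi, t) for zi, t in zip(z, thetas)]
    avg = sum(w) / len(w)
    return [avg.copy() for _ in w]

x = np.array([2.0, -1.0])
z = [x.copy() for _ in thetas]
for _ in range(5):
    x = averaged_step(x)
    z = product_double_step(z)
# Each diagonal component of z now equals the averaged-projection iterate x.
```

This is exactly the correspondence Q6 describes: the even iterates of the product-space method, read off through the averaging map, are the iterates of averaged projections.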
Q7. What is the case of exact projection on the super-regular set C?
The authors might reasonably consider the case of exact projection on the super-regular set C: for example, in the next section, for the method of averaged projections, C is a subspace and computing projections is trivial.
Q8. How can the authors understand the rate of convergence of conic convex programming?
More generally, Renegar [41–43] showed that the rate of convergence of interior-point methods for conic convex programming can be bounded in terms of the “distance to ill-posedness” of the program.
Q9. What is the optimal value in terms of the condition modulus?
By equation (3.3) and the definition of the condition modulus, the optimal value of this new problem is 1 − 1/(m · cond²(F1, F2, . . . , Fm | x̄)), as required.
Q10. What is the proof of linear convergence?
Their linear convergence proof is elementary: although the authors use the idea of the normal cone, they apply only its definition, and they discuss metric regularity only to illuminate the rate of convergence.
Q11. What is the prox-regular function of the first set L?
The first set L is a subspace, the second set M is a smooth manifold, while the third, C, is convex; hence all three are prox-regular.
Q12. What is the definition of linear regularity?
The notion of linear regularity is well-known to be closely related to another central idea in variational analysis: “metric regularity”.
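As a reminder of the definition alluded to here (a standard formulation consistent with the paper's usage; the constant κ is not named in this summary), linear regularity of the collection F1, . . . , Fm at a point x̄ ∈ ∩iFi asserts a local error bound:

```latex
d\Bigl(x,\ \bigcap_{i} F_i\Bigr) \;\le\; \kappa \,\max_{i}\, d(x, F_i)
\qquad \text{for all } x \text{ near } \bar{x},
```

for some constant κ > 0. Metric regularity of an associated set-valued mapping yields an error bound of the same kind, which is the connection Q12 points to.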