Open Access · Journal Article · DOI

Local differentiability of distance functions

TLDR
In this article, a local theory is developed for the property of the distance function d_C being continuously differentiable outside of C on some neighborhood of a point x ∈ C. This property is shown to be equivalent to the prox-regularity of C at x, a condition on normal vectors that is commonly fulfilled in variational analysis and has the advantage of being verifiable by calculation.
Abstract
Recently Clarke, Stern and Wolenski characterized, in a Hilbert space, the closed subsets C for which the distance function d_C is continuously differentiable everywhere on an open “tube” of uniform thickness around C. Here a corresponding local theory is developed for the property of d_C being continuously differentiable outside of C on some neighborhood of a point x ∈ C. This is shown to be equivalent to the prox-regularity of C at x, which is a condition on normal vectors that is commonly fulfilled in variational analysis and has the advantage of being verifiable by calculation. Additional characterizations are provided in terms of d_C being locally of class C^{1+}, or such that d_C + σ|·|² is convex around x for some σ > 0. Prox-regularity of C at x corresponds further to the normal cone mapping N_C having a hypomonotone truncation around x, and leads to a formula for the projection mapping P_C by way of N_C. The local theory also yields new insights on the global level of the Clarke–Stern–Wolenski results, on a property of sets introduced by Shapiro, and on the concept of sets with positive reach considered by Federer in the finite-dimensional setting.
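As a concrete illustration (not taken from the paper): the closed unit ball C in R² is convex, hence prox-regular at every point, so the projection P_C is single-valued outside C and d_C is continuously differentiable there, with gradient ∇d_C(x) = (x − P_C(x))/d_C(x). A minimal numerical sketch of this formula, assuming NumPy; the function names are hypothetical:

```python
import numpy as np

def proj_unit_ball(x):
    """Projection P_C onto the closed unit ball C (convex, hence prox-regular)."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def dist_unit_ball(x):
    """Distance function d_C(x) = |x - P_C(x)|; equals |x| - 1 outside C."""
    return np.linalg.norm(x - proj_unit_ball(x))

# Outside C, d_C is C^1 with gradient (x - P_C(x)) / d_C(x).
x = np.array([3.0, 4.0])          # |x| = 5, so d_C(x) = 4
grad = (x - proj_unit_ball(x)) / dist_unit_ball(x)

# Compare against a central finite-difference approximation of the gradient.
h = 1e-6
num_grad = np.array([
    (dist_unit_ball(x + h * e) - dist_unit_ball(x - h * e)) / (2 * h)
    for e in np.eye(2)
])
print(dist_unit_ball(x))                       # 4.0
print(grad)                                    # [0.6 0.8], i.e. x/|x|
print(np.allclose(grad, num_grad, atol=1e-5))  # True
```

The analytic gradient agrees with the finite-difference check, reflecting the C¹ differentiability of d_C outside C that the paper characterizes via prox-regularity.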


