Open Access · Journal Article (DOI)

Partial extensions of Attouch’s theorem with applications to proto-derivatives of subgradient mappings

Adam B. Levy, +2 more
- 01 Apr 1995
- Vol. 347, Iss. 4, pp. 1269–1294
TLDR
In this article, partial extensions of Attouch's Theorem to a class of functions more general than convex functions, called primal-lower-nice functions, are presented; these extensions are then used to relate the second-order epi-derivatives of primal-lower-nice functions to the proto-derivatives of their associated subgradient mappings.
Abstract
Attouch's Theorem, which gives on a reflexive Banach space the equivalence between the Mosco epi-convergence of a sequence of convex functions and the graph convergence of the associated sequence of subgradients, has many important applications in convex optimization. In particular, generalized derivatives have been defined in terms of the epi-convergence or graph convergence of certain difference quotient mappings, and Attouch's Theorem has been used to relate these various generalized derivatives. These relations can then be used to study the stability of the solution mapping associated with a parameterized family of optimization problems. We prove in a Hilbert space several "partial extensions" of Attouch's Theorem to functions more general than convex; these functions are called primal-lower-nice. Furthermore, we use our extensions to derive a relationship between the second-order epi-derivatives of primal-lower-nice functions and the proto-derivative of their associated subgradient mappings.
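For reference, the classical result being extended can be stated as follows. This is a standard formulation of Attouch's Theorem; the notation below is supplied here for clarity and does not appear in the abstract itself:

```latex
% Attouch's Theorem (standard form). Let $f_n, f$ be proper, lower
% semicontinuous, convex functions on a reflexive Banach space $X$.
% Then Mosco epi-convergence of the functions is equivalent to graph
% convergence of the subdifferentials together with a normalization
% condition on function values:
\[
  f_n \xrightarrow{\;M\;} f
  \quad\Longleftrightarrow\quad
  \operatorname{gph}\partial f_n \longrightarrow \operatorname{gph}\partial f
  \ \ \text{(Painlev\'e--Kuratowski)},
\]
\[
  \text{and}\quad
  \exists\,(x_n, x_n^*) \in \operatorname{gph}\partial f_n
  \ \text{with}\
  \bigl(x_n,\, x_n^*,\, f_n(x_n)\bigr) \longrightarrow
  \bigl(x,\, x^*,\, f(x)\bigr)
  \ \text{for some}\ (x, x^*) \in \operatorname{gph}\partial f.
\]
```

The paper's "partial extensions" replace convexity with the primal-lower-nice property and work in a Hilbert space rather than a general reflexive Banach space.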



Citations
Journal ArticleDOI

Local differentiability of distance functions

TL;DR: In this article, a local theory is developed for the property of a distance function being continuously differentiable outside of C on some neighborhood of a point x ∈ C. This property is shown to be equivalent to the prox-regularity of C at x, a condition on normal vectors that is commonly fulfilled in variational analysis and has the advantage of being verifiable by calculation.
Journal ArticleDOI

Prox-regular functions in variational analysis

TL;DR: In this paper, the class of prox-regular functions is shown to cover all lsc, proper, convex functions, lower-C2 functions, and strongly amenable functions, hence a large core of the functions of interest in variational analysis and optimization.
Journal ArticleDOI

Tilt Stability of a Local Minimum

TL;DR: The classical condition of a positive-definite Hessian in smooth problems without constraints is found to have an exact counterpart much more broadly in the positivity of a certain generalized Hessian mapping.
Journal ArticleDOI

Prox-regular functions in Hilbert spaces

TL;DR: In this article, the prox-regularity concept for functions in the general context of Hilbert space is studied and a subdifferential characterization is established as well as several other properties.
Journal ArticleDOI

Generalized Hessian Properties of Regularized Nonsmooth Functions

TL;DR: The results establish that generalized second-order expansions of Moreau envelopes, at least, can be counted on in most situations of interest in finite-dimensional optimization.
References
Book

Optimization and nonsmooth analysis

TL;DR: This book develops nonsmooth analysis, including generalized gradients and their calculus, with applications to many aspects of analysis, among them optimization, the calculus of variations, and optimal control.
Journal ArticleDOI

A smooth variational principle with applications to subdifferentiability and to differentiability of convex functions

TL;DR: In this article, it was shown that lower semicontinuous functions on a Banach space densely inherit lower subderivatives of the same degree of smoothness as the norm.
Book

Methods of dynamic and nonsmooth optimization

TL;DR: In this book, the basic problem in the calculus of variations, verification functions, dynamic programming, and optimal control are discussed.
Journal ArticleDOI

First- and second-order epi-differentiability in nonlinear programming

TL;DR: In this paper, the essential objective function, which is the sum of the given objective and the indicator of the constraints, is shown to be twice epi-differentiable at any point where the active constraints satisfy the Mangasarian-Fromovitz qualification.