Open Access Book

Optimization and nonsmooth analysis

TL;DR
The book develops the tools of nonsmooth analysis, centered on the generalized gradient of a locally Lipschitz function, and applies them to the calculus of variations, optimal control, and mathematical programming.
Abstract
1. Introduction and Preview 2. Generalized Gradients 3. Differential Inclusions 4. The Calculus of Variations 5. Optimal Control 6. Mathematical Programming 7. Topics in Analysis.
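
For orientation, the book's central object admits a standard definition (stated here from common knowledge, not quoted from the text): for a locally Lipschitz function f on R^n, the generalized directional derivative and generalized gradient are

    f^\circ(x; d) = \limsup_{y \to x,\ t \downarrow 0} \frac{f(y + td) - f(y)}{t},
    \qquad
    \partial f(x) = \{ \zeta \in \mathbb{R}^n : f^\circ(x; d) \ge \langle \zeta, d \rangle \ \text{for all } d \in \mathbb{R}^n \}.

For continuously differentiable f this reduces to \{\nabla f(x)\}, and for convex f it coincides with the subdifferential of convex analysis.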



Citations
Journal Article (DOI)

Nondifferentiable Multiplier Rules for Optimization and Bilevel Optimization Problems

TL;DR: Necessary and sufficient optimality conditions and constraint qualifications in terms of the Michel–Penot subdifferential are given, and the results are applied to bilevel optimization problems.
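
As background, a standard definition of the object involved (not quoted from the paper): for a locally Lipschitz f, the Michel–Penot directional derivative and subdifferential are

    f^\diamond(x; d) = \sup_{z} \limsup_{t \downarrow 0} \frac{f(x + tz + td) - f(x + tz)}{t},
    \qquad
    \partial^\diamond f(x) = \{ \zeta : f^\diamond(x; d) \ge \langle \zeta, d \rangle \ \text{for all } d \}.

Since f^\diamond \le f^\circ, the Michel–Penot subdifferential is contained in (and can be strictly smaller than) the Clarke generalized gradient, which is what makes the resulting multiplier rules sharper.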
Journal Article (DOI)

On a theorem of Danskin with an application to a theorem of Von Neumann-Sion

TL;DR: Several versions of Danskin's theorem are given, dealing with the derivative (or subdifferential) of the upper envelope J̄(u) = sup_v J(u, v) of a family of functions.
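
For reference, the classical form of the theorem (a standard statement, independent of the versions the paper develops): if V is compact, J(u, v) is differentiable in u for every v, and \nabla_u J is continuous, then \bar J(u) = \max_{v \in V} J(u, v) is directionally differentiable with

    \bar J'(u; d) = \max_{v \in V(u)} \langle \nabla_u J(u, v), d \rangle,
    \qquad
    V(u) = \arg\max_{v \in V} J(u, v).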
Journal Article (DOI)

Finite-Time Consensus of Opinion Dynamics and its Applications to Distributed Optimization Over Digraph

TL;DR: Efficient criteria for finite-time consensus of a class of nonsmooth opinion dynamics over a digraph are established, and lower and upper bounds on the finite settling time are obtained from the maximal and minimal cut capacities of the digraph, respectively.
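
As an illustration only, here is a minimal simulation sketch of a generic sign-based protocol, which is known to reach consensus in finite time on connected undirected graphs (and, under connectivity conditions, on digraphs); it is not the specific opinion dynamics or settling-time analysis of the paper, and the adjacency matrix below is a hypothetical example.

    import numpy as np

    def simulate_sign_consensus(A, x0, dt=1e-3, T=5.0):
        """Euler-simulate the nonsmooth dynamics x_i' = sum_j A[i, j] * sign(x_j - x_i).

        A:  (n, n) nonnegative adjacency matrix of the digraph
            (A[i, j] > 0 means agent i listens to agent j).
        Returns the state trajectory, shape (steps + 1, n).
        """
        x = np.asarray(x0, dtype=float).copy()
        traj = [x.copy()]
        for _ in range(int(T / dt)):
            diff = x[None, :] - x[:, None]              # diff[i, j] = x_j - x_i
            x = x + dt * np.sum(A * np.sign(diff), axis=1)
            traj.append(x.copy())
        return np.array(traj)

    # Hypothetical 3-node directed cycle (strongly connected).
    A = np.array([[0, 1, 0],
                  [0, 0, 1],
                  [1, 0, 0]], dtype=float)
    traj = simulate_sign_consensus(A, x0=[1.0, -0.5, 2.0])
    print(traj[-1])  # states agree up to O(dt) chattering from the discretization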
Journal Article (DOI)

Semidifferentiable functions and necessary optimality conditions

TL;DR: In this article, a necessary optimality condition is proposed for constrained extremum problems having a finite-dimensional image; problems with an infinite-dimensional image will be treated in a subsequent paper.
Posted Content

Semismooth Newton Coordinate Descent Algorithm for Elastic-Net Penalized Huber Loss Regression and Quantile Regression

TL;DR: Proposes semismooth Newton coordinate descent (SNCD), an algorithm for elastic-net penalized Huber loss regression and quantile regression in high-dimensional settings, along with an adaptive version of the “strong rule” for screening predictors to gain extra efficiency.
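
As a sketch of the objective being minimized (standard definitions of the Huber loss and elastic-net penalty; the SNCD updates themselves are in the paper, and the parametrization below is one common convention):

    import numpy as np

    def huber(r, delta=1.0):
        """Huber loss: quadratic for |r| <= delta, linear in the tails."""
        a = np.abs(r)
        return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

    def objective(beta, X, y, lam=0.1, alpha=0.5, delta=1.0):
        """(1/n) sum_i huber(y_i - x_i @ beta)
           + lam * (alpha * ||beta||_1 + (1 - alpha)/2 * ||beta||_2^2)"""
        r = y - X @ beta
        penalty = alpha * np.sum(np.abs(beta)) + 0.5 * (1 - alpha) * np.sum(beta**2)
        return huber(r, delta).mean() + lam * penalty

The Huber loss is differentiable but not twice differentiable at |r| = delta, and the l1 term is nonsmooth at zero; this is the setting in which semismooth Newton steps combine naturally with coordinate descent.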