Open Access Book

Optimization and nonsmooth analysis

TLDR
The book, as discussed by the authors, develops the theory of generalized gradients for functions that need not be differentiable and applies it across many areas of analysis, including differential inclusions, the calculus of variations, optimal control, and mathematical programming.
Abstract
1. Introduction and Preview
2. Generalized Gradients
3. Differential Inclusions
4. The Calculus of Variations
5. Optimal Control
6. Mathematical Programming
7. Topics in Analysis
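For readers new to the subject, the central object of Chapter 2 is the generalized gradient of a locally Lipschitz function. The standard definition, stated here in our own notation rather than quoted from the book, is:

```latex
% Generalized directional derivative and generalized gradient of a function f
% that is locally Lipschitz near x (standard definitions; notation is illustrative).
\[
  f^{\circ}(x; v) \;=\; \limsup_{\substack{y \to x \\ t \downarrow 0}}
  \frac{f(y + t v) - f(y)}{t},
\qquad
  \partial f(x) \;=\; \bigl\{\, \xi \in \mathbb{R}^n \;:\;
  f^{\circ}(x; v) \ge \langle \xi, v \rangle \ \text{for all } v \in \mathbb{R}^n \,\bigr\}.
\]
```

When f is continuously differentiable near x, the set ∂f(x) reduces to the singleton {∇f(x)}; when f is convex, it coincides with the subdifferential of convex analysis.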

Citations
Posted Content

Gradient Descent Maximizes the Margin of Homogeneous Neural Networks

TL;DR: In this article, the authors study the implicit regularization of the gradient descent algorithm in homogeneous neural networks, including fully-connected and convolutional neural networks with ReLU or LeakyReLU activations.
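The quantity at stake in this work is the normalized margin of a homogeneous predictor along the gradient-descent trajectory. The sketch below is only a toy illustration of that quantity, not the paper's analysis or experimental setup: it trains a bias-free two-layer ReLU network (positively homogeneous of degree 2 in its parameters) with plain gradient descent on the logistic loss and prints min_i y_i f(x_i) / ||theta||^2. The data, width, and step size are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up, linearly separable toy data.
X = rng.normal(size=(20, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

# Bias-free two-layer ReLU network f(x) = v^T relu(W x);
# without biases it is positively homogeneous of degree 2 in (W, v).
h = 16
W = rng.normal(scale=0.5, size=(h, 2))
v = rng.normal(scale=0.5, size=h)
lr = 0.05

for step in range(2001):
    Z = X @ W.T                      # pre-activations, shape (n, h)
    A = np.maximum(Z, 0.0)           # ReLU activations
    f = A @ v                        # network outputs f(x_i)
    margins = y * f
    # Logistic loss L = mean log(1 + exp(-y f)); s = dL/df (clipped for stability).
    s = -y / (1.0 + np.exp(np.clip(margins, -50, 50))) / len(y)
    # Backpropagation through the two bias-free layers.
    grad_v = A.T @ s
    grad_Z = np.outer(s, v) * (Z > 0)
    grad_W = grad_Z.T @ X
    v -= lr * grad_v
    W -= lr * grad_W
    if step % 500 == 0:
        norm_sq = np.sum(W ** 2) + np.sum(v ** 2)
        # Normalized margin of a degree-2 homogeneous predictor.
        print(step, margins.min() / norm_sq)
```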
Journal ArticleDOI

Globally convergent variable metric method for convex nonsmooth unconstrained minimization

TL;DR: In this article, a special variable metric method is given for finding minima of convex functions that are not necessarily differentiable; global convergence of the method is established, and some encouraging numerical experience is reported.
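The cited variable metric scheme is not reproduced here. As a point of reference only, the following is a minimal subgradient-method baseline for convex nonsmooth unconstrained minimization, applied to a made-up l1 objective with a diminishing step size.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative nonsmooth convex objective: f(x) = ||A x - b||_1 (data made up).
A = rng.normal(size=(30, 5))
b = A @ rng.normal(size=5)

def f_and_subgrad(x):
    r = A @ x - b
    # sign(0) = 0 still yields a valid subgradient of the l1 norm.
    return np.abs(r).sum(), A.T @ np.sign(r)

x = np.zeros(5)
best = np.inf
for k in range(1, 3001):
    fx, g = f_and_subgrad(x)
    best = min(best, fx)
    x = x - (1.0 / k) * g        # diminishing, nonsummable step size
print("best objective value:", best)
```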
Journal ArticleDOI

Globally Convergent Newton Methods for Nonsmooth Equations

TL;DR: These methods resemble the well-known family of damped Newton and Gauss-Newton methods for solving systems of smooth equations, and they generalize some recent Newton-like methods for solving B-differentiable equations that arise from various mathematical programs.
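As an illustration of the general flavor of such damped Newton iterations, and emphatically not the cited algorithm, the sketch below applies a semismooth-Newton-style step with Armijo backtracking on the merit function 0.5*||F(x)||^2 to a small, made-up piecewise-smooth system, using one element of the generalized Jacobian at the kinks.

```python
import numpy as np

b = np.array([1.5, -0.7])

def F(x):
    # Made-up piecewise-smooth system: F_i(x) = x_i + max(x_i, 0) - b_i.
    return x + np.maximum(x, 0.0) - b

def gen_jacobian(x):
    # One element of the generalized Jacobian: diagonal with entries 1 + [x_i > 0].
    return np.diag(1.0 + (x > 0.0).astype(float))

def merit(x):
    Fx = F(x)
    return 0.5 * Fx @ Fx

x = np.array([5.0, 5.0])
for it in range(50):
    Fx = F(x)
    if np.linalg.norm(Fx) < 1e-10:
        break
    J = gen_jacobian(x)
    d = np.linalg.solve(J, -Fx)          # Newton direction from the chosen Jacobian element
    t = 1.0
    # Armijo backtracking on the merit function 0.5*||F||^2.
    while merit(x + t * d) > merit(x) + 1e-4 * t * (Fx @ (J @ d)):
        t *= 0.5
    x = x + t * d
print("solution:", x, "residual norm:", np.linalg.norm(F(x)))
```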
Journal ArticleDOI

New Optimality Conditions for the Semivectorial Bilevel Optimization Problem

TL;DR: The problem is transformed into a scalar-objective optimization problem with inequality constraints by means of the well-known optimal value reformulation, and the conditions obtained reduce to those of a usual bilevel program if the lower-level objective function becomes single-valued.
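For context, the optimal value reformulation is the standard device of replacing the lower-level problem by an inequality involving its optimal value function. A generic single-objective statement, in our own notation rather than the semivectorial setting of the paper, reads:

```latex
% Generic bilevel program:
%   min_{x,y} F(x,y)  s.t.  G(x) <= 0,  y solves  min_z { f(x,z) : g(x,z) <= 0 }.
% With the lower-level optimal value function
\[
  \varphi(x) \;=\; \min_{z}\,\bigl\{\, f(x,z) \;:\; g(x,z) \le 0 \,\bigr\},
\]
% the optimal value reformulation is the single-level problem
\[
  \min_{x,\,y}\; F(x,y)
  \quad\text{s.t.}\quad
  G(x) \le 0, \qquad g(x,y) \le 0, \qquad f(x,y) \le \varphi(x).
\]
```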