Book Chapter

A method of conjugate subgradients for minimizing nondifferentiable functions

Philip Wolfe
01 Dec 1974 · Vol. 7, Iss. 1, pp. 145-173
Abstract
An algorithm is described for finding the minimum of any convex, not necessarily differentiable, function f of several variables. The algorithm yields a sequence of points tending to the solution of the problem, if any, requiring only the calculation of f and one subgradient of f at designated points. Its rate of convergence is estimated for convex and for differentiable convex functions. For the latter, it is an extension of the method of conjugate gradients and terminates for quadratic functions.
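The oracle model matters here: each step may use only the value of f and a single subgradient at the current point. As a point of reference, the sketch below implements the plain subgradient iteration that works in exactly this oracle model. It is not Wolfe's conjugate subgradient algorithm itself (which combines past subgradients into conjugate-like search directions); the step rule, iteration count, and example objective are illustrative assumptions.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, iters=2000):
    """Plain subgradient iteration with diminishing steps 1/k.
    Uses the same oracle as Wolfe's method -- one value of f and one
    subgradient per step -- but not his conjugate direction construction."""
    x = x0.astype(float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iters + 1):
        g = subgrad(x)
        x = x - (1.0 / k) * g / max(np.linalg.norm(g), 1e-12)
        if f(x) < best_f:                      # f(x_k) need not decrease,
            best_x, best_f = x.copy(), f(x)    # so track the best point seen
    return best_x, best_f

# illustrative objective: a pointwise max of linear pieces, kinked at its minimizer 0
pieces = [np.array([1.0, 1.0]), np.array([1.0, -2.0]), np.array([-1.0, 0.0])]
f = lambda x: max(p @ x for p in pieces)
subgrad = lambda x: pieces[int(np.argmax([p @ x for p in pieces]))]  # gradient of an active piece

print(subgradient_descent(f, subgrad, np.array([2.0, 2.0])))
```

Progress of the plain iteration on such kinked objectives is slow and nonmonotone, which is the behavior the conjugate subgradient construction is designed to improve.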


Citations
Journal Article

Semismooth and Semiconvex Functions in Constrained Optimization

TL;DR: In this paper, the authors introduce semismooth and semiconvex functions, discuss their properties in the context of nonsmooth nonconvex constrained optimization, and give a chain rule for generalized gradients.

A survey of nonlinear conjugate gradient methods

TL;DR: In this article, the development of different versions of nonlinear conjugate gradient methods is reviewed, with special attention given to their global convergence properties.
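Because the survey's subject is a family of algorithms, a compact sketch of one member may help orient the reader: a Polak-Ribière iteration with an Armijo backtracking line search and a steepest-descent safeguard. The function names and constants below are my own illustrative choices, not taken from the survey.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, iters=5000, tol=1e-8):
    """Polak-Ribiere+ nonlinear conjugate gradient with Armijo backtracking."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                   # safeguard: restart if d is not a descent direction
            d = -g
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d):   # Armijo sufficient decrease
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)  # PR+: clipping at 0 restarts
        x, g, d = x_new, g_new, -g_new + beta * d
    return x

# example: the Rosenbrock function, minimized at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

Whether variants like this PR+ rule converge globally under various line searches is exactly the kind of question the survey reviews.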
Journal Article

Accelerating Benders Decomposition: Algorithmic Enhancement and Model Selection Criteria

TL;DR: This paper introduces a new technique for accelerating the convergence of Benders decomposition, together with theory for distinguishing "good" formulations of a problem that has distinct but equivalent mixed-integer programming representations.
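For orientation, here is a minimal sketch of the unaccelerated Benders loop that such enhancements start from, on a toy linear program. It assumes complete recourse (the subproblem stays feasible for every master solution), so only optimality cuts appear; the instance, bounds, and tolerance are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# toy instance (invented):  min c*x + fy*y  s.t.  A x + B y >= b,  0 <= x <= 10,  y >= 0
c, fy = np.array([2.0]), np.array([3.0])
A, B, b = np.array([[1.0]]), np.array([[1.0]]), np.array([4.0])

cuts_lhs, cuts_rhs = [], []              # master rows: lhs @ [x, theta] <= rhs
for it in range(20):
    # master: min c*x + theta over the cuts found so far
    # (theta >= 0 is a valid lower bound here because all costs are nonnegative)
    res_m = linprog(np.append(c, 1.0),
                    A_ub=np.array(cuts_lhs) if cuts_lhs else None,
                    b_ub=np.array(cuts_rhs) if cuts_rhs else None,
                    bounds=[(0, 10), (0, None)], method="highs")
    x_bar, theta = res_m.x[:-1], res_m.x[-1]
    # subproblem in dual form: max (b - A x_bar)' u  s.t.  B' u <= fy,  u >= 0
    res_d = linprog(-(b - A @ x_bar), A_ub=B.T, b_ub=fy,
                    bounds=(0, None), method="highs")
    u, q = res_d.x, -res_d.fun           # q = best cost of the y-part given x_bar
    if theta >= q - 1e-9:                # master already anticipates the true cost
        break
    # optimality cut  theta >= (b - A x)' u,  rewritten as  -(u'A) x - theta <= -(u'b)
    cuts_lhs.append(np.append(-(A.T @ u), -1.0))
    cuts_rhs.append(-(b @ u))

print(f"x = {x_bar}, total cost = {float(c @ x_bar + q):.2f}, iterations = {it + 1}")
```

The paper's contribution concerns which cuts to generate and how to formulate the master; this loop is only the baseline being accelerated.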
Book

Convex Optimization Theory

TL;DR: An insightful, concise, and rigorous treatment of the basic theory of convex sets and functions in finite dimensions, and of the associated duality theory.
References
Journal Article

Methods of Conjugate Gradients for Solving Linear Systems

TL;DR: An iterative algorithm is given for solving a system Ax = k of n linear equations in n unknowns, and it is shown that this method is a special case of a very general method which also includes Gaussian elimination.
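The method is short enough to state in full. Below is a standard textbook rendering of the conjugate gradient iteration for a symmetric positive definite system Ax = k; the variable names follow the usual residual/direction notation rather than the paper's.

```python
import numpy as np

def conjugate_gradient(A, k, tol=1e-10):
    """Hestenes-Stiefel conjugate gradients for A x = k, A symmetric positive definite."""
    x = np.zeros_like(k, dtype=float)
    r = k - A @ x                  # residual
    p = r.copy()                   # first search direction
    rs = r @ r
    for _ in range(len(k)):        # at most n steps in exact arithmetic
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact minimizer of the quadratic along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # new direction, A-conjugate to the previous ones
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
k = np.array([1.0, 2.0])
print(conjugate_gradient(A, k))    # agrees with np.linalg.solve(A, k)
```

In exact arithmetic the loop terminates in at most n steps; this finite-termination property on quadratics is what Wolfe's conjugate subgradient method extends beyond the differentiable case.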
Journal Article

Validation of subgradient optimization

TL;DR: It is concluded that the “relaxation” procedure for approximately solving a large linear programming problem related to the traveling-salesman problem shows promise for large-scale linear programming.
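The step rule validated there is easy to state in code. The sketch below applies a target-value step of the kind Held, Wolfe, and Crowder studied, step = lam*(f(x) - f_target)/||g||^2 with 0 < lam <= 2, to a toy nondifferentiable function, assuming the optimal value is known (real applications must estimate it).

```python
import numpy as np

def target_value_subgradient(f, subgrad, x0, f_target, lam=1.0, iters=100):
    """Subgradient method with step lam*(f(x) - f_target)/||g||^2, 0 < lam <= 2."""
    x = x0.astype(float)
    best = x.copy()
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_target
        if gap <= 1e-12 or g @ g == 0.0:
            break
        x = x - (lam * gap / (g @ g)) * g
        if f(x) < f(best):
            best = x.copy()        # iterates are not monotone; keep the best
    return best

# example: f(x) = ||x||_1 with known optimal value 0
f = lambda x: np.abs(x).sum()
subgrad = np.sign                   # a valid subgradient of the 1-norm
print(target_value_subgradient(f, subgrad, np.array([3.0, -2.0]), f_target=0.0))
```

In the paper the same rule drives the Lagrange multiplier updates for the Held-Karp traveling-salesman bound.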
Journal Article

Convergence Conditions for Ascent Methods. II

Philip Wolfe
01 Apr 1969
TL;DR: In this article, the authors give liberal conditions on the steps of a "descent" method for finding extrema of a function and show that most known results are special cases.
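The step-length tests given in these papers are now commonly called the Wolfe conditions: for minimization, f(x + t*d) <= f(x) + c1*t*g(x)'d (sufficient decrease) and g(x + t*d)'d >= c2*g(x)'d (curvature), with 0 < c1 < c2 < 1. The checker below is a hypothetical helper with conventional constants, not code from the paper.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Check both Wolfe conditions for a step of length t along direction d
    (conventional constants; minimization form)."""
    gd = grad(x) @ d                                    # directional derivative, < 0 for descent
    sufficient_decrease = f(x + t * d) <= f(x) + c1 * t * gd
    curvature = grad(x + t * d) @ d >= c2 * gd          # slope must have risen enough
    return sufficient_decrease and curvature

# example: f(x) = x'x from (1, 1) along the steepest descent direction
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
print(satisfies_wolfe(f, grad, x, d=-grad(x), t=0.5))   # True: the exact line minimum qualifies
```

Line searches that enforce both tests are what make descent iterations, including the nonlinear conjugate gradient methods cited above, provably convergent.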
Journal Article

The decomposition algorithm for linear programs

George B. Dantzig, Philip Wolfe
01 Oct 1961
TL;DR: In this article, a procedure is presented for the efficient computational solution of linear programs having a certain structural property characteristic of a large class of problems of practical interest; this property makes it possible to decompose the problem into a sequence of small linear programs whose iterated solutions solve the given problem through a generalization of the simplex method.
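The computational core of the scheme, a small "restricted master" program that repeatedly prices in new columns, can be seen on the classic cutting-stock example. Everything below (the instance, the brute-force pricing loop, the scipy calls) is an illustrative assumption; it sketches delayed column generation rather than the paper's general block-angular development.

```python
import numpy as np
from scipy.optimize import linprog
from itertools import product

# toy cutting-stock instance (invented): rolls of width W, pieces of given widths/demands
W = 100.0
widths = np.array([45.0, 36.0, 31.0])
demand = np.array([48.0, 35.0, 24.0])

# start with one trivial pattern per width: as many pieces of that width as fit
patterns = [np.eye(len(widths))[i] * (W // widths[i]) for i in range(len(widths))]

def solve_master(patterns):
    A = np.column_stack(patterns)           # columns = cutting patterns
    res = linprog(np.ones(A.shape[1]),      # each pattern consumes one roll
                  A_ub=-A, b_ub=-demand,    # A @ lam >= demand
                  bounds=(0, None), method="highs")
    return res, -res.ineqlin.marginals      # duals = shadow prices of the demands

for it in range(50):
    res, duals = solve_master(patterns)
    # pricing: find the pattern with the most negative reduced cost 1 - duals @ a
    best_val, best_pat = 0.0, None
    ranges = [range(int(W // w) + 1) for w in widths]
    for a in product(*ranges):              # brute-force enumeration, fine for a toy
        a = np.array(a, dtype=float)
        if widths @ a <= W and duals @ a > best_val:
            best_val, best_pat = duals @ a, a
    if best_val <= 1.0 + 1e-9:              # no improving column: LP relaxation optimal
        break
    patterns.append(best_pat)

print(f"rolls needed (LP bound): {res.fun:.2f} after {it + 1} master solves, {len(patterns)} patterns")
```

The pricing step plays the role of the simplex method's column selection, which is the sense in which the procedure generalizes the simplex method.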