Journal ArticleDOI

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
01 Jan 1997 - Vol. 7, Iss. 1, pp. 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
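The scheme the abstract describes can be sketched as follows: take a gradient step scaled by the Barzilai-Borwein steplength, and accept it only if it satisfies the Grippo-Lampariello-Lucidi (GLL) nonmonotone Armijo test against the maximum of the last few function values, backtracking otherwise. This is a minimal illustrative sketch, not the paper's implementation; the function names, default parameters, and safeguards are assumptions.

```python
import numpy as np

def global_bb(f, grad, x0, max_iter=1000, memory=10, gamma=1e-4, tol=1e-8):
    """Illustrative global Barzilai-Borwein method with a GLL-style
    nonmonotone line search. Parameter names and defaults are assumed."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                            # first iteration: plain gradient step
    f_hist = [f(x)]                        # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        lam = 1.0
        f_ref = max(f_hist)                # nonmonotone reference: max over the window
        # Backtrack until the nonmonotone Armijo condition holds:
        # f(x - lam*alpha*g) <= max_j f(x_{k-j}) - gamma*lam*alpha*g'g
        while f(x - lam * alpha * g) > f_ref - gamma * lam * alpha * (g @ g):
            lam *= 0.5
        s = -lam * alpha * g               # accepted step
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # BB steplength for the next iteration: s's / s'y (safeguarded)
        sty = s @ y
        alpha = (s @ s) / sty if sty > 1e-12 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
        if len(f_hist) > memory:           # keep only the last `memory` values
            f_hist.pop(0)
    return x
```

On a strictly convex quadratic the loop reduces to the original Barzilai-Borwein iteration whenever the nonmonotone test accepts the full step, which is why the method typically needs few line searches.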


Citations
Book

Machine Learning : A Probabilistic Perspective

TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Journal ArticleDOI

Nonmonotone Spectral Projected Gradient Methods on Convex Sets

TL;DR: The classical projected gradient schemes are extended to include a nonmonotone steplength strategy based on the Grippo--Lampariello--Lucidi nonmonotone line search, combined with the spectral gradient choice of steplength to accelerate convergence.
Journal ArticleDOI

A feasible method for optimization with orthogonality constraints

TL;DR: The Cayley transform, a Crank-Nicolson-like update scheme, is applied to preserve the constraints; based on it, curvilinear search algorithms with low per-iteration flops are developed that are highly efficient for polynomial optimization, nearest correlation matrix estimation, and extreme eigenvalue problems.
Journal ArticleDOI

A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization

TL;DR: For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.

An Unconstrained Optimization Test Functions Collection

TL;DR: A collection of unconstrained optimization test functions is presented to give the optimization community a large number of general test functions to be used in testing unconstrained optimization algorithms and in comparison studies.
References
Journal ArticleDOI

Testing Unconstrained Optimization Software

TL;DR: A relatively large but easy-to-use collection of test functions is presented, along with designed guidelines for testing the reliability and robustness of unconstrained optimization software.
Journal ArticleDOI

A nonmonotone line search technique for Newton's method

TL;DR: In this paper, a nonmonotone steplength selection rule for Newton's method is proposed, which can be viewed as a generalization of Armijo's rule.
Journal ArticleDOI

Global Convergence Properties of Conjugate Gradient Methods for Optimization

TL;DR: This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches, covering two classes of methods that are globally convergent on smooth, nonconvex functions.
Journal ArticleDOI

On the Barzilai and Borwein choice of steplength for the gradient method

TL;DR: In this article, the convergence of the Barzilai and Borwein gradient method was established for the minimization of a strictly convex quadratic function of any number of variables.
Book ChapterDOI

Nonconvex minimization calculations and the conjugate gradient method

TL;DR: This work considers the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets.