Journal ArticleDOI

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
01 Jan 1997 - Vol. 7, Iss. 1, pp. 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
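To make the method described above concrete, the following is a minimal sketch of a global Barzilai and Borwein (GBB) iteration with a GLL-style nonmonotone line search, assuming a smooth objective f with gradient grad_f. The memory length M, the sufficient-decrease parameter gamma, and the step-halving backtracking rule are illustrative choices, not the exact settings used in the paper.

import numpy as np

def gbb(f, grad_f, x0, M=10, gamma=1e-4, max_iter=1000, tol=1e-6):
    """Sketch of a global Barzilai-Borwein method with a GLL nonmonotone line search."""
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    alpha = 1.0                      # initial step length (plain gradient step)
    f_hist = [f(x)]                  # recent function values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        lam = 1.0
        f_ref = max(f_hist[-M:])     # nonmonotone reference value (GLL)
        # backtrack until the nonmonotone Armijo-type condition holds
        while f(x - lam * alpha * g) > f_ref - gamma * lam * alpha * g.dot(g):
            lam *= 0.5
        x_new = x - lam * alpha * g
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        # BB step length for the next iteration; fall back to 1.0 if curvature is nonpositive
        alpha = s.dot(s) / s.dot(y) if s.dot(y) > 0 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

Here the step length alpha is the two-point (BB) stepsize computed from the latest pair of iterates and gradients, and the acceptance test compares against the maximum of the last M function values rather than enforcing monotone descent.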


Citations
Journal ArticleDOI

Minimization Algorithms Based on Supervisor and Searcher Cooperation

TL;DR: This work proposes a framework for designing new minimization algorithms with desirable characteristics based on supervisor-searcher cooperation, and examines a gradient algorithm within this class.
Journal ArticleDOI

Nonmonotone line search methods with variable sample size

TL;DR: Nonmonotonicity of the line search combines well with the variable sample size scheme: it allows more freedom in choosing the search direction and the step size while the sample size is below its maximum, and it increases the chances of finding a global solution.
Journal ArticleDOI

A Barzilai-Borwein conjugate gradient method

TL;DR: When the parameter in the method is chosen by combining it with the Barzilai-Borwein idea, initial numerical experiments show that one of the variants, BBCG3, is especially efficient among many others without line searches; it may enjoy the asymptotic one-stepsize-per-line-search property and become a strong candidate for large-scale nonlinear optimization.
Journal ArticleDOI

Retrieved optical properties of thin films on absorbing substrates from transmittance measurements by application of a spectral projected gradient method

TL;DR: In this article, a spectral projected gradient method (SPGM) is used to minimize the difference between measured and computed transmittance spectra, providing an efficient way to obtain the optical constants of thin films from spectroscopic measurements.
Journal ArticleDOI

Convergence of descent method without line search

TL;DR: A new descent method without line search for unconstrained optimization problems is proposed, and the global convergence of the new algorithm is analyzed theoretically under mild conditions.
References
Book

Practical Methods of Optimization

TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, built up from Newton's method for solving one equation in one unknown.
Book

Numerical methods for unconstrained optimization and nonlinear equations

TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, and methods for solving nonlinear least-squares problems with special structure.
Journal ArticleDOI

Two-Point Step Size Gradient Methods

TL;DR: Studies new gradient descent methods for the approximate solution of the unconstrained minimization problem.
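For reference, the two stepsizes proposed in that paper, written with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, are commonly stated as

\alpha_k^{BB1} = \frac{s_{k-1}^\top s_{k-1}}{s_{k-1}^\top y_{k-1}}, \qquad \alpha_k^{BB2} = \frac{s_{k-1}^\top y_{k-1}}{y_{k-1}^\top y_{k-1}},

after which the iterate is updated by the plain gradient step x_{k+1} = x_k - \alpha_k g_k. The GBB method in the article above uses a stepsize of this form together with the nonmonotone acceptance test sketched earlier.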