Journal ArticleDOI

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
01 Jan 1997
Vol. 7, Iss. 1, pp. 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
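To make the scheme concrete, below is a minimal Python sketch of a globalized BB iteration of the kind the abstract describes: BB step sizes guarded by the Grippo-Lampariello-Lucidi (GLL) nonmonotone line search. All names and parameter values are illustrative, and the paper's safeguards (bounds keeping the BB scalar positive and finite) are only roughly approximated.

```python
import numpy as np

def global_bb(f, grad, x0, M=10, gamma=1e-4, alpha0=1.0, tol=1e-6, max_iter=1000):
    """Hypothetical sketch: BB gradient method with a GLL nonmonotone line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                       # BB scalar; the step is x - (1/alpha) * g
    history = [f(x)]                     # recent f-values for the nonmonotone test

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        lam = 1.0 / alpha
        f_ref = max(history[-M:])        # max over the last M function values
        # GLL condition: f(x - lam*g) <= max_{j<=M} f(x_{k-j}) - gamma*lam*||g||^2.
        while f(x - lam * g) > f_ref - gamma * lam * (g @ g):
            lam *= 0.5                   # backtrack until the step is accepted
        x_new = x - lam * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB scalar for the next step; reset when curvature is nonpositive.
        alpha = sy / (s @ s) if sy > 0 else alpha0
        x, g = x_new, g_new
        history.append(f(x))
    return x
```

The window M is what makes the search nonmonotone: BB iterates typically increase the objective occasionally, and a strictly monotone Armijo test would damp them back toward steepest descent, losing the reduction in line searches and gradient evaluations the numerical results report.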


Citations
Posted Content

Geometric Convergence for Distributed Optimization with Barzilai-Borwein Step Sizes

TL;DR: Simulation results on a distributed sensing problem show that the proposed distributed gradient method with BB step sizes converges geometrically to the optimal solution and outperforms several advanced methods in terms of iterations, gradient evaluations, and communication cost.
Journal ArticleDOI

Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems

TL;DR: Under mild assumptions, the proposed method is proved to be convergent on problems for which convergence of the Gauss-Newton method might not be ensured, and its linear convergence rate is shown to be better than that of the Gauss-Newton model for a class of nonzero-residue problems.
Book ChapterDOI

A Projection Method for Optimization Problems on the Stiefel Manifold

TL;DR: This paper proposes a feasible method for solving optimization problems with orthogonality constraints, based on projections and a curvilinear search; it computes an SVD in each iteration in order to preserve feasibility.
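As background on the projection step (standard linear algebra, not specific to this paper): if X has the thin SVD X = UΣVᵀ, the closest matrix with orthonormal columns in the Frobenius norm, i.e. the projection onto the Stiefel manifold, is

```latex
\pi(X) = U V^{\top}, \qquad X = U \Sigma V^{\top} \ \text{(thin SVD)},
\qquad \pi(X) \in \operatorname*{arg\,min}_{Q^{\top} Q = I} \; \lVert Q - X \rVert_F .
```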
Journal ArticleDOI

New stepsizes for the gradient method

TL;DR: A new framework is proposed for generating step sizes for gradient methods applied to convex quadratic minimization problems by adopting different criteria; analysis and experiments show that the new methods have lower complexity and outperform existing gradient methods.
Journal ArticleDOI

The Hager–Zhang conjugate gradient algorithm for large-scale nonlinear equations

TL;DR: The Hager–Zhang (HZ) conjugate gradient (CG) algorithm is studied for large-scale smooth optimization problems and a modified HZ (MHZ) CG method is proposed.
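For context, the unmodified Hager-Zhang direction (from Hager and Zhang's CG_DESCENT; the MHZ method above alters this scheme in ways the summary does not detail) is, with y_k = g_{k+1} - g_k:

```latex
d_{k+1} = -g_{k+1} + \beta_k^{HZ} d_k, \qquad
\beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}
\left( y_k - 2 d_k \frac{\lVert y_k \rVert^2}{d_k^{\top} y_k} \right)^{\!\top} g_{k+1}.
```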
References
Book

Practical Methods of Optimization

TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: In this book, Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, building from Newton's method for solving one equation in one unknown.
Journal ArticleDOI

Two-Point Step Size Gradient Methods

TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.
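For reference, the two step sizes proposed by Barzilai and Borwein, with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, are

```latex
\alpha_k^{BB1} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}},
\qquad
x_{k+1} = x_k - \alpha_k g_k .
```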