Journal ArticleDOI

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
01 Jan 1997 - Vol. 7, Iss. 1, pp. 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
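The globalization strategy the abstract describes can be sketched in code: take the Barzilai-Borwein step size, accept it when a nonmonotone Armijo-type condition (in the spirit of Grippo, Lampariello, and Lucidi) holds against the maximum of the last few function values, and backtrack otherwise. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names, the parameter choices (`M`, `gamma`), and the simple halving backtrack are all assumptions made for the example.

```python
import numpy as np

def global_bb(f, grad, x0, M=10, gamma=1e-4, max_iter=500, tol=1e-8):
    """Sketch of a globalized Barzilai-Borwein gradient method.

    Uses the BB step alpha = (s^T s)/(s^T y) and accepts a trial step
    along -grad when the nonmonotone condition
        f(x - a*g) <= max(last M f-values) - gamma * a * g^T g
    holds, halving the step otherwise. Details are illustrative.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # conservative first step
    f_hist = [f(x)]                            # recent function values
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        f_ref = max(f_hist[-M:])               # nonmonotone reference
        a = alpha
        # backtrack until the nonmonotone sufficient-decrease test holds
        while f(x - a * g) > f_ref - gamma * a * (g @ g):
            a *= 0.5
        x_new = x - a * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s @ y
        alpha = (s @ s) / sty if sty > 0 else 1.0  # BB1 step size
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

The key point mirrored from the abstract: no monotone decrease of f is enforced, only decrease relative to the maximum of a sliding window of past values, which lets the method keep the cheap BB step on most iterations.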

Citations

Research Article

Benchmarking Large Scale Distributed Convex

TL;DR: It turns out that the alternating direction method of multipliers (ADMM) and the restarted version of the fast gradient method are the best methods for solving decomposable QPs in terms of the number of necessary lower level QP solutions.
Book ChapterDOI

NPSOG: A New Hybrid Method for Unconstrained Differentiable Optimization

TL;DR: The experimental results on the test functions indicate that the NPSOG algorithm can considerably improve the performance of PSO; its viability as an optimization method is demonstrated by comparison with classical kinds of hybridization.
Journal ArticleDOI

A New Nonmonotone Trust Region Barzilai-Borwein Method for Unconstrained Optimization Problems

TL;DR: Wang et al. propose a nonmonotone trust region Barzilai-Borwein (BB) method for solving unconstrained optimization problems, given by a novel combination of a modified Metropolis criterion, the BB step size, and the trust region method.
Posted Content

A relaxed interior point method for low-rank semidefinite programming problems with applications to matrix completion

TL;DR: In this article, a new relaxed variant of the interior point method for low-rank semidefinite programming problems is proposed, which takes a step outside of the usual interior point framework.
Journal ArticleDOI

A New Conjugate Gradient Method for Unconstrained Optimization Problems with Descent Property

TL;DR: In this paper, a new conjugate gradient method for solving nonlinear unconstrained optimization problems is proposed; its parameter consists of three parts, the first of which is the Hestenes-Stiefel (HS) parameter.
References
Book

Practical Methods of Optimization

TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, building from Newton's method for solving one equation in one unknown and the convergence of sequences of real numbers.
Book

Numerical methods for unconstrained optimization and nonlinear equations

TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, and methods for solving nonlinear least-squares problems with special structure.
Journal ArticleDOI

Two-Point Step Size Gradient Methods

TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.