Journal Article

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
01 Jan 1997 - Vol. 7, Iss. 1, pp. 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
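For intuition, here is a minimal Python sketch of the kind of scheme the abstract describes: the Barzilai and Borwein step is tried first, and a GLL-style nonmonotone backtracking test decides whether to shorten it. This is an illustrative reading of the abstract, not Raydan's exact algorithm; the memory length M, the sufficient-decrease parameter gamma, and the safeguard constants are assumptions.

```python
import numpy as np

def gbb(f, grad, x0, M=10, gamma=1e-4, tol=1e-6, max_iter=1000):
    """Sketch of a globalized Barzilai-Borwein (BB) iteration with a
    GLL-style nonmonotone line search; parameter values are illustrative."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    f_hist = [f(x)]               # recent objective values for the nonmonotone test
    alpha = 1.0                   # first trial step length (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -g                    # negative gradient direction
        lam = alpha               # try the BB step first
        f_ref = max(f_hist[-M:])  # reference value: max over the last M iterates
        # Nonmonotone (GLL-type) acceptance test with simple backtracking
        while f(x + lam * d) > f_ref + gamma * lam * g.dot(d) and lam > 1e-16:
            lam *= 0.5
        s = lam * d               # step actually taken
        x = x + s
        g_new = grad(x)
        y = g_new - g             # gradient difference
        sy = s.dot(y)
        # BB step length for the next trial: s's / s'y, safeguarded when s'y <= 0
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        g = g_new
        f_hist.append(f(x))
    return x
```

Because the full BB step is accepted whenever it passes the nonmonotone test, backtracking is only triggered occasionally, which is consistent with the reduction in line searches reported above.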


Citations
Journal Article

Runge-Kutta-like scaling techniques for first-order methods in convex optimization

TL;DR: Inspired by the equivalence between the forward Euler scheme and the gradient descent method, the analysis of the family of Runge-Kutta methods is broadened to show that they enjoy a natural interpretation as first-order optimization algorithms.
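The equivalence this TL;DR appeals to can be checked in a few lines: one forward Euler step of size h on the gradient flow x'(t) = -grad f(x) is exactly one gradient descent step with step size h. A minimal sketch (the quadratic test function is an arbitrary choice):

```python
import numpy as np

def forward_euler_step(grad_f, x, h):
    # One explicit Euler step on the gradient flow x'(t) = -grad_f(x)
    return x + h * (-grad_f(x))

def gradient_descent_step(grad_f, x, h):
    # Standard gradient descent update with step size h
    return x - h * grad_f(x)

grad_f = lambda x: 2.0 * x   # gradient of f(x) = ||x||^2 (illustrative choice)
x = np.array([1.0, -2.0])
assert np.allclose(forward_euler_step(grad_f, x, 0.1),
                   gradient_descent_step(grad_f, x, 0.1))
```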

Development & Implementation of Algorithms for Fast Image Reconstruction

TL;DR: This work introduces an algorithm that aims to solve the corresponding problem and accurately reconstruct the desired signal or image, based upon the Barzilai-Borwein algorithm and tailored specifically to the compressed sensing framework.
Journal Article

The Uzawa-MBB type algorithm for nonsymmetric saddle point problems

TL;DR: The main contributions of the paper are that both new algorithms are constructed from optimization algorithms, use a special descent direction based on Xu et al. (2018), and combine the modified Barzilai–Borwein method with a modified GLL line search strategy to solve the derived least squares problem.

A Memory Gradient Method under new Nonmonotone Line Search Technique for Unconstrained Optimization

Ke Su et al.
TL;DR: The global convergence of the algorithm is proved under mild conditions, and numerical results show that the proposed method is efficient on standard test problems when a suitable value is chosen for the parameter included in the method.
Journal Article

A Schrödinger-type algorithm for solving the Schrödinger equations via Phragmén–Lindelöf inequalities

TL;DR: In this paper, the Schrödinger equations were solved via Phragmén-Lindelöf inequalities under the order induced by a symmetric cone, with the function involved being monotone.
References
Book

Practical Methods of Optimization

TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: In this book, Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, building up from Newton's method for solving a single equation in one unknown.
Book

Numerical methods for unconstrained optimization and nonlinear equations

TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, along with methods for solving nonlinear least-squares problems with special structure.
Journal Article

Two-Point Step Size Gradient Methods

TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.
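For context, the two step lengths proposed in this reference are commonly written as follows, with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}; the BB1/BB2 labels are common shorthand from the later literature rather than the paper's own notation.

```latex
x_{k+1} = x_k - \alpha_k g_k, \qquad
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}, \qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}.
```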