Journal ArticleDOI

The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem

Marcos Raydan
- 01 Jan 1997 - 
- Vol. 7, Iss: 1, pp 26-33
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
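The core of the method described above is the two-point step size: instead of a line search, each iteration reuses the previous step and gradient differences to choose the step length. As a hedged illustration (not the paper's globalized, nonmonotone variant), the following sketch applies the basic BB step to a convex quadratic, where global convergence is known to hold; the function and parameter names are illustrative.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Sketch of the basic Barzilai-Borwein gradient method for
    minimizing the convex quadratic f(x) = 0.5 x^T A x - b^T x.
    No line search is used, and descent is not guaranteed at
    every iteration."""
    x = x0.astype(float)
    g = A @ x - b            # gradient of the quadratic
    alpha = 1.0              # initial step size (arbitrary choice)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s = x_new - x        # difference of iterates
        y = g_new - g        # difference of gradients
        # Two-point (BB) step size: requires only inner products,
        # so storage and per-iteration cost stay very low.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Small example: the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = bb_gradient(A, b, np.zeros(2))
```

For nonquadratic objectives, the paper combines this step with the Grippo-Lampariello-Lucidi nonmonotone line search, which accepts a trial step if it decreases the objective relative to the maximum over a window of recent function values rather than the most recent one.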


Citations
Journal ArticleDOI

Taking the 4D Nature of fMRI Data Into Account Promises Significant Gains in Data Completion

TL;DR: In this article, a tensor train decomposition is used to estimate missing brain voxels in fMRI data; the proposed Riemannian nonlinear spectral conjugate gradient (RSCG) optimization method exploits the decomposition to obtain compact representations and efficient linear algebra operations.
Journal ArticleDOI

An extended delayed weighted gradient algorithm for solving strongly convex optimization problems

TL;DR: In this paper, the authors extend the delayed weighted gradient method (DWGM) to solve strictly convex nonquadratic minimization problems while keeping a low computational cost per iteration.
Journal ArticleDOI

Regularized graph cuts based discrete tomography reconstruction methods

TL;DR: In this article, the shape circularity was introduced as a new regularization and incorporated in a graph cuts based computed tomography reconstruction approach, thus introducing a new energy-minimization based reconstruction algorithm for binary tomography.
Journal ArticleDOI

Alternating Direction Multiplier Method for Matrix l2,1-Norm Optimization in Multitask Feature Learning Problems

TL;DR: An alternating direction multiplier method combined with the spectral gradient method is proposed for solving the matrix l2,1-norm optimization problem that arises in multitask feature learning.
Proceedings ArticleDOI

Unsupervised multiobjective design for weighted median filters using genetic algorithm

TL;DR: The design of weighted median filters (WMFs) is formulated as a multiobjective optimization problem that treats preservation performance and restoration performance as trade-off objectives, yielding a wide variety of filters, from high preservation to high restoration, in a single search process.
References
Book

Practical Methods of Optimization

TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: This book proposes a modular system of algorithms for unconstrained minimization and nonlinear equations, built up from Newton's method for solving one equation in one unknown.
Book

Numerical methods for unconstrained optimization and nonlinear equations

TL;DR: Newton's Method for Nonlinear Equations and Unconstrained Minimization and methods for solving nonlinear least-squares problems with Special Structure.
Journal ArticleDOI

Two-Point Step Size Gradient Methods

TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.