Journal ArticleDOI
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract:
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
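The scheme described in the abstract can be sketched as follows: take a trial step of the Barzilai-Borwein length along the negative gradient, accept it if a nonmonotone Armijo condition (comparing against the maximum of the last M function values, as in Grippo, Lampariello, and Lucidi) holds, and otherwise backtrack. This is only a minimal illustration, not the paper's exact algorithm; the function name `gbb` and all parameter values (M, gamma, the backtracking factor, the safeguard on the step size) are illustrative assumptions.

```python
import numpy as np

def gbb(f, grad, x0, max_iter=500, M=10, gamma=1e-4, tol=1e-6):
    """Sketch of a globalized Barzilai-Borwein gradient method with a
    GLL-style nonmonotone line search. Parameter choices are illustrative."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                # initial trial step length
    f_hist = [f(x)]            # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g                 # steepest-descent direction
        lam = alpha            # start the line search from the BB step
        f_max = max(f_hist[-M:])
        # Nonmonotone Armijo: f(x + lam d) <= max recent f + gamma lam g^T d
        while f(x + lam * d) > f_max + gamma * lam * g.dot(d):
            lam *= 0.5         # backtrack
        s = lam * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Barzilai-Borwein step length for the next iteration:
        # alpha = s^T s / s^T y, safeguarded when s^T y is not positive
        sy = s.dot(y)
        alpha = s.dot(s) / sy if sy > 0 else 1.0
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x
```

On a convex quadratic such as f(x) = (1/2) x^T A x with A positive definite, the iterates approach the minimizer at the origin; the nonmonotone test is what lets the method retain the (typically nonmonotone) BB steps while still guaranteeing global convergence in the nonquadratic case.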
Citations
Journal ArticleDOI
Benchmarking large-scale distributed convex quadratic programming algorithms
TL;DR: It turns out that the alternating direction method of multipliers and the restarted version of the fast gradient method are the best methods for solving decomposable QPs in terms of the number of lower-level QP solutions required.
Journal ArticleDOI
A robust implementation of a sequential quadratic programming algorithm with successive error restoration
TL;DR: The purpose of this paper is to show by numerical experimentation that the sequential quadratic programming method can be stabilized substantially, and how initial and periodic scaled restarts improve the efficiency in situations with slow convergence.
Journal ArticleDOI
A Dynamical Tikhonov Regularization for Solving Ill-posed Linear Algebraic Systems
TL;DR: In this article, an adaptive Tikhonov method is proposed to solve ill-posed linear algebraic problems. It consists in building a numerical minimizing vector sequence that remains on an invariant manifold, so that the Tikhonov parameter can be optimally computed at each iteration by minimizing a proper merit function.
Journal ArticleDOI
A Derivative-Free Conjugate Gradient Method and Its Global Convergence for Solving Symmetric Nonlinear Equations
TL;DR: Numerical results on some benchmark test problems show that the proposed conjugate gradient method is practically effective, and that its lower storage requirement gives it an advantage over some existing methods on relatively large-scale problems.
Journal ArticleDOI
On the steepest descent algorithm for quadratic functions
TL;DR: A new method for estimating short steps and a method alternating Cauchy and short steps are proposed; the roots of a certain Chebyshev polynomial are used to further accelerate the methods.
References
Book
Practical Methods of Optimization
TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book
Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)
TL;DR: In this book, Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, built on Newton's method for solving one equation in one unknown.
Book
Numerical methods for unconstrained optimization and nonlinear equations
TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, and methods for solving nonlinear least-squares problems with special structure.
Journal ArticleDOI
Two-Point Step Size Gradient Methods
TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.