Journal ArticleDOI
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
TLDR
Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract:
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
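The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: function names, the backtracking factor, and the default parameter values are all assumptions.

```python
import numpy as np

def global_bb(f, grad, x0, max_iter=500, M=10, gamma=1e-4, tol=1e-6):
    """Sketch of a globalized Barzilai-Borwein method: the BB trial step is
    accepted through a nonmonotone line search in the spirit of Grippo,
    Lampariello, and Lucidi, which compares the new function value against
    the maximum of the last M values rather than the most recent one.
    All parameter names and default values here are illustrative."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                      # initial step length (assumption)
    fvals = [f(x)]                   # history for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d, lam = -g, alpha
        fmax = max(fvals[-M:])
        # backtrack until f(x + lam*d) <= max recent f + gamma*lam*g'd
        while f(x + lam * d) > fmax + gamma * lam * g.dot(d):
            lam *= 0.5
        s = lam * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s.dot(y)
        # BB step length for the next iteration: alpha = s's / s'y
        alpha = s.dot(s) / sy if sy > 1e-12 else 1.0
        x, g = x_new, g_new
        fvals.append(f(x))
    return x
```

On a convex quadratic f(x) = ½xᵀAx, the trial BB step is often accepted without any backtracking, which is where the reported savings in line searches come from.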
Citations
Journal ArticleDOI
Estimation of the Optical Constants and the Thickness of Thin Films Using Unconstrained Optimization
TL;DR: This paper introduces an unconstrained formulation of the nonlinear programming model and solves the estimation problem using a method based on repeated calls to a recently introduced unconstrained minimization algorithm.
Journal ArticleDOI
Distributed Basis Pursuit
TL;DR: The algorithm, named D-ADMM, is a decentralized implementation of the alternating direction method of multipliers, and it is shown through numerical simulation that the algorithm requires considerably less communication between the nodes than the state-of-the-art algorithms.
Journal ArticleDOI
The cyclic Barzilai–Borwein method for unconstrained optimization
TL;DR: Numerical evidence indicates that when m > n/2 ≥ 3, where n is the problem dimension, CBB is locally superlinearly convergent, and it is proved that the convergence rate is no better than linear, in general.
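The cyclic idea summarized above, reusing one BB step length for m consecutive gradient steps, can be sketched as follows. This is an illustrative, non-globalized sketch; all names and defaults are assumptions, so it is only suitable for well-conditioned convex problems.

```python
import numpy as np

def cbb(grad, x0, m=4, n_iter=200):
    """Sketch of the cyclic Barzilai-Borwein (CBB) idea: the BB step length
    alpha = s's / s'y is recomputed only once every m iterations and reused
    in between. No line search or other globalization is included, so this
    sketch is only a toy for smooth, well-conditioned convex problems."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                     # conservative initial step (assumption)
    for k in range(n_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if k % m == m - 1:           # refresh the step length once per cycle
            s, y = x_new - x, g_new - g
            sy = s.dot(y)
            if sy > 1e-12:
                alpha = s.dot(s) / sy
        x, g = x_new, g_new
    return x
```

Skipping the two inner products on most iterations is the source of CBB's per-iteration savings over plain BB.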
Journal ArticleDOI
BB: An R Package for Solving a Large System of Nonlinear Equations and for Optimizing a High-Dimensional Nonlinear Objective Function
Ravi Varadhan, Paul Gilbert, et al.
TL;DR: R package BB is discussed, in particular, its capabilities for solving a nonlinear system of equations, and the utility of these functions for solving large systems of nonlinear equations, smooth, nonlinear estimating equations in statistical modeling, and non-smooth estimating equations arising in rank-based regression modeling of censored failure time data.
Journal ArticleDOI
Gradient Methods with Adaptive Step-Sizes
Bin Zhou, Li Gao, Yu-Hong Dai, et al.
TL;DR: Although the new algorithms are still linearly convergent in the quadratic case, numerical experiments indicate that they compare favorably with the BB method and some other efficient gradient methods.
References
Book
Practical Methods of Optimization
TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book
Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)
TL;DR: In this book, Dennis and Schnabel propose a modular system of algorithms for unconstrained minimization and nonlinear equations, built on Newton's method for solving one equation in one unknown and on the convergence of sequences of real numbers.
Book
Numerical methods for unconstrained optimization and nonlinear equations
TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, and methods for solving nonlinear least-squares problems with special structure.
Journal ArticleDOI
Two-Point Step Size Gradient Methods
TL;DR: New gradient descent methods for the approximate solution of the unconstrained minimization problem are studied.
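For reference, the two step sizes introduced in this paper, which the works above build on, are usually written as follows (the BB1/BB2 labels are conventional later naming, and the notation s, y for the iterate and gradient differences is assumed):

```latex
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^T s_{k-1}}{s_{k-1}^T y_{k-1}},
\qquad
\alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^T y_{k-1}}{y_{k-1}^T y_{k-1}},
\quad\text{where } s_{k-1} = x_k - x_{k-1},\; y_{k-1} = g_k - g_{k-1}.
```

Both are least-squares solutions of secant-like conditions, so each step length carries curvature information from the two most recent iterates at the cost of a plain gradient step.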