Journal Article (DOI)
The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
TL;DR: Results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
Abstract:
The Barzilai and Borwein gradient method for the solution of large scale unconstrained minimization problems is considered. This method requires few storage locations and very inexpensive computations. Furthermore, it does not guarantee descent in the objective function and no line search is required. Recently, the global convergence for the convex quadratic case has been established. However, for the nonquadratic case, the method needs to be incorporated in a globalization scheme. In this work, a nonmonotone line search strategy that guarantees global convergence is combined with the Barzilai and Borwein method. This strategy is based on the nonmonotone line search technique proposed by Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716]. Numerical results to compare the behavior of this method with recent implementations of the conjugate gradient method are presented. These results indicate that the global Barzilai and Borwein method may allow some significant reduction in the number of line searches and also in the number of gradient evaluations.
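The scheme described in the abstract can be sketched in a few lines: take the negative gradient as the search direction, try a Barzilai-Borwein step length computed from the last two iterates, and backtrack only when the Grippo-Lampariello-Lucidi nonmonotone condition fails. This is an illustrative sketch, not the paper's exact implementation; the parameter names (`M` for the nonmonotone memory, `gamma` for the sufficient-decrease constant) and their default values are assumptions.

```python
import numpy as np

def global_bb(f, grad, x0, max_iter=500, M=10, gamma=1e-4, tol=1e-8):
    """Sketch of a globalized Barzilai-Borwein gradient method:
    BB step lengths safeguarded by a GLL nonmonotone line search.
    Parameter defaults are illustrative, not taken from the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    lam = 1.0                  # trial step length for the first iteration
    fvals = [f(x)]             # history of f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g                 # steepest-descent direction
        fmax = max(fvals[-M:]) # reference value: max over the last M iterates
        t = lam
        # Backtrack until the nonmonotone Armijo-type condition holds:
        # f(x + t d) <= max_{recent} f + gamma * t * g^T d
        while f(x + t * d) > fmax + gamma * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB step length for the next iteration (guard against s^T y <= 0)
        lam = (s @ s) / sy if sy > 0 else 1.0
        x, g = x_new, g_new
        fvals.append(f(x))
    return x
```

Because the line search compares against the maximum of the last `M` function values rather than the current one, the (typically excellent) BB step is accepted even when it temporarily increases the objective, which is what allows the reduction in line searches reported above.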
Citations
Journal Article (DOI)
Descent Direction Stochastic Approximation Algorithm with Adaptive Step Sizes
TL;DR: A stochastic approximation (SA) algorithm with new adaptive step sizes is proposed for solving unconstrained minimization problems in a noisy environment; numerical results support the theoretical expectations and verify the efficiency of the algorithm regardless of the chosen search direction and noise level.
Posted Content
A Barzilai-Borwein $l_1$-Regularized Least Squares Algorithm for Compressed Sensing
TL;DR: Methodologies for finding sparse solutions of under-determined linear systems are presented as background to the method used in this work, in which the problem is reformulated as an unconstrained convex optimization problem.
Posted Content
Exact Spectral-Like Gradient Method for Distributed Optimization
TL;DR: In this article, the authors proposed a distributed spectral gradient method (DSG) for unconstrained distributed optimization problems where nodes constitute an arbitrary connected network and collaboratively minimize the sum of their local convex cost functions.
Posted Content
Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature
TL;DR: In this paper, the gradient method for minimizing a differentiable convex function on Riemannian manifolds with lower bounded sectional curvature is analyzed with three different finite procedures for determining the stepsize.
Posted Content
Sparse Inverse Covariance Estimation via an Adaptive Gradient-Based Method
Suvrit Sra,Dongmin Kim +1 more
TL;DR: This work develops a new adaptive gradient-based method that carefully combines gradient information with an adaptive step-scaling strategy, which results in a scalable, highly competitive method that outperforms state-of-the-art competitors for large problems.
References
Book
Practical Methods of Optimization
TL;DR: The aim of this book is to provide a discussion of constrained optimization and its applications to linear programming and other optimization problems.
Book
Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)
TL;DR: In this book, Dennis and Schnabel propose a modular system of algorithms for unconstrained minimization and nonlinear equations, building up from Newton's method for solving one equation in one unknown and the convergence of sequences of real numbers.
Book
Numerical methods for unconstrained optimization and nonlinear equations
TL;DR: Newton's Method for Nonlinear Equations and Unconstrained Minimization and methods for solving nonlinear least-squares problems with Special Structure.
Journal Article (DOI)
Two-Point Step Size Gradient Methods
TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.