Journal ArticleDOI

A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization

Hongchao Zhang, William W. Hager
- 01 Apr 2004 - SIAM Journal on Optimization
- Vol. 14, Iss: 4, pp 1043-1056
TLDR
For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.
Abstract
A new nonmonotone line search algorithm is proposed and analyzed. In our scheme, we require that an average of the successive function values decreases, while the traditional nonmonotone approach of Grippo, Lampariello, and Lucidi [SIAM J. Numer. Anal., 23 (1986), pp. 707--716] requires that a maximum of recent function values decreases. We prove global convergence for nonconvex, smooth functions, and R-linear convergence for strongly convex functions. For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.
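The acceptance rule described in the abstract replaces the usual Armijo reference value f(x_k) with a weighted average C_k of the successive function values, updated through a running weight Q_k. A minimal Python sketch of one backtracking step under this condition follows; the parameter names (eta, delta, rho) and the default values are illustrative assumptions, not prescriptions from the paper.

```python
import numpy as np

def zhang_hager_step(f, grad_f, x, d, C, Q, eta=0.85, delta=1e-4, rho=0.5):
    """One backtracking step under the averaged nonmonotone condition:
    accept alpha once f(x + alpha*d) <= C + delta*alpha*grad_f(x)@d,
    where C is a weighted average of past function values, not f(x) itself."""
    g_dot_d = grad_f(x) @ d              # directional derivative; d must be a descent direction
    alpha = 1.0
    while f(x + alpha * d) > C + delta * alpha * g_dot_d:
        alpha *= rho                     # backtrack
    x_new = x + alpha * d
    Q_new = eta * Q + 1.0                # update the running weight
    C_new = (eta * Q * C + f(x_new)) / Q_new   # new averaged reference value
    return x_new, C_new, Q_new

# usage: steepest descent on a simple quadratic, with C_0 = f(x_0), Q_0 = 1
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x = np.array([3.0, -4.0])
C, Q = f(x), 1.0
for _ in range(25):
    x, C, Q = zhang_hager_step(f, grad, x, -grad(x), C, Q)
```

Setting eta = 0 recovers the ordinary monotone Armijo rule, since C then collapses to the most recent function value.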



Citations
Journal ArticleDOI

A feasible method for optimization with orthogonality constraints

TL;DR: The Cayley transform, a Crank-Nicolson-like update scheme, is applied to preserve the orthogonality constraints; curvilinear search algorithms built on it require fewer flops and perform efficiently on polynomial optimization, nearest correlation matrix estimation, and extreme eigenvalue problems.
Journal ArticleDOI

An efficient augmented Lagrangian method with applications to total variation minimization

TL;DR: An algorithm is proposed for a class of equality-constrained nonsmooth optimization problems (chiefly, but not necessarily, convex programs) with a particular structure; it combines an alternating direction technique with a nonmonotone line search to minimize the augmented Lagrangian function at each iteration.
Proceedings Article

Accelerated proximal gradient methods for nonconvex programming

TL;DR: This paper is the first to provide APG-type algorithms for general nonconvex and nonsmooth problems that ensure every accumulation point is a critical point; the O(1/k^2) convergence rate is retained when the problems are convex.
Dissertation

An efficient algorithm for total variation regularization with applications to the single pixel camera and compressive sensing

Chengbo Li
TL;DR: An efficient algorithm for total variation regularization is proposed, with applications to the single pixel camera and compressive sensing; the method applies to both single- and multi-pixel cameras.
Journal ArticleDOI

A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation

TL;DR: The code FPC_AS embeds this basic two-stage algorithm in a continuation (homotopy) approach by assigning a decreasing sequence of values to μ, and it exhibits state-of-the-art performance in both speed and the ability to recover sparse signals.
References
Journal ArticleDOI

On the limited memory BFGS method for large scale optimization

TL;DR: Numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and is better able to use additional storage to accelerate convergence; its convergence properties are studied, and global convergence is proved on uniformly convex problems.
Journal ArticleDOI

Updating Quasi-Newton Matrices With Limited Storage

TL;DR: An update formula is studied that generates matrices using information from the last m iterations, where m is any number supplied by the user; among the limited-storage approaches considered, the one based on the BFGS method is the most efficient.
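The limited-memory update described here is usually implemented via the standard two-loop recursion, which applies the implicit inverse-Hessian approximation to a vector using only the last m pairs s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i. A minimal sketch, assuming all stored pairs satisfy the curvature condition s_i @ y_i > 0:

```python
import numpy as np

def lbfgs_two_loop(g, s_list, y_list):
    """Compute H @ g, where H is the L-BFGS inverse-Hessian approximation
    built from the stored (s, y) pairs, without ever forming H explicitly."""
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)           # common initial scaling H_0 = gamma * I
    # second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q                             # the search direction is -q

# sanity check: with one stored pair, the implied H satisfies the secant
# condition H @ y = s, so feeding g = y must return s
s = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
d = lbfgs_two_loop(y, [s], [y])
```

The memory cost is O(m n) for n variables, which is what makes the method practical at large scale.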
Journal ArticleDOI

Two-Point Step Size Gradient Methods

TL;DR: A study of new gradient descent methods for the approximate solution of the unconstrained minimization problem.
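The two-point step sizes of this paper (the Barzilai-Borwein steps) are computed from the most recent displacement and gradient difference. A minimal sketch, with variable names chosen here for illustration:

```python
import numpy as np

def bb_steps(s, y):
    """Barzilai-Borwein two-point step sizes, computed from
    s = x_k - x_{k-1} and y = grad_k - grad_{k-1}."""
    alpha_bb1 = (s @ s) / (s @ y)   # "long" BB step
    alpha_bb2 = (s @ y) / (y @ y)   # "short" BB step
    return alpha_bb1, alpha_bb2

# on a quadratic f(x) = 0.5 * x @ (A @ x) we have y = A @ s, so both steps
# are reciprocals of Rayleigh quotients of A and lie in [1/lmax, 1/lmin]
A = np.diag([2.0, 5.0])
s = np.array([1.0, 1.0])
y = A @ s
a1, a2 = bb_steps(s, y)
```

Because the BB step lengths can make the raw gradient method nonmonotone, these methods are a natural client for nonmonotone line searches like the one in the main paper above.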
Journal ArticleDOI

Testing Unconstrained Optimization Software

TL;DR: A relatively large but easy-to-use collection of test functions and designed guidelines for testing the reliability and robustness of unconstrained optimization software.
Journal ArticleDOI

A nonmonotone line search technique for Newton's method

TL;DR: In this paper, a nonmonotone steplength selection rule for Newton's method is proposed, which can be viewed as a generalization of Armijo's rule.
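This is the Grippo-Lampariello-Lucidi (GLL) rule that the main paper contrasts with its averaging scheme: the trial point is compared against the maximum of the last M function values instead of f(x_k) alone. A minimal Python sketch, with parameter names (delta, rho, M) assumed for illustration:

```python
import numpy as np
from collections import deque

def gll_backtracking(f, g_dot_d, x, d, recent_f, delta=1e-4, rho=0.5):
    """Backtracking under the GLL nonmonotone condition: accept alpha once
    f(x + alpha*d) <= max(recent function values) + delta*alpha*g_dot_d,
    where recent_f is a bounded deque of the last M values of f."""
    f_ref = max(recent_f)                # max of recent values, not f(x) alone
    alpha = 1.0
    while f(x + alpha * d) > f_ref + delta * alpha * g_dot_d:
        alpha *= rho                     # backtrack
    x_new = x + alpha * d
    recent_f.append(f(x_new))            # deque with maxlen=M evicts the oldest
    return x_new

# usage: steepest descent on a simple quadratic with a memory of M = 10 values
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x = np.array([3.0, -4.0])
recent_f = deque([f(x)], maxlen=10)
for _ in range(30):
    d = -grad(x)
    x = gll_backtracking(f, grad(x) @ d, x, d, recent_f)
```

The main paper's averaged condition replaces `max(recent_f)` with a weighted mean of past values, which its experiments found cheaper in function and gradient evaluations.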