Topic

Line search

About: Line search is a research topic. Over its lifetime, 2,459 publications have been published within this topic, receiving 49,957 citations. The topic is also known as: line search method and method of line search.


Papers
Journal ArticleDOI
TL;DR: This paper analyzes several new methods for solving optimization problems with the objective function formed as a sum of two terms, one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure.
Abstract: In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure. Despite the absence of good properties of the sum, such problems, both in convex and nonconvex cases, can be solved with the efficiency typical for the smooth part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (with convergence rate $$O\left({1 \over k}\right)$$ ), and an accelerated multistep version with convergence rate $$O\left({1 \over k^2}\right)$$ , where $$k$$ is the iteration counter. For nonconvex problems with this structure, we prove convergence to a point from which there is no descent direction. In contrast, we show that for general nonsmooth, nonconvex problems, even resolving the question of whether a descent direction exists from a point is NP-hard. For all methods, we suggest some efficient “line search” procedures and show that the additional computational work necessary for estimating the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We also present the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.

1,444 citations
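
To make the composite structure concrete, here is a minimal sketch (Python/NumPy; not code from the paper) of a proximal-gradient iteration for objectives g(x) + h(x), where a backtracking search on the trial Lipschitz estimate L stands in for the "line search" over unknown problem-class parameters that the abstract mentions. The function names and the lasso instance are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Prox operator of t*||.||_1 -- the "simple" part for h(x) = lam*||x||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(g, grad_g, prox_h, x0, L0=1.0, eta=2.0, iters=200):
    # Minimize g(x) + h(x): gradient step on the smooth black-box part g,
    # prox step on the simple part h, with backtracking on the trial
    # Lipschitz estimate L (an unknown problem-class parameter).
    x, L = x0.copy(), L0
    for _ in range(iters):
        gx, dgx = g(x), grad_g(x)
        while True:
            y = prox_h(x - dgx / L, 1.0 / L)
            d = y - x
            # Accept once the quadratic model with constant L majorizes g at y.
            if g(y) <= gx + dgx @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= eta                      # backtrack: increase L and retry
        x, L = y, max(L / eta, L0)        # let L shrink again between iterations
    return x

# Illustrative instance: lasso, min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.normal(size=(40, 80)), rng.normal(size=40), 0.1
x_hat = proximal_gradient(
    g=lambda x: 0.5 * np.sum((A @ x - b) ** 2),
    grad_g=lambda x: A.T @ (A @ x - b),
    prox_h=lambda v, t: soft_threshold(v, lam * t),
    x0=np.zeros(80),
)
```

The inner loop increases L geometrically until the quadratic upper model majorizes g at the trial point, which is why, as the abstract notes, estimating the unknown parameter only multiplies the per-iteration cost by a small constant factor.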

Posted Content
TL;DR: This paper analyzes several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known.
Abstract: In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two convex terms: one is smooth and given by a black-box oracle, and another is general but simple and its structure is known. Despite the bad properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the good part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (with convergence rate $$O\left({1 \over k}\right)$$ ), and an accelerated multistep version with convergence rate $$O\left({1 \over k^2}\right)$$ , where $$k$$ is the iteration counter. For all methods, we suggest some efficient “line search” procedures and show that the additional computational work necessary for estimating the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We also present the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.

1,338 citations

Journal ArticleDOI
TL;DR: In this paper, a nonmonotone steplength selection rule for Newton's method is proposed, which can be viewed as a generalization of Armijo's rule.
Abstract: In this paper a nonmonotone steplength selection rule for Newton’s method is proposed, which can be viewed as a generalization of Armijo’s rule. Numerical results are reported which indicate that the proposed technique may allow a considerable saving both in the number of line searches and in the number of function evaluations.

1,098 citations
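
As a sketch of the rule described above (Python/NumPy; the window size and the steepest-descent driver are assumptions of this example, while the paper applies the rule to Newton's method): the trial step is tested against the maximum of the last M objective values rather than the current one, so the classical Armijo test is recovered when M = 1.

```python
import numpy as np
from collections import deque

def nonmonotone_armijo(f, x, d, g, f_hist, gamma=1e-4, tau=0.5):
    # Accept alpha once f(x + alpha*d) <= max(recent f-values) + gamma*alpha*g'd,
    # a nonmonotone generalization of the classical Armijo condition.
    fmax, gTd = max(f_hist), g @ d      # gTd < 0 for a descent direction
    alpha = 1.0
    while f(x + alpha * d) > fmax + gamma * alpha * gTd:
        alpha *= tau                    # backtrack
    return alpha

# Usage inside an iteration: keep a sliding window of recent f-values.
f = lambda x: np.sum((x - 1.0) ** 4)
grad = lambda x: 4.0 * (x - 1.0) ** 3
x = np.zeros(3)
f_hist = deque([f(x)], maxlen=10)       # window size M = 10 (assumed)
for _ in range(50):
    d = -grad(x)                        # steepest descent stands in for Newton here
    alpha = nonmonotone_armijo(f, x, d, grad(x), f_hist)
    x = x + alpha * d
    f_hist.append(f(x))
```

Because occasional increases in f are tolerated within the window, the full (Newton) step is accepted more often, which is the source of the savings in line searches and function evaluations that the abstract reports.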

Journal ArticleDOI
TL;DR: An interior-point method for nonlinear programming that enjoys the flexibility of switching between a line search method that computes steps by factoring the primal-dual equations and a trust region method that uses a conjugate gradient iteration.
Abstract: An interior-point method for nonlinear programming is presented. It enjoys the flexibility of switching between a line search method that computes steps by factoring the primal-dual equations and a trust region method that uses a conjugate gradient iteration. Steps computed by direct factorization are always tried first, but if they are deemed ineffective, a trust region iteration that guarantees progress toward stationarity is invoked. To demonstrate its effectiveness, the algorithm is implemented in the Knitro [6,28] software package and is extensively tested on a wide selection of test problems.

997 citations
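
The switching idea can be illustrated on a toy unconstrained problem; this is a sketch under strong simplifications (no barrier term, no primal-dual system, and nothing from the Knitro code): a factorization-based Newton step globalized by a backtracking line search is tried first, and a trust-region-style Cauchy step serves as the guaranteed-progress fallback.

```python
import numpy as np

def switching_step(f, grad, hess, x, delta=1.0, c1=1e-4):
    # Toy analogue of the line-search / trust-region switching strategy.
    g, H = grad(x), hess(x)
    if not np.linalg.norm(g):
        return x                        # already stationary
    try:
        # 1) Direct step: factor the system (here, Cholesky on the Hessian)
        #    and globalize with a backtracking (Armijo) line search.
        C = np.linalg.cholesky(H)       # raises LinAlgError if H is not PD
        d = np.linalg.solve(C.T, np.linalg.solve(C, -g))
        alpha = 1.0
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-8:            # line-search step deemed ineffective
                raise np.linalg.LinAlgError("fall back to trust region")
        return x + alpha * d
    except np.linalg.LinAlgError:
        # 2) Fallback: trust-region-style Cauchy step along -g with radius
        #    delta; it guarantees progress toward stationarity.
        t = delta / np.linalg.norm(g)
        gHg = g @ H @ g
        if gHg > 0:
            t = min(t, (g @ g) / gHg)
        return x - t * g

# Usage on the Rosenbrock function (both branches can trigger along the way).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * (x[1] - x[0]**2) + 800 * x[0]**2, -400 * x[0]],
                           [-400 * x[0], 200.0]])
x = np.array([-1.2, 1.0])
for _ in range(100):
    x = switching_step(f, grad, hess, x)
```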

Journal ArticleDOI
TL;DR: The classical projected gradient schemes are extended to include a nonmonotone steplength strategy based on the Grippo--Lampariello--Lucidi nonmonotone line search, combined with the spectral gradient choice of steplength to accelerate convergence.
Abstract: Nonmonotone projected gradient techniques are considered for the minimization of differentiable functions on closed convex sets. The classical projected gradient schemes are extended to include a nonmonotone steplength strategy that is based on the Grippo--Lampariello--Lucidi nonmonotone line search. In particular, the nonmonotone strategy is combined with the spectral gradient choice of steplength to accelerate the convergence process. In addition to the classical projected gradient nonlinear path, the feasible spectral projected gradient is used as a search direction to avoid additional trial projections during the one-dimensional search process. Convergence properties and extensive numerical results are presented.

982 citations
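
A compact sketch of the scheme the abstract describes (Python/NumPy; the tolerances, window size M, and box-projection example are assumptions): a Barzilai-Borwein "spectral" trial steplength, a single projection per iteration to build the feasible direction, and GLL-style nonmonotone acceptance along that direction, so no extra projections occur during the one-dimensional search.

```python
import numpy as np
from collections import deque

def spg(f, grad, proj, x0, M=10, gamma=1e-4, max_iter=500):
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    lam = 1.0
    hist = deque([f(x)], maxlen=M)          # GLL sliding window of f-values
    for _ in range(max_iter):
        d = proj(x - lam * g) - x           # feasible direction: one projection
        if np.linalg.norm(d) < 1e-10:       # projected-gradient stationarity
            break
        fmax, gTd = max(hist), g @ d        # gTd < 0 unless x is stationary
        alpha = 1.0
        while f(x + alpha * d) > fmax + gamma * alpha * gTd:
            alpha *= 0.5                    # backtrack along d: stays feasible
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        lam = (s @ s) / sy if sy > 0 else 1.0   # spectral (Barzilai-Borwein) step
        lam = min(max(lam, 1e-10), 1e10)        # safeguard
        x, g = x_new, g_new
        hist.append(f(x))
    return x

# Example: a strictly convex quadratic over the box [0, 1]^n
n = 5
Q, c = np.diag(np.arange(1.0, n + 1)), np.ones(n)
x_star = spg(f=lambda x: 0.5 * x @ Q @ x - c @ x,
             grad=lambda x: Q @ x - c,
             proj=lambda x: np.clip(x, 0.0, 1.0),
             x0=np.full(n, 0.5))
```

Since x and x + d are both feasible, every backtracked point x + alpha*d stays in the convex set, which is exactly why the feasible direction avoids additional trial projections during the search.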


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 85% related
Iterative method: 48.8K papers, 1.2M citations, 83% related
Optimal control: 68K papers, 1.2M citations, 81% related
Partial differential equation: 70.8K papers, 1.6M citations, 81% related
Differential equation: 88K papers, 2M citations, 80% related
Performance Metrics
No. of papers in the topic in previous years:
Year	Papers
2023	90
2022	190
2021	145
2020	181
2019	136
2018	137