Open Access Journal Article

Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points

Andrew R. Conn, Katya Scheinberg, Luís N. Vicente
01 Mar 2009 · SIAM Journal on Optimization, Vol. 20, Iss. 1, pp. 387–415
TLDR
This paper proves global convergence for first- and second-order stationary points of a class of derivative-free trust-region methods for unconstrained optimization based on the sequential minimization of quadratic models built from evaluating the objective function at sample sets.
Abstract
In this paper we prove global convergence for first- and second-order stationary points of a class of derivative-free trust-region methods for unconstrained optimization. These methods are based on the sequential minimization of quadratic (or linear) models built from evaluating the objective function at sample sets. The derivative-free models are required to satisfy Taylor-type bounds, but, apart from that, the analysis is independent of the sampling techniques. A number of new issues are addressed, including global convergence when acceptance of iterates is based on simple decrease of the objective function, trust-region radius maintenance at the criticality step, and global convergence for second-order critical points.
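
To make the class of methods concrete, the following is a minimal, self-contained Python sketch of one such trust-region iteration. It is an illustration, not the authors' algorithm: the quadratic model is fit by least squares on randomly oversampled points rather than on the poised interpolation sets whose Taylor-type bounds the paper assumes, the model is only approximately minimized via a Cauchy step, and all constants and helper names (fit_quadratic_model, cauchy_step, dfo_trust_region) are ours.

```python
# Sketch of a derivative-free trust-region method: build a quadratic model
# from objective samples, approximately minimize it in the trust region,
# accept iterates on simple decrease of f, and manage the radius.
import numpy as np


def fit_quadratic_model(f, x, delta, rng):
    """Least-squares fit of m(s) = c + g.s + 0.5*s'Hs from samples
    taken inside the trust region around x."""
    n = len(x)
    p = (n + 1) * (n + 2) // 2                         # model coefficients
    S = delta * (2.0 * rng.random((2 * p, n)) - 1.0)   # oversampled set
    rows = []
    for s in S:
        quad = [s[i] * s[j] for i in range(n) for j in range(i, n)]
        rows.append(np.concatenate(([1.0], s, quad)))
    A = np.asarray(rows)
    fvals = np.array([f(x + s) for s in S])
    coef, *_ = np.linalg.lstsq(A, fvals, rcond=None)
    g = coef[1:n + 1]
    H = np.zeros((n, n))
    k = n + 1
    for i in range(n):
        for j in range(i, n):
            if i == j:
                H[i, i] = 2.0 * coef[k]      # monomial s_i^2 carries 0.5*H_ii
            else:
                H[i, j] = H[j, i] = coef[k]  # monomial s_i*s_j carries H_ij
            k += 1
    return g, H


def cauchy_step(g, H, delta):
    """Minimize the model along -g within ||s|| <= delta (Cauchy point),
    which already yields the fraction-of-optimal model decrease the
    convergence theory requires."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    t = delta / gnorm
    gHg = g @ H @ g
    if gHg > 0.0:
        t = min(t, gnorm ** 2 / gHg)
    return -t * g


def dfo_trust_region(f, x0, delta0=1.0, eps=1e-6, max_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    delta = delta0
    for _ in range(max_iter):
        g, H = fit_quadratic_model(f, x, delta, rng)
        # Criticality step (crude): if the model gradient is small, shrink
        # the radius so the model is rebuilt from a finer sample set.
        if np.linalg.norm(g) < eps:
            if delta < eps:
                break
            delta *= 0.5
            continue
        s = cauchy_step(g, H, delta)
        f_trial = f(x + s)
        pred = -(g @ s + 0.5 * s @ H @ s)    # model-predicted decrease
        rho = (fx - f_trial) / max(pred, 1e-16)
        if f_trial < fx:                     # "simple decrease" acceptance
            x, fx = x + s, f_trial
            delta = 2.0 * delta if rho >= 0.75 else delta
        else:
            delta *= 0.5                     # unsuccessful step: shrink
    return x, fx


if __name__ == "__main__":
    rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    x_star, f_star = dfo_trust_region(rosen, [-1.2, 1.0])
    print(x_star, f_star)
```

Note how acceptance deliberately uses simple decrease of the objective (f_trial < fx), with the ratio rho steering only the radius update; supporting this weaker acceptance test is one of the issues the paper's analysis addresses.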



Citations
Journal Article

Derivative-free optimization: a review of algorithms and comparison of software implementations

TL;DR: It is found that the ability of all these solvers to obtain good solutions diminishes with increasing problem size, and that TOMLAB/MULTIMIN, TOMLAB/GLCCLUSTER, MCS, and TOMLAB/LGO are better, on average, than the other derivative-free solvers in terms of solution quality within 2,500 function evaluations.
Journal Article

Survey of Multifidelity Methods in Uncertainty Propagation, Inference, and Optimization

TL;DR: In many situations across computational science and engineering, multiple computational models that describe a system of interest are available, and these models differ in evaluation cost and fidelity; this survey covers how such multifidelity models are combined in uncertainty propagation, inference, and optimization.
Journal Article

Benchmarking Derivative-Free Optimization Algorithms

TL;DR: This work uses performance and data profiles, together with a convergence test that measures the decrease in function value, to analyze the performance of three solvers on sets of smooth, noisy, and piecewise-smooth problems (the convergence test is sketched after this list).
Journal Article

Recent advances in trust region algorithms

TL;DR: Recent results on trust region methods for unconstrained optimization, constrained optimization, nonlinear equations and nonlinear least squares, nonsmooth optimization and optimization without derivatives are reviewed.
Journal Article

Derivative-free optimization methods

TL;DR: A review of derivative-free methods for nonconvex optimization problems is given, with an emphasis on recent developments and on a unifying treatment of such problems across the nonlinear optimization and machine learning literature.
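
For context on the benchmarking entry above: the convergence test in that framework is commonly stated as f(x) <= f_L + tau*(f(x0) - f_L), where f_L is the best objective value any solver attains on the problem and tau is an accuracy level. A minimal sketch (the function and argument names are ours):

```python
def converged(f0, fx, f_best, tau):
    """True when a point achieves at least a (1 - tau) fraction of the best
    reduction f0 - f_best obtained by any solver on this problem.
    f0: objective at the starting point; fx: value reached; tau: accuracy."""
    return f0 - fx >= (1.0 - tau) * (f0 - f_best)
```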
Frequently Asked Questions (1)
Q1. What have the authors contributed in "Global Convergence of General Derivative-Free Trust-Region Algorithms to First- and Second-Order Critical Points"?

In this paper the authors prove global convergence for first- and second-order stationary points of a class of derivative-free trust-region methods for unconstrained optimization.