Open Access Book

Derivative-Free and Blackbox Optimization

Charles Audet, +1 more
TL;DR
DFO algorithms have principally fallen into one of two categories, direct search methods and model-based methods; only recently have researchers begun mixing the two to create hybrid methods with improved performance.
Abstract
Derivative-free optimization (DFO) is the mathematical study of optimization algorithms that do not use derivatives. While a DFO algorithm was used to test one of the world's first computers (the MANIAC, in 1952), it was not until the 1990s that DFO algorithms were studied mathematically. Blackbox optimization (BBO) is the study of optimization problems where the objective function is a blackbox: no analytic description of the function is available, but given an arbitrary input the blackbox returns a function value. As BBO naturally arises whenever a computer simulation is involved in an optimization problem, BBO is one of the most rapidly expanding areas of applied optimization. BBO can naturally be approached by DFO. DFO algorithms have principally fallen into one of two categories: direct search methods and model-based methods. Direct search methods work from an incumbent solution and examine a collection of trial points; if an improvement is found, the incumbent solution is updated, otherwise a search radius parameter is decreased and a new collection of trial points is examined. Model-based methods approximate the objective function with a model function, and use the gradients or even second derivatives of the model function to help guide optimization. (Note that while DFO studies algorithms that do not use derivatives, this does not mean that the objective function is nondifferentiable; for example, the objective could be a computer simulation using numerical integration.) It was not until very recently that researchers began mixing direct search and model-based methods to create hybrid methods with improved performance.
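To make the direct-search loop described above concrete, the following is a minimal Python sketch of compass search, one of the simplest direct search methods: it polls trial points along the coordinate directions around the incumbent, moves on improvement, and halves the poll radius otherwise. The function compass_search, its parameters, and the quadratic stand-in for a blackbox are illustrative assumptions for this page, not algorithms taken from the book.

    import numpy as np

    def compass_search(f, x0, step=1.0, step_tol=1e-6, max_evals=10_000):
        """Minimal compass (direct) search sketch.

        Polls the 2n trial points x +/- step * e_i around the incumbent x;
        on improvement the incumbent moves, otherwise the step shrinks.
        """
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        evals = 1
        n = x.size
        while step > step_tol and evals < max_evals:
            improved = False
            for i in range(n):
                for sign in (+1.0, -1.0):
                    trial = x.copy()
                    trial[i] += sign * step
                    ft = f(trial)
                    evals += 1
                    if ft < fx:      # improvement found: update the incumbent
                        x, fx = trial, ft
                        improved = True
            if not improved:
                step *= 0.5          # no improvement: decrease the search radius
        return x, fx

    # Example: minimize a smooth "blackbox" (here just a quadratic stand-in)
    xbest, fbest = compass_search(lambda x: float(np.sum((x - 1.0) ** 2)),
                                  x0=[0.0, 0.0])

Note that the loop only ever calls f; no derivative information is requested, which is exactly what allows such methods to handle simulation-based objectives.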


Citations
Journal Article

COCO: A Platform for Comparing Continuous Optimizers in a Black-Box Setting

TL;DR: COCO, as discussed by the authors, is an open-source platform for comparing continuous optimizers in a black-box setting; it aims to automate, to the greatest possible extent, the tedious and repetitive task of benchmarking numerical optimization algorithms.
Journal Article

Derivative-free optimization methods

TL;DR: A review of derivative-free methods for non-convex optimization problems is given in this paper, with an emphasis on recent developments and on a unifying treatment of such problems in the non-linear optimization and machine learning literature.
Proceedings Article

Concolic testing for deep neural networks

TL;DR: The first concolic testing approach for Deep Neural Networks (DNNs) is presented; it formalises coverage criteria for DNNs that have been studied in the literature and develops a coherent method for performing concolic testing to increase test coverage.
Posted Content

Concolic Testing for Deep Neural Networks.

TL;DR: In this article, the authors present the first concolic testing approach for deep neural networks (DNNs), which combines program execution and symbolic analysis to explore the execution paths of a software program.
Posted Content

Vaccine optimization for COVID-19: who to vaccinate first?

TL;DR: An age-stratified mathematical model determined optimal vaccine allocation for four different metrics (deaths, symptomatic infections, and maximum non-ICU and ICU hospitalizations) under a wide variety of assumptions, and found that a vaccine with sufficient effectiveness would be enough to substantially mitigate the ongoing pandemic.