Journal Article

Efficient global optimization algorithm assisted by multiple surrogate techniques

01 Jun 2013 - Journal of Global Optimization (Springer US) - Vol. 56, Iss. 2, pp. 669-689
TL;DR: The multiple surrogate efficient global optimization (MSEGO) algorithm is proposed, which adds several points per optimization cycle with the help of multiple surrogates; MSEGO is found to work well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
Abstract: Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
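
To make the cycle structure concrete, below is a minimal, illustrative Python sketch of one MSEGO-style cycle. It is not the authors' implementation: the toy nearest-neighbour surrogate, the fixed candidate set, and all helper names are assumptions made here for brevity. In the paper the surrogates are kriging, radial basis neural networks, linear Shepard, and support vector regression, and the kriging uncertainty estimate is imported by surrogates that do not provide their own.

```python
# Illustrative sketch of one multiple-surrogate EGO-style cycle (minimization).
# The ToySurrogate class and the finite candidate set are placeholders, not the
# paper's kriging/RBNN/Shepard/SVR surrogates or its continuous EI maximization.
import numpy as np
from scipy.stats import norm


class ToySurrogate:
    """Nearest-neighbour predictor whose 'uncertainty' grows with the distance
    to the closest sample; it stands in for a surrogate with an (imported)
    uncertainty estimate."""

    def __init__(self, scale):
        self.scale = scale  # different scales mimic different surrogates

    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict(self, X_new):
        d = np.linalg.norm(X_new[:, None, :] - self.X[None, :, :], axis=2)
        nearest = d.argmin(axis=1)
        return self.y[nearest], self.scale * d.min(axis=1)


def expected_improvement(mean, std, y_best):
    """EI for minimization, assuming a Gaussian prediction at each point."""
    std = np.maximum(std, 1e-12)
    z = (y_best - mean) / std
    return (y_best - mean) * norm.cdf(z) + std * norm.pdf(z)


def msego_cycle(f, surrogates, X, y, candidates):
    """Fit every surrogate, let each one nominate its EI-maximizing candidate,
    evaluate the nominated points (these runs could execute in parallel), and
    return the augmented data set."""
    y_best = y.min()
    picks = []
    for s in surrogates:
        mean, std = s.fit(X, y).predict(candidates)
        picks.append(candidates[np.argmax(expected_improvement(mean, std, y_best))])
    picks = np.unique(np.array(picks), axis=0)        # several points per cycle
    values = np.array([f(x) for x in picks])
    return np.vstack([X, picks]), np.concatenate([y, values])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: float(np.sum(x ** 2))               # toy objective
    X = rng.uniform(-2.0, 2.0, size=(8, 2))           # initial design
    y = np.array([f(x) for x in X])
    candidates = rng.uniform(-2.0, 2.0, size=(500, 2))
    surrogates = [ToySurrogate(0.5), ToySurrogate(1.0), ToySurrogate(2.0)]
    for _ in range(5):                                 # five optimization cycles
        X, y = msego_cycle(f, surrogates, X, y, candidates)
    print("best value found:", y.min())
```

A full implementation would maximize the expected improvement over the continuous design space with a global optimizer rather than over a fixed candidate list.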
Citations
Journal Article
TL;DR: Two of the frequently used surrogates, radial basis functions and kriging, are tested on a variety of test problems, and guidelines for the choice of an appropriate surrogate model are discussed.

421 citations

Journal Article
TL;DR: The extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the 25 years since the seminal paper on design and analysis of computer experiments is addressed.
Abstract: The use of metamodeling techniques in the design and analysis of computer experiments has progressed remarkably in the past 25 years, but how far has the field really come? This is the question addressed in this paper, namely, the extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the 25 years since the seminal paper on design and analysis of computer experiments by Sacks et al. (“Design and Analysis of Computer Experiments,” Statistical Science, Vol. 4, No. 4, 1989, pp. 409–435). Rather than a technical review of the entire body of metamodeling literature, the focus is on the evolution and motivation for advancements in metamodeling with some discussion on the research itself; not surprisingly, much of the current research motivation is the same as it was in the past. Based on current research thrusts in the field, multifidelity approximations and ensembles (i.e., sets) of metamodels, as well as the availability of metamodels within commercial software...

330 citations


Additional excerpts

  • ...Another two ways of using multiple surrogates in sequential sampling might be through 1) blind kriging [104,189], which can be seen as an ensemble, and 2) the use of the pool of surrogates to provide multiple points per cycle of the EGO algorithm, such as in [190]....

Journal Article
TL;DR: A comprehensive review of the important studies on design optimization for structural crashworthiness and energy absorption is provided in this article, where conclusions and recommendations are offered to enable academia and industry to become more aware of the available capabilities and recent developments in design optimization.
Abstract: Optimization for structural crashworthiness and energy absorption has become an important topic of research attributable to its proven benefits to public safety and social economy. This paper provides a comprehensive review of the important studies on design optimization for structural crashworthiness and energy absorption. First, the design criteria used in crashworthiness and energy absorption are reviewed and the surrogate modeling to evaluate these criteria is discussed. Second, multiobjective optimization, optimization under uncertainties and topology optimization are reviewed from concepts, algorithms to applications in relation to crashworthiness. Third, the crashworthy structures are summarized, from generically novel structural configurations to industrial applications. Finally, some conclusions and recommendations are provided to enable academia and industry to become more aware of the available capabilities and recent developments in design optimization for structural crashworthiness and energy absorption.

295 citations

Journal Article
TL;DR: This work provides a comprehensive and detailed literature review of significant theoretical contributions, algorithmic developments, software implementations, and applications for both MINLP (mixed-integer nonlinear programming) and CDFO (constrained derivative-free optimization), and shows their individual prerequisites, formulations, and applicability.

195 citations


Cites background from "Efficient global optimization algorithm assisted by multiple surrogate techniques"

  • ...Jones (2001) compares the performance of kriging-based surrogate models to quadratic non-interpolating models for global-search optimization, while Viana et al. (2013) develop approaches using multiple surrogate predictions to locate promising new sampling points within a box-constrained region....


References
Journal Article
Rainer Storn, Kenneth Price
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented; it requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
Abstract: A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed it is demonstrated that the new method converges faster and with more certainty than many other acclaimed global optimization methods. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.

24,053 citations

Book
01 Jan 2000
TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Abstract: From the publisher: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory. SVMs deliver state-of-the-art performance in real-world applications such as text categorisation, hand-written character recognition, image classification, biosequences analysis, etc., and are now established as one of the standard tools for machine learning and data mining. Students will find the book both stimulating and accessible, while practitioners will be guided smoothly through the material required for a good grasp of the theory and its applications. The concepts are introduced gradually in accessible and self-contained stages, while the presentation is rigorous and thorough. Pointers to relevant literature and web sites containing software ensure that it forms an ideal starting point for further study. Equally, the book and its associated web site will guide practitioners to updated literature, new applications, and on-line software.

13,736 citations


Additional excerpts

  • ..., radial basis neural networks [26,27], linear Shepard interpolation [28,29], and support vector regression [30, 31]....


Journal Article
TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
Abstract: In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing with large datasets. Finally, we mention some modifications and extensions that have been applied to the standard SV algorithm, and discuss the aspect of regularization from a SV perspective.

10,696 citations


Additional excerpts

  • ..., radial basis neural networks [26,27], linear Shepard interpolation [28,29], and support vector regression [30, 31]....


Journal Article
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies, and they are shown to be improvements over simple random sampling with respect to variance for a class of estimators that includes the sample mean and the empirical distribution function.
Abstract: Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
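
As an aside (not stated in the abstract above), this reference is widely associated with Latin hypercube sampling, the design-of-experiments technique commonly used to build the initial set of designs in surrogate-based optimization. A minimal sketch, assuming the goal is n points in the unit hypercube with one point per equal-probability stratum in every dimension; the function name and scaling convention are assumptions, not details from the reference:

```python
# Minimal Latin hypercube sketch (illustrative only).
import numpy as np


def latin_hypercube(n_samples, n_dims, rng=None):
    """One randomly placed point per equal-width stratum in every dimension."""
    rng = np.random.default_rng() if rng is None else rng
    # independently permuted stratum indices per dimension, plus in-stratum jitter
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
    return (strata + rng.random((n_samples, n_dims))) / n_samples


# Example: a 10-point design in 3 dimensions, scaled to [0, 1)^3
print(latin_hypercube(10, 3, np.random.default_rng(42)))
```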

8,328 citations

Journal Article
TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.
Abstract: In many engineering optimization problems, the number of function evaluations is severely limited by time or cost. These problems pose a special challenge to the field of global optimization, since existing methods often require more function evaluations than can be comfortably afforded. One way to address this challenge is to fit response surfaces to data collected by evaluating the objective and constraint functions at a few points. These surfaces can then be used for visualization, tradeoff analysis, and optimization. In this paper, we introduce the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering. We then show how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule. The key to using response surfaces for global optimization lies in balancing the need to exploit the approximating surface (by sampling where it is minimized) with the need to improve the approximation (by sampling where prediction error may be high). Striking this balance requires solving certain auxiliary problems which have previously been considered intractable, but we show how these computational obstacles can be overcome.

6,914 citations


"Efficient global optimization algor..." refers background or methods in this paper

  • ...Thus, the expected improvement EI(x) is the expectation of I(x) (derivation found in [7])...


  • ...As defined by [7], the improvement at a point x is I(x) = max(y_PBS − Y(x), 0) (Eq. 4)...


  • ...For example, the efficient global optimization (EGO) algorithm [7] models the objective function as a random variable....

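
For context, when Y(x) is modeled as a Gaussian random variable with surrogate prediction ŷ(x) and prediction standard error s(x), as in EGO, the expectation of the improvement quoted in the excerpt above has the standard closed form (the notation ŷ, s, Φ, φ is introduced here and is not quoted from the paper):

$$
EI(x) = \big(y_{PBS} - \hat{y}(x)\big)\,\Phi(z) + s(x)\,\phi(z),
\qquad z = \frac{y_{PBS} - \hat{y}(x)}{s(x)},
$$

where Φ and φ are the standard normal cumulative distribution and density functions, and EI(x) is taken to be zero when s(x) = 0.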