Journal ArticleDOI

Efficient global optimization algorithm assisted by multiple surrogate techniques

TLDR
The multiple surrogate efficient global optimization (MSEGO) algorithm is proposed, which adds several points per optimization cycle with the help of multiple surrogates; MSEGO is found to work well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
Abstract
Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
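The abstract refers to EGO's use of a surrogate uncertainty estimator to guide sampling; the standard acquisition function in EGO is expected improvement. A minimal sketch of the EI criterion for minimization, assuming a surrogate that returns a prediction and an uncertainty estimate (the function name and signature are illustrative, not from the paper):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement (for minimization) at a candidate point.

    mu     -- surrogate prediction at the point
    sigma  -- surrogate uncertainty estimate (e.g., kriging standard error)
    f_best -- best objective value observed so far
    """
    if sigma <= 0.0:
        # No uncertainty (e.g., at a sampled point): no expected improvement.
        return 0.0
    z = (f_best - mu) / sigma
    # Standard normal pdf and cdf, via math.erf.
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf
```

MSEGO's point, as described above, is that `sigma` need not come from the same surrogate as `mu`: an uncertainty estimate imported from, say, a kriging model can be paired with the prediction of a surrogate that provides none.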


Citations
Book ChapterDOI

Trust-Region Based Multi-objective Optimization for Low Budget Scenarios

TL;DR: This paper proposes a metamodel-based multi-objective evolutionary algorithm that adaptively maintains trust regions in variable space to balance error uncertainty against progress; it targets expensive multi-objective problems by incorporating multiple trust regions corresponding to multiple non-dominated solutions.
Posted Content

Kriging metamodels and global optimization in simulation

TL;DR: This dissertation investigates several methodological questions about Kriging metamodels and their use in EGO for deterministic and random simulation models.
Journal ArticleDOI

A hybrid global optimization method based on multiple metamodels

TL;DR: A hybrid global optimization method based on multiple metamodels (MMHGO) is proposed to improve the efficiency of global optimization; it not only extends the expected improvement (EI) criterion of kriging to other metamodels but also intelligently selects appropriate metamodeling techniques to guide the search direction, making the search process very efficient.
Journal ArticleDOI

Hybrid meta-model-based global optimum pursuing method for expensive problems

TL;DR: In tests on six high-dimensional problems, the proposed HMGOP method shows excellent search accuracy, efficiency, and robustness; applied to a vehicle lightweight design with 30 design variables, it achieves satisfactory results.
Journal ArticleDOI

Adaptive in situ model refinement for surrogate-augmented population-based optimization

TL;DR: Numerical experiments performed on multiple benchmark functions and an optimal planning problem demonstrate AMR’s ability to preserve computational efficiency of the SBO process while providing solutions of more attractive fidelity than those provisioned by a standard SBO approach.
References
Journal ArticleDOI

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
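The TL;DR above notes that differential evolution needs few control variables and is easy to use. A minimal DE/rand/1/bin sketch under the usual conventions (population size, `F`, `CR`, and the function name are illustrative defaults, not taken from the paper):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           iters=100, seed=0):
    """Minimal DE/rand/1/bin for minimizing f over box constraints.

    bounds -- list of (lo, hi) pairs, one per dimension
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # Three distinct population members other than the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip mutant to the box
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]
```

The three control variables mentioned in the summary are visible here: population size, the differential weight `F`, and the crossover rate `CR`.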
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Journal ArticleDOI

A tutorial on support vector regression

TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
Journal ArticleDOI

A comparison of three methods for selecting values of input variables in the analysis of output from a computer code

TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and they are shown to be improvements over simple sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
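One of the sampling plans examined in that comparison is Latin hypercube sampling, in which each of the n equal-width strata in every dimension receives exactly one point. A minimal sketch of a Latin hypercube design on the unit cube (naming is illustrative):

```python
import random

def latin_hypercube(n, dim, seed=0):
    """n-point Latin hypercube sample in [0, 1)^dim.

    For every dimension, the n strata [k/n, (k+1)/n) each contain
    exactly one point; the stratum order is shuffled per dimension.
    """
    rng = random.Random(seed)
    cols = []
    for _ in range(dim):
        perm = list(range(n))
        rng.shuffle(perm)
        # Place one point uniformly at random inside each stratum.
        cols.append([(p + rng.random()) / n for p in perm])
    return [list(point) for point in zip(*cols)]
```

Compared with simple random sampling, this stratification is what yields the variance reduction the summary describes for estimators such as the sample mean.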
Journal ArticleDOI

Efficient Global Optimization of Expensive Black-Box Functions

TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.