Journal ArticleDOI

Efficient global optimization algorithm assisted by multiple surrogate techniques

TLDR
The multiple surrogate efficient global optimization (MSEGO) algorithm is proposed, adding several points per optimization cycle with the help of multiple surrogates; MSEGO is found to work well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
Abstract
Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
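As a rough illustration of the cycle described in the abstract (a hedged sketch, not the authors' implementation; the surrogate choices, random candidate pool, and parameter values below are assumptions), the following Python snippet fits two surrogates on the same data, computes the standard expected-improvement criterion for each, and lets the support vector regressor borrow the kriging uncertainty estimate so that both surrogates propose one infill point per cycle:

import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR

def expected_improvement(mean, std, y_best):
    # Standard EGO expected-improvement formula; std may be "imported" from another surrogate.
    std = np.maximum(std, 1e-12)
    z = (y_best - mean) / std
    return (y_best - mean) * norm.cdf(z) + std * norm.pdf(z)

def msego_cycle(f, X, y, bounds, n_candidates=2000, rng=np.random.default_rng(0)):
    # Fit several surrogates on the same design data.
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    svr = SVR(kernel="rbf", C=100.0).fit(X, y)

    # Random candidate pool over the box-constrained design space.
    lo, hi = np.asarray(bounds, dtype=float).T
    cand = lo + (hi - lo) * rng.random((n_candidates, X.shape[1]))

    mu_gp, sd_gp = gp.predict(cand, return_std=True)
    mu_svr = svr.predict(cand)
    y_best = y.min()

    # One infill point per surrogate; the SVR has no native uncertainty estimate,
    # so it borrows the kriging standard deviation (the "imported" estimate).
    picks = [cand[np.argmax(expected_improvement(mu_gp, sd_gp, y_best))],
             cand[np.argmax(expected_improvement(mu_svr, sd_gp, y_best))]]
    X_new = np.vstack(picks)
    y_new = np.array([f(x) for x in X_new])  # expensive simulations; run in parallel in practice
    return np.vstack([X, X_new]), np.concatenate([y, y_new])

The actual study uses nine surrogates and a proper search of the infill subproblem; the sketch only conveys the several-points-per-cycle and imported-uncertainty ideas, not the full algorithm.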


Citations
Journal ArticleDOI

Adaptive Kriging surrogate model for the optimization design of a dense non-aqueous phase liquid-contaminated groundwater remediation process

TL;DR: An integrated optimization method based on adaptive Kriging surrogate models was proposed and applied to the cost optimization of a surfactant-enhanced aquifer remediation process for dense non-aqueous phase liquids (DNAPLs); the adaptive strategy improved the accuracy of the surrogate model, and the added samples provided more information about the simulation model than the common samples.
Journal ArticleDOI

Metamodel-based inverse method for parameter identification: elastic–plastic damage model

TL;DR: In this paper, a metamodel-based inverse method for material parameter identification is proposed to overcome the high computational cost of the inverse method, with the optimization procedure carried out using a Kriging metamodel.
Journal ArticleDOI

Multifidelity Modeling Using Nondeterministic Localized Galerkin Approach

TL;DR: In this paper, multifidelity modeling using the new nondeterministic localized Galerkin approach is introduced to address the practical challenges associated with multiple low-fidelity models.
Journal ArticleDOI

Interval uncertainty propagation by a parallel Bayesian global optimization method

TL;DR: In this article, a triple-engine parallel Bayesian global optimization (T-PBGO) method is proposed to find both the global minimum and maximum of a computationally expensive black-box function over a prescribed hyper-rectangle.
Journal ArticleDOI

A multiple surrogates based PSO algorithm

TL;DR: A multiple-surrogates-based PSO (MSPSO) framework is proposed, consisting of an inner optimization loop and an outer one; it is capable of converging to a good solution for low-dimensional, non-convex, and multimodal problems.
References
Journal ArticleDOI

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous-space functions is presented; it requires few control variables, is robust and easy to use, and lends itself very well to parallel computation.
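For context, the classic DE/rand/1/bin generation can be sketched as follows (standard scheme with the usual F and CR parameter names; a minimal illustration, not this paper's code):

import numpy as np

def de_step(pop, fitness, f, F=0.8, CR=0.9, rng=np.random.default_rng(0)):
    # One generation of DE/rand/1/bin: differential mutation, binomial crossover, greedy selection.
    n, d = pop.shape
    new_pop, new_fit = pop.copy(), fitness.copy()
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])   # differential mutation
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True                # keep at least one mutant component
        trial = np.where(cross, mutant, pop[i])      # binomial crossover
        f_trial = f(trial)
        if f_trial <= fitness[i]:                    # greedy one-to-one selection
            new_pop[i], new_fit[i] = trial, f_trial
    return new_pop, new_fit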
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation of learning systems based on recent advances in statistical learning theory; it guides practitioners to updated literature, new applications, and on-line software.
Journal ArticleDOI

A tutorial on support vector regression

TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
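The epsilon-insensitive formulation at the core of such tutorials is the primal problem (standard notation, not necessarily this paper's):

\min_{w,\,b,\,\xi,\,\xi^*} \; \tfrac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{n}(\xi_i + \xi_i^*)
\quad \text{s.t.} \quad
y_i - \langle w, x_i\rangle - b \le \varepsilon + \xi_i, \;\;
\langle w, x_i\rangle + b - y_i \le \varepsilon + \xi_i^*, \;\;
\xi_i, \xi_i^* \ge 0,

whose dual is the quadratic program solved by the training algorithms the tutorial summarizes.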
Journal ArticleDOI

A comparison of three methods for selecting values of input variables in the analysis of output from a computer code

TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies, and they are shown to be improvements over simple random sampling with respect to variance for a class of estimators that includes the sample mean and the empirical distribution function.
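A minimal Latin hypercube sampler, one of the plans compared in that paper, can be written as follows (an illustrative sketch with assumed names, not the paper's procedure):

import numpy as np

def latin_hypercube(n, d, rng=np.random.default_rng(0)):
    # One point per equal-probability stratum in each dimension,
    # with the strata shuffled independently per dimension.
    u = np.empty((n, d))
    for j in range(d):
        u[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return u  # samples in the unit hypercube [0, 1)^d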
Journal ArticleDOI

Efficient Global Optimization of Expensive Black-Box Functions

TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.
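The infill criterion at the heart of EGO, and of the EGO-like strategies discussed in the abstract above, is the expected improvement over the best observed value f_min (standard notation):

EI(x) = \bigl(f_{\min} - \hat{y}(x)\bigr)\,\Phi(z) + s(x)\,\phi(z), \qquad z = \frac{f_{\min} - \hat{y}(x)}{s(x)},

where \hat{y}(x) and s(x) are the surrogate prediction and its uncertainty estimate, and \Phi and \phi are the standard normal CDF and PDF; a credible stopping rule can then terminate the search once the largest expected improvement over the design space becomes negligible.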