Journal ArticleDOI
Efficient global optimization algorithm assisted by multiple surrogate techniques
TL;DR: The multiple surrogate efficient global optimization (MSEGO) algorithm is proposed, which adds several points per optimization cycle with the help of multiple surrogates; MSEGO is found to work well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
Abstract:
Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
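EGO-style algorithms typically rank candidate designs by an infill criterion such as expected improvement, which trades off a surrogate's predicted mean against its uncertainty estimate. A minimal pure-Python sketch (not the paper's implementation), assuming minimization and that each candidate carries a predicted mean plus an uncertainty estimate that may be imported from another surrogate, as MSEGO does; the function name and the candidate values are hypothetical:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement (minimization) of a candidate whose surrogate
    prediction has mean mu and uncertainty sigma, over the best observed
    value f_best. With sigma == 0 it degenerates to plain improvement."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

# MSEGO-style use (hypothetical numbers): pair each surrogate's predicted
# mean with an uncertainty estimate imported from a surrogate that
# provides one (e.g. kriging), then rank candidates by EI.
candidates = [(0.8, 0.3), (1.2, 0.5)]  # (mean, imported sigma)
f_best = 1.0
scores = [expected_improvement(mu, s, f_best) for mu, s in candidates]
```

Running several surrogates in parallel, each proposing its own EI maximizer, is what lets MSEGO add multiple points per cycle instead of EGO's single point.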
Citations
Journal ArticleDOI
Ensemble of metamodels: extensions of the least squares approach to efficient global optimization
TL;DR: LSEGO is shown to be a feasible option for driving EGO with an ensemble of metamodels, including for constrained problems, and it is not restricted to kriging or to a single infill point per optimization cycle.
Journal ArticleDOI
Multi-fidelity global optimization using a data-mining strategy for computationally intensive black-box problems
Jie Liu, Huachao Dong, Peng Wang, et al.
TL;DR: Three versions of MFGO were verified by comparison with five well-known methods on eight benchmark cases and one engineering problem, demonstrating superior computational efficiency and robustness.
Journal ArticleDOI
Nondeterministic Kriging for Engineering Design Exploration
TL;DR: The nondeterministic kriging (NDK) method is proposed, aiming at applications in engineering design exploration, especially when only a limited number of random samples is available...
Journal ArticleDOI
An Efficient Kriging-Based Constrained Optimization Algorithm by Global and Local Sampling in Feasible Region
TL;DR: The results indicate that the sampling efficiency of EKCO is higher than or comparable with that of recently published algorithms while maintaining high accuracy of the optimal solution, and the adaptive ability of the proposed algorithm is also validated.
Journal ArticleDOI
On efficient global optimization via universal Kriging surrogate models
TL;DR: The results show that a proper choice of trend function through automatic feature selection can improve the optimization performance of UK-EGO relative to EGO; although some variants of UK are not as globally accurate as ordinary kriging (OK), they can still identify better-optimized solutions because the added trend function helps the optimizer locate the global optimum.
References
Journal ArticleDOI
Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces
Rainer Storn, Kenneth Price, et al.
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
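The DE heuristic described above can be sketched in a few lines: each member is perturbed by a scaled difference of two other random members, mixed with the original by binomial crossover, and replaced only if the trial is no worse. A minimal pure-Python DE/rand/1/bin sketch under stated assumptions (function and parameter names are mine, not from the cited paper):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
    apply binomial crossover, then greedy one-to-one selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Three distinct members, all different from the target i.
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[r1][j] + F * (pop[r2][j] - pop[r3][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clamp to the search box
                else:
                    v = pop[i][j]
                trial.append(v)
            c = f(trial)
            if c <= costs[i]:  # greedy selection
                pop[i], costs[i] = trial, c
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Usage: minimize a 2-D sphere function over [-5, 5]^2.
best_x, best_cost = differential_evolution(
    lambda x: sum(v * v for v in x), bounds=[(-5.0, 5.0)] * 2)
```

The few control variables (F, CR, pop_size) and the independence of each member's trial evaluation are what make DE easy to tune and to parallelize.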
Book
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Journal ArticleDOI
A tutorial on support vector regression
TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
Journal ArticleDOI
A comparison of three methods for selecting values of input variables in the analysis of output from a computer code
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and they are shown to be improvements over simple sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
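One of the stratified alternatives compared in that paper, Latin hypercube sampling, can be sketched briefly: each dimension is cut into n equal strata, exactly one point falls in each stratum per dimension, and the strata are paired across dimensions by independent shuffles. A minimal pure-Python sketch (the function name is mine):

```python
import random

def latin_hypercube(n, dim, seed=0):
    """n samples in [0, 1)^dim. Along every axis the n points occupy the n
    strata [k/n, (k+1)/n) exactly once; stratum assignments are paired
    across dimensions by independent random permutations."""
    rng = random.Random(seed)
    perms = []
    for _ in range(dim):
        p = list(range(n))
        rng.shuffle(p)
        perms.append(p)
    # Jitter each point uniformly within its assigned stratum.
    return [[(perms[d][i] + rng.random()) / n for d in range(dim)]
            for i in range(n)]

samples = latin_hypercube(10, 3)
```

Compared with simple random sampling, this one-point-per-stratum property reduces estimator variance for quantities such as the sample mean, which is the improvement the paper establishes.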
Journal ArticleDOI
Efficient Global Optimization of Expensive Black-Box Functions
TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.