Journal Article
Efficient global optimization algorithm assisted by multiple surrogate techniques
TLDR
The multiple surrogate efficient global optimization (MSEGO) algorithm is proposed, which adds several points per optimization cycle with the help of multiple surrogates; MSEGO is found to work well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
Abstract
Surrogate-based optimization proceeds in cycles. Each cycle consists of analyzing a number of designs, fitting a surrogate, performing optimization based on the surrogate, and finally analyzing a candidate solution. Algorithms that use the surrogate uncertainty estimator to guide the selection of the next sampling candidate are readily available, e.g., the efficient global optimization (EGO) algorithm. However, adding one single point at a time may not be efficient when the main concern is wall-clock time (rather than number of simulations) and simulations can run in parallel. Also, the need for uncertainty estimates limits EGO-like strategies to surrogates normally implemented with such estimates (e.g., kriging and polynomial response surface). We propose the multiple surrogate efficient global optimization (MSEGO) algorithm, which adds several points per optimization cycle with the help of multiple surrogates. We import uncertainty estimates from one surrogate to another to allow use of surrogates that do not provide them. The approach is tested on three analytic examples for nine basic surrogates including kriging, radial basis neural networks, linear Shepard, and six different instances of support vector regression. We found that MSEGO works well even with imported uncertainty estimates, delivering better results in a fraction of the optimization cycles needed by EGO.
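The EGO-style selection step described in the abstract (using the surrogate's prediction and its uncertainty estimate to pick the next sampling candidate) is conventionally driven by the expected-improvement criterion. A minimal sketch follows; the candidate values are illustrative assumptions, not data from the paper.

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Expected improvement of a candidate, given the surrogate's
    prediction `mu`, its uncertainty estimate `sigma`, and the best
    objective value observed so far `f_min` (minimization)."""
    if sigma <= 0.0:
        return 0.0  # no uncertainty -> no expected improvement
    z = (f_min - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal cdf
    return (f_min - mu) * cdf + sigma * pdf

# One selection step: pick the candidate maximizing expected improvement.
# (Hypothetical surrogate outputs; a kriging model would supply mu/sigma.)
candidates = [
    {"x": 0.2, "mu": 1.5, "sigma": 0.1},
    {"x": 0.7, "mu": 1.8, "sigma": 0.9},  # uncertain region
    {"x": 0.9, "mu": 1.2, "sigma": 0.3},  # promising prediction
]
f_min = 1.4
best = max(candidates,
           key=lambda c: expected_improvement(c["mu"], c["sigma"], f_min))
```

The criterion balances exploitation (low `mu`) against exploration (high `sigma`); MSEGO's idea is to run this kind of selection with several surrogates per cycle, importing `sigma` for surrogates that do not provide one.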
Citations
Journal Article
Advances in surrogate based modeling, feasibility analysis, and optimization: A review
TL;DR: Two frequently used surrogates, radial basis functions and Kriging, are tested on a variety of test problems, and guidelines for the choice of an appropriate surrogate model are discussed.
Journal Article
Metamodeling in Multidisciplinary Design Optimization: How Far Have We Really Come?
TL;DR: The extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the 25 years since the seminal paper on design and analysis of computer experiments is addressed.
Journal Article
On design optimization for structural crashworthiness and its state of the art
TL;DR: A comprehensive review of important studies on design optimization for structural crashworthiness and energy absorption is provided in this article; the authors offer conclusions and recommendations to make academia and industry more aware of the available capabilities and recent developments in design optimization.
Journal Article
Global optimization advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO
TL;DR: This work provides a comprehensive and detailed literature review in terms of significant theoretical contributions, algorithmic developments, software implementations and applications for both MINLP and CDFO, and shows their individual prerequisites, formulations and applicability.
References
Support Vector Machines for Classification and Regression
TL;DR: The Structural Risk Minimization (SRM) principle, as discussed by the authors, has been shown to be superior to the traditional Empirical Risk Minimization (ERM) principle employed by conventional neural networks: SRM minimizes an upper bound on the expected risk, whereas ERM minimizes only the error on the training data.
Journal Article
A Taxonomy of Global Optimization Methods Based on Response Surfaces
TL;DR: This paper presents a taxonomy of existing approaches for using response surfaces for global optimization, illustrating each method with a simple numerical example that brings out its advantages and disadvantages.
Journal Article
Recent advances in surrogate-based optimization
TL;DR: The present state of the art of constructing surrogate models and their use in optimization strategies is reviewed and extensive use of pictorial examples are made to give guidance as to each method's strengths and weaknesses.
Journal Article
Support Vector Machines for classification and regression
TL;DR: The increasing interest in Support Vector Machines (SVMs) over the past 15 years is described, including its application to multivariate calibration, and why it is useful when there are outliers and non-linearities.
Journal Article
Practical selection of SVM parameters and noise estimation for SVM regression
Vladimir Cherkassky, Yunqian Ma, et al.
TL;DR: This work describes a new analytical prescription for setting the value of the insensitive zone epsilon as a function of training sample size, and compares the generalization performance of SVM regression under sparse-sample settings with regression using 'least-modulus' loss (epsilon = 0) and standard squared loss.
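The kind of analytical prescription this reference describes can be sketched as follows, assuming the commonly cited Cherkassky-Ma form epsilon = 3 * sigma * sqrt(ln(n) / n), where sigma is the estimated noise standard deviation and n the training sample size; treat the exact formula as an assumption to be checked against the paper.

```python
import math

def insensitive_zone_epsilon(noise_std, n):
    """Heuristic epsilon for SVM regression: widens the insensitive
    zone for noisier data and shrinks it as the sample size grows.
    Assumed form: epsilon = 3 * sigma * sqrt(ln(n) / n)."""
    return 3.0 * noise_std * math.sqrt(math.log(n) / n)

# Illustrative values: more data -> tighter insensitive zone.
eps_small = insensitive_zone_epsilon(0.1, 10)
eps_large = insensitive_zone_epsilon(0.1, 100)
```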