Open Access Book

Engineering Design via Surrogate Modelling: A Practical Guide

TLDR
This book discusses the design and exploration of kriging-based surrogate models, the techniques used in that process, and approaches to building models from sampled data.
Abstract
Preface
About the Authors
Foreword
Prologue

Part I: Fundamentals

1. Sampling Plans
1.1 The 'Curse of Dimensionality' and How to Avoid It
1.2 Physical versus Computational Experiments
1.3 Designing Preliminary Experiments (Screening)
1.3.1 Estimating the Distribution of Elementary Effects
1.4 Designing a Sampling Plan
1.4.1 Stratification
1.4.2 Latin Squares and Random Latin Hypercubes
1.4.3 Space-filling Latin Hypercubes
1.4.4 Space-filling Subsets
1.5 A Note on Harmonic Responses
1.6 Some Pointers for Further Reading
References

2. Constructing a Surrogate
2.1 The Modelling Process
2.1.1 Stage One: Preparing the Data and Choosing a Modelling Approach
2.1.2 Stage Two: Parameter Estimation and Training
2.1.3 Stage Three: Model Testing
2.2 Polynomial Models
2.2.1 Example One: Aerofoil Drag
2.2.2 Example Two: A Multimodal Testcase
2.2.3 What About the k-variable Case?
2.3 Radial Basis Function Models
2.3.1 Fitting Noise-Free Data
2.3.2 Radial Basis Function Models of Noisy Data
2.4 Kriging
2.4.1 Building the Kriging Model
2.4.2 Kriging Prediction
2.5 Support Vector Regression
2.5.1 The Support Vector Predictor
2.5.2 The Kernel Trick
2.5.3 Finding the Support Vectors
2.5.4 Finding .
2.5.5 Choosing C and epsilon
2.5.6 Computing epsilon: ν-SVR
2.6 The Big(ger) Picture
References

3. Exploring and Exploiting a Surrogate
3.1 Searching the Surrogate
3.2 Infill Criteria
3.2.1 Prediction Based Exploitation
3.2.2 Error Based Exploration
3.2.3 Balanced Exploitation and Exploration
3.2.4 Conditional Likelihood Approaches
3.2.5 Other Methods
3.3 Managing a Surrogate Based Optimization Process
3.3.1 Which Surrogate for What Use?
3.3.2 How Many Sample Plan and Infill Points?
3.3.3 Convergence Criteria
3.3.4 Search of the Vibration Isolator Geometry Feasibility Using Kriging Goal Seeking
References

Part II: Advanced Concepts

4. Visualization
4.1 Matrices of Contour Plots
4.2 Nested Dimensions
Reference

5. Constraints
5.1 Satisfaction of Constraints by Construction
5.2 Penalty Functions
5.3 Example Constrained Problem
5.3.1 Using a Kriging Model of the Constraint Function
5.3.2 Using a Kriging Model of the Objective Function
5.4 Expected Improvement Based Approaches
5.4.1 Expected Improvement With Simple Penalty Function
5.4.2 Constrained Expected Improvement
5.5 Missing Data
5.5.1 Imputing Data for Infeasible Designs
5.6 Design of a Helical Compression Spring Using Constrained Expected Improvement
5.7 Summary
References

6. Infill Criteria With Noisy Data
6.1 Regressing Kriging
6.2 Searching the Regression Model
6.2.1 Re-Interpolation
6.2.2 Re-Interpolation With Conditional Likelihood Approaches
6.3 A Note on Matrix Ill-Conditioning
6.4 Summary
References

7. Exploiting Gradient Information
7.1 Obtaining Gradients
7.1.1 Finite Differencing
7.1.2 Complex Step Approximation
7.1.3 Adjoint Methods and Algorithmic Differentiation
7.2 Gradient-enhanced Modelling
7.3 Hessian-enhanced Modelling
7.4 Summary
References

8. Multi-fidelity Analysis
8.1 Co-Kriging
8.2 One-variable Demonstration
8.3 Choosing Xc and Xe
8.4 Summary
References

9. Multiple Design Objectives
9.1 Pareto Optimization
9.2 Multi-objective Expected Improvement
9.3 Design of the Nowacki Cantilever Beam Using Multi-objective, Constrained Expected Improvement
9.4 Design of a Helical Compression Spring Using Multi-objective, Constrained Expected Improvement
9.5 Summary
References

Appendix: Example Problems
A.1 One-Variable Test Function
A.2 Branin Test Function
A.3 Aerofoil Design
A.4 The Nowacki Beam
A.5 Multi-objective, Constrained Optimal Design of a Helical Compression Spring
A.6 Novel Passive Vibration Isolator Feasibility
References

Index
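Kriging (Sections 2.4, 3, 6 and 8 above) is the book's central surrogate. As a rough, hypothetical sketch, not the book's own code: a zero-mean kriging-style interpolator with Gaussian correlation exp(-θ(a-b)²) and a fixed θ (the book instead estimates θ by maximum likelihood):

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kriging_predict(xs, ys, x, theta=10.0):
    """Zero-mean kriging-style prediction in one variable with the
    Gaussian correlation R(a, b) = exp(-theta * (a - b)**2)."""
    R = [[math.exp(-theta * (a - b) ** 2) for b in xs] for a in xs]
    w = gauss_solve(R, ys)                      # weights w = R^{-1} y
    r = [math.exp(-theta * (x - a) ** 2) for a in xs]
    return sum(ri * wi for ri, wi in zip(r, w))
```

Because the weights satisfy Rw = y, the predictor reproduces every sampled response exactly: the interpolation property that makes kriging attractive for deterministic computer experiments.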


Citations
Journal ArticleDOI

Recent advances in surrogate-based optimization

TL;DR: The present state of the art of constructing surrogate models and their use in optimization strategies is reviewed, with extensive use of pictorial examples to give guidance as to each method's strengths and weaknesses.
MonographDOI

Engineering Design via Surrogate Modelling

TL;DR: In this monograph, the authors propose a sampling approach to estimating the distribution of elementary effects, then use this information to construct a kriging model of the data set, which is then used for regression.
Journal ArticleDOI

Surrogate-assisted evolutionary computation: Recent advances and future challenges

TL;DR: This paper provides a concise overview of the history and recent developments in surrogate-assisted evolutionary computation and suggests a few future trends in this research area.
Posted Content

A Tutorial on Bayesian Optimization

TL;DR: This tutorial describes how Bayesian optimization works, including Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient, and provides a generalization of expected improvement to noisy evaluations beyond the noise-free setting where it is more commonly applied.
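The expected-improvement acquisition function mentioned in this TL;DR has a closed form when the surrogate's prediction at a point is Gaussian. A minimal sketch for a minimization problem, where mu, s and y_min are hypothetical surrogate outputs, not values from the tutorial:

```python
import math

def expected_improvement(mu, s, y_min):
    """Expected improvement over the best observed value y_min at a
    point with predicted mean mu and standard deviation s."""
    if s <= 0.0:
        return 0.0  # no predictive uncertainty: no expected improvement
    z = (y_min - mu) / s
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # normal PDF
    return (y_min - mu) * Phi + s * phi
```

The first term rewards a low predicted mean (exploitation), the second rewards high uncertainty (exploration), which is why EI balances the two automatically.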
Journal ArticleDOI

Sensitivity analysis of environmental models

TL;DR: This paper presents an overview of SA and its link to uncertainty analysis, model calibration and evaluation, robust decision-making, and provides practical guidelines by developing a workflow for the application of SA.
References
Journal ArticleDOI

Optimization by Simulated Annealing

TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
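The Metropolis-style acceptance rule behind simulated annealing fits in a few lines. A minimal one-variable sketch; the step size, cooling schedule and iteration budget are illustrative choices, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.99,
                        iters=2000, seed=0):
    """Minimize f: propose a random neighbour, always accept improvements,
    and accept uphill moves with probability exp(-delta / T)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = x + rng.uniform(-step, step)
        fy = f(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling            # geometric cooling ("annealing") schedule
    return best, fbest
```

Early on, the high temperature lets the search climb out of local minima; as T shrinks, the rule degenerates to greedy descent.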
Journal ArticleDOI

A simplex method for function minimization

TL;DR: A method is described for the minimization of a function of n variables, which depends on the comparison of function values at the (n + 1) vertices of a general simplex, followed by the replacement of the vertex with the highest value by another point.
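The vertex-replacement scheme described here (reflect the worst vertex through the centroid of the rest, then expand, contract or shrink) can be sketched as follows; the coefficients are the conventional defaults, not values from the paper:

```python
def nelder_mead(f, simplex, iters=200, alpha=1.0, gamma=2.0,
                rho=0.5, sigma=0.5):
    """Minimal Nelder-Mead: iteratively replace the worst of the n+1
    vertices using reflection, expansion, contraction and shrink steps."""
    pts = [list(p) for p in simplex]
    n = len(pts) - 1
    for _ in range(iters):
        pts.sort(key=f)                       # pts[0] best, pts[-1] worst
        centroid = [sum(p[i] for p in pts[:-1]) / n for i in range(n)]
        worst = pts[-1]
        refl = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
        if f(refl) < f(pts[0]):               # very good: try expanding
            expd = [c + gamma * (r - c) for c, r in zip(centroid, refl)]
            pts[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(pts[-2]):            # decent: accept reflection
            pts[-1] = refl
        else:                                 # poor: contract inward
            contr = [c + rho * (w - c) for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                pts[-1] = contr
            else:                             # shrink toward best vertex
                pts = [pts[0]] + [
                    [b + sigma * (p[i] - b) for i, b in enumerate(pts[0])]
                    for p in pts[1:]]
    pts.sort(key=f)
    return pts[0]
```

No gradients are needed, which is why this method (and simulated annealing above it) appear among the book's references on searching surrogates.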
Journal ArticleDOI

A tutorial on support vector regression

TL;DR: This tutorial gives an overview of the basic ideas underlying Support Vector (SV) machines for function estimation, and includes a summary of currently used algorithms for training SV machines, covering both the quadratic programming part and advanced methods for dealing with large datasets.
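Two ingredients of SV regression are easy to illustrate in isolation: the epsilon-insensitive loss, which charges nothing for residuals inside a tube of half-width eps, and a kernel (here Gaussian/RBF) giving an inner product in an implicit feature space. A minimal sketch with illustrative parameter values:

```python
import math

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Epsilon-insensitive loss: residuals inside the +/- eps tube cost
    nothing; outside it, cost grows linearly with the excess."""
    return sum(max(abs(t - p) - eps, 0.0) for t, p in zip(y_true, y_pred))

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two points given as sequences."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))
```

The flat region of the loss is what produces sparse solutions: only points on or outside the tube become support vectors.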
Journal ArticleDOI

A comparison of three methods for selecting values of input variables in the analysis of output from a computer code

TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and they are shown to be improvements over simple sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
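One of the sampling plans compared in this paper, Latin hypercube sampling, splits each input axis into n equal strata and samples each stratum exactly once per dimension. A minimal sketch on the unit hypercube:

```python
import random

def latin_hypercube(n, k, seed=0):
    """n points in k dimensions on [0, 1)^k: each axis is divided into n
    equal strata and each stratum is sampled exactly once."""
    rng = random.Random(seed)
    cols = []
    for _ in range(k):
        perm = list(range(n))
        rng.shuffle(perm)                 # random stratum order per axis
        cols.append([(p + rng.random()) / n for p in perm])
    return [[cols[j][i] for j in range(k)] for i in range(n)]
```

Projected onto any single axis, the points cover all n strata, which is the variance-reduction property the paper establishes relative to simple random sampling.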
Journal ArticleDOI

Ridge regression: biased estimation for nonorthogonal problems

TL;DR: In this paper, an estimation procedure based on adding small positive quantities to the diagonal of X′X is proposed, together with the ridge trace, a method for showing in two dimensions the effects of nonorthogonality.
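The proposed estimator replaces the least-squares solution (X′X)⁻¹X′y with (X′X + kI)⁻¹X′y. A minimal sketch for the two-feature case, solved with Cramer's rule (the function name and shapes are illustrative):

```python
def ridge_2feature(X, y, k=0.1):
    """Ridge estimate (X'X + kI)^{-1} X'y for rows X with exactly two
    features, via Cramer's rule on the 2x2 normal equations."""
    a = sum(r[0] * r[0] for r in X) + k     # (X'X + kI)[0][0]
    b = sum(r[0] * r[1] for r in X)         # (X'X)[0][1] = (X'X)[1][0]
    d = sum(r[1] * r[1] for r in X) + k     # (X'X + kI)[1][1]
    u = sum(r[0] * t for r, t in zip(X, y))  # (X'y)[0]
    v = sum(r[1] * t for r, t in zip(X, y))  # (X'y)[1]
    det = a * d - b * b
    return [(u * d - v * b) / det, (a * v - b * u) / det]
```

With k = 0 this reduces to ordinary least squares; increasing k shrinks the coefficients toward zero, trading bias for variance when X is nonorthogonal. The same diagonal addition reappears in the book's note on matrix ill-conditioning (Section 6.3).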