scispace - formally typeset
Author

Alexander I. J. Forrester

Other affiliations: Rolls-Royce Holdings
Bio: Alexander I. J. Forrester is an academic researcher from the University of Southampton. The author has contributed to research on topics including kriging and surrogate modelling. The author has an h-index of 22, has co-authored 62 publications, and has received 6,927 citations. Previous affiliations of Alexander I. J. Forrester include Rolls-Royce Holdings.


Papers
Book
02 Sep 2008
TL;DR: A textbook on surrogate-based engineering design and optimization, covering sampling plans, the construction of surrogates (polynomial models, radial basis functions, kriging, support vector regression), infill criteria for exploring and exploiting them, and advanced topics including constraints, noisy data, gradient exploitation, multi-fidelity analysis, and multiple objectives.
Abstract: Preface. About the Authors. Foreword. Prologue.
Part I: Fundamentals.
1. Sampling Plans. 1.1 The 'Curse of Dimensionality' and How to Avoid It. 1.2 Physical versus Computational Experiments. 1.3 Designing Preliminary Experiments (Screening). 1.3.1 Estimating the Distribution of Elementary Effects. 1.4 Designing a Sampling Plan. 1.4.1 Stratification. 1.4.2 Latin Squares and Random Latin Hypercubes. 1.4.3 Space-filling Latin Hypercubes. 1.4.4 Space-filling Subsets. 1.5 A Note on Harmonic Responses. 1.6 Some Pointers for Further Reading. References.
2. Constructing a Surrogate. 2.1 The Modelling Process. 2.1.1 Stage One: Preparing the Data and Choosing a Modelling Approach. 2.1.2 Stage Two: Parameter Estimation and Training. 2.1.3 Stage Three: Model Testing. 2.2 Polynomial Models. 2.2.1 Example One: Aerofoil Drag. 2.2.2 Example Two: a Multimodal Testcase. 2.2.3 What About the k-variable Case? 2.3 Radial Basis Function Models. 2.3.1 Fitting Noise-Free Data. 2.3.2 Radial Basis Function Models of Noisy Data. 2.4 Kriging. 2.4.1 Building the Kriging Model. 2.4.2 Kriging Prediction. 2.5 Support Vector Regression. 2.5.1 The Support Vector Predictor. 2.5.2 The Kernel Trick. 2.5.3 Finding the Support Vectors. 2.5.4 Finding . 2.5.5 Choosing C and epsilon. 2.5.6 Computing epsilon: ν-SVR. 2.6 The Big(ger) Picture. References.
3. Exploring and Exploiting a Surrogate. 3.1 Searching the Surrogate. 3.2 Infill Criteria. 3.2.1 Prediction Based Exploitation. 3.2.2 Error Based Exploration. 3.2.3 Balanced Exploitation and Exploration. 3.2.4 Conditional Likelihood Approaches. 3.2.5 Other Methods. 3.3 Managing a Surrogate Based Optimization Process. 3.3.1 Which Surrogate for What Use? 3.3.2 How Many Sample Plan and Infill Points? 3.3.3 Convergence Criteria. 3.3.4 Search of the Vibration Isolator Geometry Feasibility Using Kriging Goal Seeking. References.
Part II: Advanced Concepts.
4. Visualization. 4.1 Matrices of Contour Plots. 4.2 Nested Dimensions. Reference.
5. Constraints. 5.1 Satisfaction of Constraints by Construction. 5.2 Penalty Functions. 5.3 Example Constrained Problem. 5.3.1 Using a Kriging Model of the Constraint Function. 5.3.2 Using a Kriging Model of the Objective Function. 5.4 Expected Improvement Based Approaches. 5.4.1 Expected Improvement With Simple Penalty Function. 5.4.2 Constrained Expected Improvement. 5.5 Missing Data. 5.5.1 Imputing Data for Infeasible Designs. 5.6 Design of a Helical Compression Spring Using Constrained Expected Improvement. 5.7 Summary. References.
6. Infill Criteria With Noisy Data. 6.1 Regressing Kriging. 6.2 Searching the Regression Model. 6.2.1 Re-Interpolation. 6.2.2 Re-Interpolation With Conditional Likelihood Approaches. 6.3 A Note on Matrix Ill-Conditioning. 6.4 Summary. References.
7. Exploiting Gradient Information. 7.1 Obtaining Gradients. 7.1.1 Finite Differencing. 7.1.2 Complex Step Approximation. 7.1.3 Adjoint Methods and Algorithmic Differentiation. 7.2 Gradient-enhanced Modelling. 7.3 Hessian-enhanced Modelling. 7.4 Summary. References.
8. Multi-fidelity Analysis. 8.1 Co-Kriging. 8.2 One-variable Demonstration. 8.3 Choosing Xc and Xe. 8.4 Summary. References.
9. Multiple Design Objectives. 9.1 Pareto Optimization. 9.2 Multi-objective Expected Improvement. 9.3 Design of the Nowacki Cantilever Beam Using Multi-objective, Constrained Expected Improvement. 9.4 Design of a Helical Compression Spring Using Multi-objective, Constrained Expected Improvement. 9.5 Summary. References.
Appendix: Example Problems. A.1 One-Variable Test Function. A.2 Branin Test Function. A.3 Aerofoil Design. A.4 The Nowacki Beam. A.5 Multi-objective, Constrained Optimal Design of a Helical Compression Spring. A.6 Novel Passive Vibration Isolator Feasibility. References. Index.
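The sampling-plan machinery of Chapter 1 is simple to sketch. As an illustration (not the book's own code), a random Latin hypercube in the sense of Section 1.4.2 places exactly one point in each equal-width stratum of every dimension:

```python
import numpy as np

def random_latin_hypercube(n, k, seed=None):
    """n-point random Latin hypercube in [0, 1)^k: each of the n
    equal-width strata of every dimension receives exactly one point."""
    rng = np.random.default_rng(seed)
    # Independently permute the stratum indices 0..n-1 per dimension,
    strata = np.column_stack([rng.permutation(n) for _ in range(k)])
    # then jitter each point uniformly within its stratum.
    return (strata + rng.random((n, k))) / n

X = random_latin_hypercube(10, 2, seed=0)
```

Projecting X onto any single axis gives one sample per tenth of the range; this stratification property is what the space-filling variants of Sections 1.4.3-1.4.4 then optimize.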

2,335 citations

Journal ArticleDOI
TL;DR: The present state of the art of constructing surrogate models and their use in optimization strategies is reviewed, and extensive use of pictorial examples is made to give guidance as to each method's strengths and weaknesses.

1,919 citations

MonographDOI
18 Jul 2008
TL;DR: A monograph record of the same surrogate-modelling textbook listed above: sampling plans, surrogate construction, and infill strategies for exploring and exploiting surrogates, with advanced chapters on constraints, noisy data, gradients, multi-fidelity analysis, and multiple objectives.

1,447 citations

Journal ArticleDOI
TL;DR: This paper demonstrates the application of correlated Gaussian process based approximations to optimization where multiple levels of analysis are available, using an extension to the geostatistical method of co-kriging.
Abstract: This paper demonstrates the application of correlated Gaussian process based approximations to optimization where multiple levels of analysis are available, using an extension to the geostatistical method of co-kriging. An exchange algorithm is used to choose which points of the search space to sample within each level of analysis. The derivation of the co-kriging equations is presented in an intuitive manner, along with a new variance estimator to account for varying degrees of computational ‘noise’ in the multiple levels of analysis. A multi-fidelity wing optimization is used to demonstrate the methodology.
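The paper's co-kriging machinery is beyond a few lines, but the multi-fidelity correction it builds on can be sketched: estimate a scale and a discrepancy that map the cheap analysis onto the sparse expensive samples. The one-variable test pair below is the widely used Forrester multi-fidelity example; the exact constants and the linear discrepancy form are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Widely used one-variable multi-fidelity test pair (illustrative
# choice, not taken from the paper itself):
def f_exp(x):    # expensive, high-fidelity response
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

def f_cheap(x):  # cheap, low-fidelity approximation
    return 0.5 * f_exp(x) + 10.0 * (x - 0.5) - 5.0

# Sparse expensive samples; the cheap code can be run anywhere.
X_e = np.array([0.0, 0.4, 0.6, 1.0])

# Simplest correction (a stand-in for co-kriging): find a scale rho
# and a linear discrepancy a*x + b such that, at the expensive
# points, rho * f_cheap(x) + a*x + b matches f_exp(x).
A = np.column_stack([f_cheap(X_e), X_e, np.ones_like(X_e)])
rho, a, b = np.linalg.lstsq(A, f_exp(X_e), rcond=None)[0]

def surrogate(x):
    """Corrected cheap model, usable wherever the cheap code runs."""
    return rho * f_cheap(x) + a * x + b
```

In this test pair the cheap and expensive responses happen to be exactly linearly related, so the corrected model reproduces f_exp; in practice the residual discrepancy would itself be modelled, e.g. by kriging.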

799 citations

Journal ArticleDOI
TL;DR: This paper reviews how the kriging interpolation can be modified to filter out numerical noise and shows how to adjust the estimate of the error in a kriged prediction so that previous approaches to optimization, such as the method of maximizing the expected improvement, continue to work effectively.
Abstract: of functions calculated by long running computer codes. The literature in this area commonly assumes that the objective function is a smooth, deterministic function of the inputs. Yet it is well known that many computer simulations, especially those of computational fluid and structural dynamics codes, often display what one might call numerical noise: rather than lying on a smooth curve, results appear to contain a random scatter about a smooth trend. This paper extends previous optimization methods based on the interpolating method of kriging to the case of such noisy computer experiments. Firstly, we review how the kriging interpolation can be modified to filter out numerical noise. We then show how to adjust the estimate of the error in a kriging prediction so that previous approaches to optimization, such as the method of maximizing the expected improvement, continue to work effectively. We introduce the problems associated with noise and demonstrate our approach using computational fluid dynamics based problems.
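The noise-filtering step can be sketched by adding a regression constant (a "nugget") to the diagonal of the correlation matrix. This is a minimal numpy illustration under simplifying assumptions, not the paper's formulation (no trend term, no likelihood-based hyperparameter tuning, no re-interpolated error estimate):

```python
import numpy as np

def krige(X, y, Xs, theta=5.0, lam=0.0):
    """One-variable kriging-style predictor with correlation
    exp(-theta * |d|) and a regression constant ('nugget') lam on the
    diagonal. lam = 0 interpolates every data point; lam > 0 lets the
    predictor regress through, i.e. filter, numerical noise."""
    def psi(A, B):
        return np.exp(-theta * np.abs(A[:, None] - B[None, :]))
    w = np.linalg.solve(psi(X, X) + lam * np.eye(len(X)), y)
    return psi(Xs, X) @ w

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 10)
y = np.sin(6.0 * X) + 0.05 * rng.standard_normal(10)  # noisy response

interp = krige(X, y, X, lam=0.0)   # reproduces the scatter exactly
smooth = krige(X, y, X, lam=0.5)   # regresses through the scatter
```

The interpolating predictor honours every noisy observation, while the nugget version no longer passes through the data, which is precisely the behaviour the error estimate must then be adjusted for.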

225 citations


Cited by
Journal ArticleDOI

6,278 citations


Journal ArticleDOI
08 Aug 2019
TL;DR: A comprehensive overview and analysis of the most recent research in machine learning principles, algorithms, descriptors, and databases in materials science, with proposed solutions and future research paths for various challenges in computational materials science.
Abstract: One of the most exciting tools that have entered the material science toolbox in recent years is machine learning. This collection of statistical methods has already proved to be capable of considerably speeding up both fundamental and applied research. At present, we are witnessing an explosion of works that develop and apply machine learning to solid-state systems. We provide a comprehensive overview and analysis of the most recent research in this topic. As a starting point, we introduce machine learning principles, algorithms, descriptors, and databases in materials science. We continue with the description of different machine learning approaches for the discovery of stable materials and the prediction of their crystal structure. Then we discuss research in numerous quantitative structure–property relationships and various approaches for the replacement of first-principle methods by machine learning. We review how active learning and surrogate-based optimization can be applied to improve the rational design process and related examples of applications. Two major questions are always the interpretability of and the physical understanding gained from machine learning models. We consider therefore the different facets of interpretability and their importance in materials science. Finally, we propose solutions and future research paths for various challenges in computational materials science.

1,301 citations