Journal ArticleDOI

Improving variable-fidelity surrogate modeling via gradient-enhanced kriging and a generalized hybrid bridge function

TL;DR: It is shown that the gradient-enhanced generalized hybrid bridge function (GHBF) proposed in this paper is very promising and can be used to significantly improve the efficiency, accuracy and robustness of variable-fidelity modeling (VFM) in the context of aero-loads prediction.
About: This article was published in Aerospace Science and Technology on 2013-03-01 and has received 313 citations to date. The article focuses on the topics: Surrogate model & Kriging.
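
For orientation, a bridge function corrects a cheap low-fidelity model using the differences between high- and low-fidelity evaluations at common sample points. Below is a minimal sketch of a hybrid (scaling-plus-additive) correction of this kind, with hypothetical toy fidelities; the paper's actual GHBF is more general, and it models the bridge with gradient-enhanced kriging, for which a plain RBF interpolant stands in here.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical toy fidelities for illustration (not from the paper).
def y_lf(x):  # cheap low-fidelity model
    return 0.5 * np.sin(8 * x) + 0.4 * x

def y_hf(x):  # expensive high-fidelity model
    return np.sin(8 * x) + 0.5 * x

x_hf = np.linspace(0.0, 1.0, 5)  # few expensive high-fidelity samples

# Scaling factor rho fitted by least squares, then an additive bridge
# (discrepancy) surrogate on the residuals at the high-fidelity points.
rho = np.linalg.lstsq(y_lf(x_hf)[:, None], y_hf(x_hf), rcond=None)[0][0]
bridge = RBFInterpolator(x_hf[:, None], y_hf(x_hf) - rho * y_lf(x_hf))

def y_vfm(x):
    # corrected low-fidelity prediction: rho * y_lf(x) + bridge(x)
    return rho * y_lf(x) + bridge(x[:, None])

x_test = np.linspace(0.0, 1.0, 101)
print(np.max(np.abs(y_vfm(x_test) - y_hf(x_test))))  # error of the corrected model
```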
Citations
Journal ArticleDOI
TL;DR: In many situations across computational science and engineering, multiple computational models with varying evaluation costs are available to describe a system of interest.
Abstract: In many situations across computational science and engineering, multiple computational models are available that describe a system of interest. These different models have varying evaluation costs...

678 citations

Journal ArticleDOI
TL;DR: It is observed that hierarchical kriging provides a more reasonable mean-squared-error estimation than traditional cokriging and can be applied to the efficient aerodynamic analysis and shape optimization of aircraft or anywhere where computer codes of varying fidelity are in use.
Abstract: The efficiency of building a surrogate model for the output of a computer code can be dramatically improved via variable-fidelity surrogate modeling techniques. In this article, a hierarchical kriging model is proposed and used for variable-fidelity surrogate modeling problems. Here, hierarchical kriging refers to a surrogate model of a high-fidelity function that uses a kriging model of a sampled lower-fidelity function as a model trend. As a consequence, the variation in the lower-fidelity data is mapped to the high-fidelity data, and a more accurate surrogate model for the high-fidelity function is obtained. A self-contained derivation of the hierarchical kriging model is presented. The proposed method is demonstrated with an analytical example and used for modeling the aerodynamic data of an RAE 2822 airfoil and an industrial transport aircraft configuration. The numerical examples show that it is efficient, accurate, and robust. It is also observed that hierarchical kriging provides a more reasonable mean-squared-error estimation than traditional cokriging. It can be applied to the efficient aerodynamic analysis and shape optimization of aircraft or any other research areas where computer codes of varying fidelity are in use.
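
As a minimal illustration of the definition above, the sketch below fits ordinary kriging to low-fidelity samples and then reuses its prediction as the model trend of the high-fidelity kriging. It is a sketch only, assuming fixed Gaussian-correlation hyperparameters and hypothetical toy fidelities; the published method also estimates hyperparameters by maximum likelihood and supplies the mean-squared-error estimate discussed here.

```python
import numpy as np

def corr(A, B, theta=10.0):
    # Gaussian correlation between 1-D sample sets A (n,) and B (m,)
    return np.exp(-theta * (A[:, None] - B[None, :]) ** 2)

def fit(X, y, trend):
    # Kriging with a generalized-least-squares trend coefficient beta;
    # trend(X) returns the (n, 1) regression basis.
    R = corr(X, X) + 1e-10 * np.eye(len(X))
    F = trend(X)
    Ri = np.linalg.inv(R)
    beta = np.linalg.solve(F.T @ Ri @ F, F.T @ Ri @ y)
    return {"X": X, "beta": beta, "w": Ri @ (y - F @ beta), "trend": trend}

def predict(m, x):
    return m["trend"](x) @ m["beta"] + corr(x, m["X"]) @ m["w"]

# Hypothetical toy fidelities (for illustration only).
f_hi = lambda x: (6 * x - 2) ** 2 * np.sin(12 * x - 4)
f_lo = lambda x: 0.5 * f_hi(x) + 10 * (x - 0.5)

X_lo, X_hi = np.linspace(0, 1, 11), np.array([0.0, 0.4, 0.6, 1.0])
m_lo = fit(X_lo, f_lo(X_lo), trend=lambda x: np.ones((len(x), 1)))  # ordinary kriging of LF data
m_hi = fit(X_hi, f_hi(X_hi),
           trend=lambda x: predict(m_lo, x)[:, None])               # LF prediction as HF trend

x = np.linspace(0, 1, 201)
print(np.max(np.abs(predict(m_hi, x) - f_hi(x))))  # hierarchical-kriging error
```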

280 citations


Cites background or methods from "Improving variable-fidelity surroga..."

  • ...In [19], this generalized hybrid bridge function was combined with gradient-enhanced kriging (GEK), with the gradients computed by the adjoint method [20]....

  • ...More recently, a more general method called the generalized hybrid bridge function was proposed in [19] and applied in the context of aero data for loads prediction of aircraft....

  • ...readily extended to three (or more) levels of fidelity (see the Appendix) and to include gradient information (see [19])....

Journal ArticleDOI
TL;DR: This article categorizes, reviews, and analyzes the state-of-the-art single-/multi-response adaptive sampling approaches for global metamodeling in support of simulation-based engineering design, and discusses some important issues that affect the success of an adaptive sampling approach.
Abstract: Metamodeling is becoming a rather popular means to approximate the expensive simulations in today's complex engineering design problems, since accurate metamodels can bring in a lot of benefits. The metamodel accuracy, however, heavily depends on the locations of the observed points. Adaptive sampling, as its name suggests, places more points in regions of interest by learning the information from previous data and metamodels. Consequently, compared to traditional space-filling sampling approaches, adaptive sampling has great potential to build more accurate metamodels with fewer points (simulations), thereby gaining increasing attention and interest from both practitioners and academicians in various fields. Noticing the lack of a needed review of adaptive sampling for global metamodeling in the literature, this article categorizes, reviews, and analyzes the state-of-the-art single-/multi-response adaptive sampling approaches for global metamodeling in support of simulation-based engineering design. In addition, we review and discuss some important issues that affect the success of an adaptive sampling approach and provide brief remarks on adaptive sampling for other purposes. Last, challenges and future research directions are provided and discussed.
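
A minimal sketch of the adaptive-sampling loop described above, using maximum predictive variance as the infill criterion (one of many criteria the review categorizes); the 1-D test function and the scikit-learn Gaussian process are assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.3 * np.cos(9 * x)  # hypothetical expensive simulation

X = np.array([[0.0], [0.5], [1.0]])      # small initial space-filling design
y = f(X).ravel()
cand = np.linspace(0, 1, 201)[:, None]   # candidate pool for the infill search

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(X, y)
    _, std = gp.predict(cand, return_std=True)
    x_new = cand[np.argmax(std)]         # sample where the metamodel is least certain
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new)[0])

print(len(X), "points placed adaptively")
```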

276 citations


Cites methods from "Improving variable-fidelity surroga..."

  • ..., scaling function based modeling (Han et al. 2013; Zhou et al. 2017) and Bayesian...

  • ...The MFM frameworks, e.g., scaling function based modeling (Han et al. 2013; Zhou et al. 2017) and Bayesian multi-fidelity modeling (also called Co-Kriging) (Kennedy and O'Hagan 2000, 2001; Forrester et al. 2007; Qian and Wu 2008), have gained popularity in multidisciplinary design, optimization and…

Journal ArticleDOI
TL;DR: It is found that time savings are highly problem dependent and that the surveyed MFM methods provided time savings of up to 90%; guidelines are included for authors to present their MFM savings in a way that is useful to future MFM users.
Abstract: Simulations are often computationally expensive, and the need for multiple realizations, as in uncertainty quantification or optimization, makes surrogate models an attractive option. For expensive high-fidelity models (HFMs), however, even performing the number of simulations needed for fitting a surrogate may be too expensive. Inexpensive but less accurate low-fidelity models (LFMs) are often also available. Multi-fidelity models (MFMs) combine HFMs and LFMs in order to achieve accuracy at a reasonable cost. With the increasing popularity of MFMs in mind, the aim of this paper is to summarize the state of the art of MFM trends. For this purpose, publications in this field are classified based on application, surrogate selection if any, the difference between fidelities, the method used to combine these fidelities, the field of application, and the year published. Available methods of combining fidelities are also reviewed, focusing our attention especially on multi-fidelity surrogate models, in which fidelities are combined inside a surrogate model. Computation time savings are usually the reason for using MFMs, hence it is important to properly report the achieved savings. Unfortunately, we find that many papers do not present sufficient information to determine these savings. Therefore, the paper also includes guidelines for authors to present their MFM savings in a way that is useful to future MFM users. Based on papers that provided enough information, we find that time savings are highly problem dependent and that the MFM methods we surveyed provided time savings of up to 90%.
Keywords: Multi-fidelity, Variable-complexity, Variable-fidelity, Surrogate models, Optimization, Uncertainty quantification, Review, Survey

217 citations

Journal ArticleDOI
TL;DR: The surrogate modeling toolbox (SMT) is an open-source Python package containing a collection of surrogate modeling methods, sampling techniques, and benchmarking functions; it provides a library of surrogate models that is simple to use and facilitates the implementation of additional methods.
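
A minimal usage sketch, assuming the KRG kriging surrogate and the set_training_values / train / predict_values API documented for recent SMT releases (check the project documentation for the version in use):

```python
import numpy as np
from smt.surrogate_models import KRG

xt = np.linspace(0.0, 4.0, 8)[:, None]   # training inputs
yt = np.sin(xt)                          # training outputs

sm = KRG(theta0=[1e-2])                  # initial correlation hyperparameter
sm.set_training_values(xt, yt)
sm.train()

xe = np.linspace(0.0, 4.0, 50)[:, None]
ye = sm.predict_values(xe)        # kriging mean prediction
var = sm.predict_variances(xe)    # estimated prediction variance
```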

195 citations


Cites methods from "Improving variable-fidelity surroga..."

  • ...It then uses the differences between the high- and low-fidelity evaluations to construct a bridge function that corrects the low-fidelity model [21]....

References
Journal ArticleDOI
TL;DR: This paper presents a framework for computer experiments that models deterministic code output as the realization of a stochastic process, providing a statistical basis for predicting output from training data and for criteria-based designs of computer experiments.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic: rerunning the code with the same inputs gives identical observations. Often, the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.
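
Under this stochastic-process view, the resulting best linear unbiased predictor and its uncertainty estimate take the standard ordinary-kriging form (notation assumed here, not taken from the abstract):

```latex
\hat{y}(\mathbf{x}) = \hat{\mu} + \mathbf{r}(\mathbf{x})^{\mathsf{T}} \mathbf{R}^{-1} (\mathbf{y} - \hat{\mu}\mathbf{1}),
\qquad
\hat{\mu} = \frac{\mathbf{1}^{\mathsf{T}} \mathbf{R}^{-1} \mathbf{y}}{\mathbf{1}^{\mathsf{T}} \mathbf{R}^{-1} \mathbf{1}},
\qquad
s^{2}(\mathbf{x}) = \sigma^{2} \left( 1 - \mathbf{r}^{\mathsf{T}} \mathbf{R}^{-1} \mathbf{r}
+ \frac{\left( 1 - \mathbf{1}^{\mathsf{T}} \mathbf{R}^{-1} \mathbf{r} \right)^{2}}{\mathbf{1}^{\mathsf{T}} \mathbf{R}^{-1} \mathbf{1}} \right),
```

where R is the correlation matrix of the sampled inputs, r(x) is the correlation vector between a new point x and the samples, and s²(x) is the "estimate of uncertainty" the abstract refers to.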

6,583 citations

Journal ArticleDOI
TL;DR: In this article, the author presents geostatistics, a new science providing a sound approach to the estimation of ore grades and reserves and of the errors in those estimates.
Abstract: Knowledge of ore grades and ore reserves as well as error estimation of these values, is fundamental for mining engineers and mining geologists. Until now no appropriate scientific approach to those estimation problems has existed: geostatistics, the principles of which are summarized in this paper, constitutes a new science leading to such an approach. The author criticizes classical statistical methods still in use, and shows some of the main results given by geostatistics. Any ore deposit evaluation as well as proper decision of starting mining operations should be preceded by a geostatistical investigation which may avoid economic failures.

4,203 citations


"Improving variable-fidelity surroga..." refers methods in this paper

  • ...Kriging is a statistical interpolation method suggested by Krige [14] in 1951 and mathematically formulated by Matheron [15] in 1963....

Journal Article
TL;DR: In this paper, fundamental concepts in the application of statistics to mine valuation on the Witwatersrand are discussed, and general conclusions are drawn regarding the application of the lognormal curve to the frequency distribution of gold values.
Abstract: Certain fundamental concepts in the application of statistics to mine valuation on the Witwatersrand are discussed, and general conclusions are drawn regarding the application of the lognormal curve to the frequency distribution of gold values. An indication is given of the reliability of present valuation methods on the Rand. It is shown that the existing over- and under-valuation of blocks of ore listed as high-grade and low-grade, respectively, can be explained statistically. Suggestions are made for the elimination of such errors and for the improvement of the general standard of mine valuation by the use of statistical theory.

2,353 citations

Journal ArticleDOI
TL;DR: The present state of the art of constructing surrogate models and their use in optimization strategies is reviewed, and extensive use is made of pictorial examples to give guidance as to each method's strengths and weaknesses.

1,919 citations


"Improving variable-fidelity surroga..." refers background in this paper

  • ...is demonstrated for an analytic problem taken from [21]....

Journal ArticleDOI
TL;DR: This paper presents an analytically robust, globally convergent approach to managing the use of approximation models of varying fidelity in optimization; based on the trust-region idea from nonlinear programming, it is shown to be provably convergent to a solution of the original high-fidelity problem.
Abstract: This paper presents an analytically robust, globally convergent approach to managing the use of approximation models of various fidelity in optimization. By robust global behavior we mean the mathematical assurance that the iterates produced by the optimization algorithm, started at an arbitrary initial iterate, will converge to a stationary point or local optimizer for the original problem. The approach we present is based on the trust region idea from nonlinear programming and is shown to be provably convergent to a solution of the original high-fidelity problem. The proposed method for managing approximations in engineering optimization suggests ways to decide when the fidelity, and thus the cost, of the approximations might be fruitfully increased or decreased in the course of the optimization iterations. The approach is quite general. We make no assumptions on the structure of the original problem, in particular, no assumptions of convexity and separability, and place only mild requirements on the approximations. The approximations used in the framework can be of any nature appropriate to an application; for instance, they can be represented by analyses, simulations, or simple algebraic models. This paper introduces the approach and outlines the convergence analysis.
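
The management logic the abstract outlines (compare the reduction the approximation predicts against the reduction the high-fidelity function actually delivers, then grow or shrink the trust region accordingly) can be sketched as below. The thresholds, the 1-D dense search, and the toy functions are assumptions for illustration; the provably convergent framework in the paper places formal requirements on the approximations, of which the simple first-order correction here is only one instance.

```python
import numpy as np

def d(f, t, h=1e-6):
    # central finite difference used for the first-order correction
    return (f(t + h) - f(t - h)) / (2 * h)

def tr_manage(f_hi, f_lo, x, radius, max_iter=40):
    for _ in range(max_iter):
        # additive correction so the model matches f_hi in value and
        # slope at the current iterate x (first-order consistency)
        a, b = f_hi(x) - f_lo(x), d(f_hi, x) - d(f_lo, x)
        model = lambda t, x0=x: f_lo(t) + a + b * (t - x0)
        # minimize the corrected model inside the trust region (dense search)
        cand = x + np.linspace(-radius, radius, 201)
        s = cand[np.argmin(model(cand))]
        pred_red = model(x) - model(s)   # reduction the model promises
        act_red = f_hi(x) - f_hi(s)      # reduction actually achieved
        rho = act_red / pred_red if pred_red > 0 else -1.0
        if rho > 0.75:
            x, radius = s, 2.0 * radius  # good agreement: accept step, expand region
        elif rho > 0.1:
            x = s                        # acceptable agreement: accept, keep region
        else:
            radius *= 0.5                # poor agreement: shrink region and retry
        if radius < 1e-8:
            break
    return x

# Hypothetical example: a quadratic low-fidelity model of a quartic objective,
# whose true minimizer is t = 0.5.
f_hi = lambda t: (t - 1.0) ** 4 + 0.5 * t ** 2
f_lo = lambda t: 1.2 * (t - 0.8) ** 2
print(tr_manage(f_hi, f_lo, x=3.0, radius=1.0))  # approaches ~0.5
```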

651 citations