Open Access · Journal Article · DOI

Review of multi-fidelity models

TL;DR: Time savings are found to be highly problem dependent, with surveyed MFM methods providing savings of up to 90%; guidelines are also included for authors to present their MFM savings in a way that is useful to future MFM users.
Abstract
Simulations are often computationally expensive and the need for multiple realizations, as in uncertainty quantification or optimization, makes surrogate models an attractive option. For expensive high-fidelity models (HFMs), however, even performing the number of simulations needed for fitting a surrogate may be too expensive. Inexpensive but less accurate low-fidelity models (LFMs) are often also available. Multi-fidelity models (MFMs) combine HFMs and LFMs in order to achieve accuracy at a reasonable cost. With the increasing popularity of MFMs in mind, the aim of this paper is to summarize the state-of-the-art of MFM trends. For this purpose, publications in this field are classified based on application, surrogate selection if any, the difference between fidelities, the method used to combine these fidelities, the field of application and the year published. Available methods of combining fidelities are also reviewed, focusing our attention especially on multi-fidelity surrogate models in which fidelities are combined inside a surrogate model. Computation time savings are usually the reason for using MFMs, hence it is important to properly report the achieved savings. Unfortunately, we find that many papers do not present sufficient information to determine these savings. Therefore, the paper also includes guidelines for authors to present their MFM savings in a way that is useful to future MFM users. Based on papers that provided enough information, we find that time savings are highly problem dependent and that MFM methods we surveyed provided time savings up to 90%.

Keywords: Multi-fidelity, Variable-complexity, Variable-fidelity, Surrogate models, Optimization, Uncertainty quantification, Review, Survey
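As a minimal illustration of the multi-fidelity surrogate idea described in the abstract, the sketch below combines a cheap LFM with a few HFM samples via an additive correction, mf(x) = rho * lf(x) + delta(x). The functions and the polynomial form of the discrepancy are illustrative choices, not taken from the paper:

```python
import numpy as np

# Toy 1-D problem: an expensive "high-fidelity" function and a cheap,
# biased "low-fidelity" approximation (both illustrative).
def hf(x):  # high-fidelity model (expensive to evaluate in practice)
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def lf(x):  # low-fidelity model (cheap but less accurate)
    return 0.5 * hf(x) + 10 * (x - 0.5) - 5

# Additive-correction multi-fidelity surrogate:
#   mf(x) = rho * lf(x) + delta(x),
# where the scale rho and a quadratic discrepancy delta are fit by least
# squares on only a handful of HF samples.
x_hf = np.linspace(0.0, 1.0, 5)  # scarce HF data
A = np.column_stack([lf(x_hf), np.ones_like(x_hf), x_hf, x_hf ** 2])
coef, *_ = np.linalg.lstsq(A, hf(x_hf), rcond=None)

def mf(x):
    return coef[0] * lf(x) + coef[1] + coef[2] * x + coef[3] * x ** 2

# Compare accuracy of the raw LFM and the corrected surrogate.
x_test = np.linspace(0.0, 1.0, 101)
rmse_lf = np.sqrt(np.mean((lf(x_test) - hf(x_test)) ** 2))
rmse_mf = np.sqrt(np.mean((mf(x_test) - hf(x_test)) ** 2))
print(rmse_mf < rmse_lf)  # the corrected surrogate should beat the raw LFM
```

Here five HF evaluations suffice because the discrepancy between the two fidelities happens to be low-order; in general the required number of HF samples depends on how the fidelities differ, which is exactly the problem dependence the review emphasizes.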


Citations
Journal Article · DOI

A composite neural network that learns from multi-fidelity data: Application to function approximation and inverse PDE problems

TL;DR: In this paper, the authors propose a composite neural network (NN) that can be trained on multi-fidelity data. It comprises three NNs: the first is trained on the low-fidelity data and is coupled to two high-fidelity NNs, one with activation functions and one without, in order to discover and exploit nonlinear and linear correlations, respectively, between the low-fidelity and high-fidelity data.
Journal Article · DOI

A survey of adaptive sampling for global metamodeling in support of simulation-based complex engineering design

TL;DR: This article categorizes, reviews, and analyzes the state-of-the-art single-/multi-response adaptive sampling approaches for global metamodeling in support of simulation-based engineering design and discusses some important issues that affect the success of an adaptive sampling approach.
Posted Content

Hyper-Parameter Optimization: A Review of Algorithms and Applications

Tong Yu, +1 more
12 Mar 2020
TL;DR: A review of the most essential topics in HPO, including the key hyper-parameters related to model training and structure, a comparison of optimization algorithms, and prominent approaches for model evaluation with limited computational resources.
Journal Article · DOI

The Challenge of Machine Learning in Space Weather: Nowcasting and Forecasting

TL;DR: The recurring themes throughout the review are the need to shift the forecasting paradigm to a probabilistic approach focused on the reliable assessment of uncertainties, and the combination of physics-based and machine learning approaches, known as gray box.
Journal Article · DOI

Remarks on multi-output Gaussian process regression

TL;DR: This article investigates the state-of-the-art multi-output Gaussian processes (MOGPs) that can transfer the knowledge across related outputs in order to improve prediction quality and gives some recommendations regarding the usage of MOGPs.
References
Book

Response Surface Methodology: Process and Product Optimization Using Designed Experiments

TL;DR: Using a practical approach, this book discusses two-level factorial and fractional factorial designs, empirical modeling with regression techniques with a focus on response surface methodology, mixture experiments, and robust design techniques.
Journal Article · DOI

The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations

TL;DR: This work represents the stochastic processes with an optimum trial basis from the Askey family of orthogonal polynomials that reduces the dimensionality of the system and leads to exponential convergence of the error.
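The polynomial chaos idea above can be sketched concretely for the simplest Wiener–Askey case, a Gaussian input expanded in probabilists' Hermite polynomials. The target function exp(xi) and the truncation order are illustrative choices, not from the cited paper:

```python
from math import factorial

import numpy as np
from numpy.polynomial import hermite_e as He

# Hermite polynomial chaos for xi ~ N(0,1): expand Y = exp(xi) as
#   Y ≈ sum_k c_k He_k(xi),  c_k = E[Y He_k(xi)] / k!,
# using orthogonality E[He_j(xi) He_k(xi)] = k! δ_jk.
nodes, weights = He.hermegauss(30)       # Gauss-Hermite quadrature, weight e^{-x^2/2}
weights = weights / np.sqrt(2 * np.pi)   # normalize to the standard normal density

order = 8
coeffs = np.array([
    np.sum(weights * np.exp(nodes) * He.hermeval(nodes, [0] * k + [1])) / factorial(k)
    for k in range(order + 1)
])

# Evaluate the truncated expansion against the exact function.
xi = np.linspace(-2.0, 2.0, 101)
pc = sum(coeffs[k] * He.hermeval(xi, [0] * k + [1]) for k in range(order + 1))
max_err = np.max(np.abs(pc - np.exp(xi)))
print(max_err)  # small, and it shrinks rapidly as the order grows
```

For this smooth function the truncation error decays factorially with the order, which is the exponential convergence the TL;DR refers to.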
Journal Article · DOI

Principles of geostatistics

G. Matheron
01 Dec 1963
TL;DR: In this article, the author presents geostatistics, a new science offering an approach to the estimation of ore grades and reserves.
Journal Article · DOI

Bayesian Calibration of computer models

TL;DR: A Bayesian calibration technique is presented which improves on the traditional approach in two respects and attempts to correct for any inadequacy of the model that is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values.
Book

Monte Carlo methods

TL;DR: This book describes the general nature of Monte Carlo methods and gives a short résumé of statistical terms, including random, pseudorandom, and quasirandom numbers.
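A minimal sketch of the Monte Carlo method the book covers, using pseudorandom numbers to estimate an integral (here the area of a quarter circle, so the estimate targets pi); the example and sample size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)          # pseudorandom number generator
n = 100_000
pts = rng.random((n, 2))                # uniform samples in the unit square
inside = (pts ** 2).sum(axis=1) < 1.0   # hits inside the quarter circle
pi_est = 4.0 * inside.mean()            # Monte Carlo estimate of pi

# The standard error of a plain Monte Carlo estimate shrinks like 1/sqrt(n),
# independently of dimension; quasirandom (low-discrepancy) sequences can
# improve on this rate for smooth integrands.
stderr = 4.0 * inside.std(ddof=1) / np.sqrt(n)
print(pi_est, stderr)
```

The slow 1/sqrt(n) convergence is what motivates the variance-reduction and quasirandom techniques the book surveys.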