
Gaussian process emulator

About: Gaussian process emulator is a research topic. Over its lifetime, 126 publications have been published within this topic, receiving 16,268 citations.


Papers
Journal ArticleDOI
TL;DR: This paper presents a meta-modelling framework for computer experiments, covering prediction of output from training data and criteria-based designs for computer experiments.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic: rerunning the code with the same inputs gives identical observations. Often the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.

6,583 citations
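As a rough illustration of the approach described in this abstract (not the authors' own code), the sketch below treats a cheap stand-in function as the deterministic simulator, fits a Gaussian process to a small design of runs, and returns inexpensive predictions with uncertainty estimates. The test function, design, and kernel choices are assumptions.

```python
# Illustrative sketch only: emulating a hypothetical deterministic code with a GP.
# The test function, design, and kernel settings are assumptions, not from the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_code(x):
    """Stand-in for a deterministic, expensive simulator (hypothetical)."""
    return np.sin(3 * x) + 0.5 * x**2

# A small design of inputs: the only 'expensive' runs of the code.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_code(X_train).ravel()

# Deterministic output: no noise term, so the emulator interpolates the runs.
kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-10, normalize_y=True)
gp.fit(X_train, y_train)

# Cheap predictions at new inputs, with uncertainty estimates.
X_new = np.linspace(0.0, 2.0, 50).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
print(mean[:3], std[:3])
```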

Journal ArticleDOI
TL;DR: A Bayesian calibration technique is presented which improves on the traditional approach in two respects: the predictions allow for all sources of uncertainty, and the method attempts to correct for any inadequacy of the model revealed by a discrepancy between the observed data and the model predictions from even the best‐fitting parameter values.
Abstract: We consider prediction and uncertainty analysis for systems which are approximated using complex mathematical models. Such models, implemented as computer codes, are often generic in the sense that by a suitable choice of some of the model's input parameters the code can be used to predict the behaviour of the system in a variety of specific applications. However, in any specific application the values of necessary parameters may be unknown. In this case, physical observations of the system in the specific context are used to learn about the unknown parameters. The process of fitting the model to the observed data by adjusting the parameters is known as calibration. Calibration is typically effected by ad hoc fitting, and after calibration the model is used, with the fitted input values, to predict the future behaviour of the system. We present a Bayesian calibration technique which improves on this traditional approach in two respects. First, the predictions allow for all sources of uncertainty, including the remaining uncertainty over the fitted parameters. Second, they attempt to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best-fitting parameter values. The method is illustrated by using data from a nuclear radiation release at Tomsk, and from a more complex simulated nuclear accident exercise.

3,745 citations
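In the notation commonly associated with this calibration framework, a field observation z_i at inputs x_i is linked to the computer code output and the unknown calibration parameters through an explicit model-inadequacy (discrepancy) term. A sketch of that decomposition, with notation assumed here rather than quoted from the paper, is:

```latex
z_i \;=\; \rho\,\eta(x_i, \theta) \;+\; \delta(x_i) \;+\; e_i ,
```

where \eta(x, \theta) is the code output, \theta the unknown calibration parameters, \delta(\cdot) the model-inadequacy function, \rho a scaling parameter, and e_i observation error; Gaussian process priors on \eta and \delta let all sources of uncertainty propagate into the predictions.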

Journal ArticleDOI
TL;DR: In this article, the authors present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis and allows effective sensitivity analysis to be achieved with far smaller numbers of model runs than standard Monte Carlo methods require.
Abstract: In many areas of science and technology, mathematical models are built to simulate complex real world phenomena. Such models are typically implemented in large computer programs and are also very complex, such that the way that the model responds to changes in its inputs is not transparent. Sensitivity analysis is concerned with understanding how changes in the model inputs influence the outputs. This may be motivated simply by a wish to understand the implications of a complex model but often arises because there is uncertainty about the true values of the inputs that should be used for a particular application. A broad range of measures have been advocated in the literature to quantify and describe the sensitivity of a model's output to variation in its inputs. In practice the most commonly used measures are those that are based on formulating uncertainty in the model inputs by a joint probability distribution and then analysing the induced uncertainty in outputs, an approach which is known as probabilistic sensitivity analysis. We present a Bayesian framework which unifies the various tools of probabilistic sensitivity analysis. The Bayesian approach is computationally highly efficient. It allows effective sensitivity analysis to be achieved by using far smaller numbers of model runs than standard Monte Carlo methods. Furthermore, all measures of interest may be computed from a single set of runs.

1,074 citations
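A minimal sketch of the emulator-based idea follows: fit a Gaussian process to a small design of model runs, then estimate main-effect sensitivity indices entirely on the cheap emulator. The toy model, input distributions, and crude Monte Carlo estimator below are assumptions; the paper's own Bayesian measures are computed from the emulator analytically rather than by brute-force sampling.

```python
# Illustrative sketch of emulator-based probabilistic sensitivity analysis.
# Toy model, input distributions, and estimator are assumptions, not the paper's method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def model(X):
    """Stand-in simulator with two inputs (hypothetical)."""
    return np.sin(X[:, 0]) + 0.3 * X[:, 1] ** 2

# Small training design: the only 'expensive' model runs we pay for.
X_train = rng.uniform(-np.pi, np.pi, size=(30, 2))
y_train = model(X_train)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0]), normalize_y=True)
gp.fit(X_train, y_train)

# Main-effect (first-order) indices Var(E[Y|X_i]) / Var(Y), estimated on the emulator.
N = 5000
X = rng.uniform(-np.pi, np.pi, size=(N, 2))
total_var = gp.predict(X).var()
grid = np.linspace(-np.pi, np.pi, 50)
for i in range(2):
    cond_means = []
    for g in grid:
        Xc = X.copy()
        Xc[:, i] = g                     # fix input i, average over the other input
        cond_means.append(gp.predict(Xc).mean())
    S_i = np.var(cond_means) / total_var
    print(f"main-effect index for input {i}: {S_i:.2f}")
```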

Journal ArticleDOI
TL;DR: This work focuses on combining observations from field experiments with detailed computer simulations of a physical process to carry out statistical inference, and makes use of basis representations to reduce the dimensionality of the problem and speed up the computations required for exploring the posterior distribution.
Abstract: This work focuses on combining observations from field experiments with detailed computer simulations of a physical process to carry out statistical inference. Of particular interest here is determining uncertainty in resulting predictions. This typically involves calibration of parameters in the computer simulator as well as accounting for inadequate physics in the simulator. The problem is complicated by the fact that simulation code is sufficiently demanding that only a limited number of simulations can be carried out. We consider applications in characterizing material properties for which the field data and the simulator output are highly multivariate. For example, the experimental data and simulation output may be an image or may describe the shape of a physical object. We make use of the basic framework of Kennedy and O'Hagan. However, the size and multivariate nature of the data lead to computational challenges in implementing the framework. To overcome these challenges, we make use of basis representations to reduce the dimensionality of the problem and speed up the computations required for exploring the posterior distribution.

838 citations
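A minimal sketch of the basis-representation idea follows, assuming a toy simulator whose output is a 200-dimensional curve: the centred output matrix is decomposed by SVD, and each retained basis coefficient is emulated by its own Gaussian process. The simulator, dimensions, and number of basis vectors are illustrative assumptions.

```python
# Illustrative sketch: reduce highly multivariate simulator output with an SVD basis,
# then emulate each basis coefficient with its own GP. All specifics are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
n_runs, n_outputs, n_inputs, n_basis = 40, 200, 3, 5

# Pretend simulator: each run maps 3 inputs to a 200-dimensional output curve.
X = rng.uniform(0.0, 1.0, size=(n_runs, n_inputs))
t = np.linspace(0.0, 1.0, n_outputs)
Y = np.sin(2 * np.pi * np.outer(X[:, 0], t)) + X[:, 1][:, None] * t + 0.1 * X[:, 2][:, None]

# Basis representation via SVD of the centred output matrix.
Y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
basis = Vt[:n_basis]                      # (n_basis, n_outputs)
coeffs = (Y - Y_mean) @ basis.T           # (n_runs, n_basis)

# One independent GP emulator per basis coefficient.
gps = []
for j in range(n_basis):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(X, coeffs[:, j])
    gps.append(gp)

# Predict the full multivariate output at a new input.
x_new = np.array([[0.2, 0.7, 0.4]])
w = np.array([gp.predict(x_new)[0] for gp in gps])
y_pred = Y_mean + w @ basis
print(y_pred.shape)  # (200,)
```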

Journal ArticleDOI
TL;DR: This article is concerned with prediction of a function y(t) over a (multidimensional) domain T, given the function values at a set of “sites” in T, and with the design, that is, with the selection of those sites.
Abstract: This article is concerned with prediction of a function y(t) over a (multidimensional) domain T, given the function values at a set of “sites” {t^(1), t^(2), …, t^(n)} in T, and with the design, that is, with the selection of those sites. The motivating application is the design and analysis of computer experiments, where t determines the input to a computer model of a physical or behavioral system, and y(t) is a response that is part of the output or is calculated from it. Following a Bayesian formulation, prior uncertainty about the function y is expressed by means of a random function Y, which is taken here to be a Gaussian stochastic process. The mean of the posterior process can be used as the prediction function ŷ(t), and the variance can be used as a measure of uncertainty. This kind of approach has been used previously in Bayesian interpolation and is strongly related to the kriging methods used in geostatistics. Here emphasis is placed on product linear and product cubic correlation functions.

789 citations
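The sketch below illustrates the Bayesian prediction idea in this abstract with a zero-mean Gaussian process prior and a separable (product) Gaussian correlation function: the posterior mean serves as the predictor ŷ(t) and the posterior variance as the uncertainty. The test function, sites, and correlation family are assumptions; the paper itself emphasises product linear and product cubic correlations.

```python
# Minimal sketch of GP prediction from function values at design sites in T = [0, 1]^2.
# Zero-mean prior with a separable Gaussian correlation; all specifics are assumptions.
import numpy as np

def product_corr(A, B, theta):
    """Separable correlation: product over dimensions of exp(-theta_k * (a_k - b_k)^2)."""
    d2 = (A[:, None, :] - B[None, :, :]) ** 2        # (n, m, d)
    return np.exp(-(d2 * theta).sum(axis=-1))        # equals the product of per-dimension exps

def y_true(T):
    """Stand-in for the function being predicted (hypothetical)."""
    return np.sin(4 * T[:, 0]) * np.cos(3 * T[:, 1])

rng = np.random.default_rng(2)

# Observed function values at a set of design sites.
T_obs = rng.uniform(0.0, 1.0, size=(25, 2))
y_obs = y_true(T_obs)

theta = np.array([10.0, 10.0])   # fixed correlation parameters (assumed, not estimated)
sigma2 = 1.0                     # process variance (assumed)
R = product_corr(T_obs, T_obs, theta) + 1e-10 * np.eye(len(T_obs))
R_inv_y = np.linalg.solve(R, y_obs)

# Posterior mean (the predictor) and posterior variance (the uncertainty) at new sites.
T_new = rng.uniform(0.0, 1.0, size=(5, 2))
r = product_corr(T_new, T_obs, theta)                # (5, 25)
mean = r @ R_inv_y
var = sigma2 * (1.0 - np.einsum('ij,ij->i', r, np.linalg.solve(R, r.T).T))
print(np.c_[mean, var])
```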


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations, 73% related
Inference: 36.8K papers, 1.3M citations, 71% related
Probabilistic logic: 56K papers, 1.3M citations, 68% related
Regression analysis: 31K papers, 1.7M citations, 68% related
Monte Carlo method: 95.9K papers, 2.1M citations, 68% related
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2021  13
2020  23
2019  5
2018  15
2017  18
2016  11