Open Access · Posted Content

Bayesian treed Gaussian process models with an application to computer modeling

TLDR
This article explores nonstationary modeling methodologies that couple stationary Gaussian processes with treed partitioning and shows that the approach is effective beyond the motivating rocket booster design application.
Abstract
Motivated by a computer experiment for the design of a rocket booster, this paper explores nonstationary modeling methodologies that couple stationary Gaussian processes with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. The methodological developments and statistical computing details which make this approach efficient are described in detail. In addition to providing an analysis of the rocket booster simulator, our approach is demonstrated to be effective in other arenas.
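The core idea, partitioning the input space with a tree and fitting a separate stationary GP in each region, can be sketched in a few lines of Python. The snippet below is a deliberately simplified illustration: it uses a single fixed split rather than the paper's Bayesian averaging over trees, and scikit-learn GPs rather than the paper's MCMC machinery. The toy test function, split point, and kernel choices are assumptions made for the example.

```python
# Simplified treed-GP sketch: one fixed split and an independent stationary GP
# per region. (The paper instead infers the tree and GP parameters jointly by MCMC.)
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 20.0, size=(200, 1))
# Nonstationary toy response: slowly varying on the left, rapidly varying on the right.
y = np.where(X[:, 0] < 10.0,
             np.sin(X[:, 0] / 3.0),
             np.sin(2.0 * X[:, 0])) + 0.05 * rng.standard_normal(200)

split = 10.0  # a single tree split on the first (and only) input
models = {}
for name, mask in [("left", X[:, 0] < split), ("right", X[:, 0] >= split)]:
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01),
        normalize_y=True)
    gp.fit(X[mask], y[mask])
    models[name] = gp

def predict(x_new):
    """Route each prediction to the GP fitted on the region containing it."""
    x_new = np.atleast_2d(x_new)
    out = np.empty(len(x_new))
    left = x_new[:, 0] < split
    if left.any():
        out[left] = models["left"].predict(x_new[left])
    if (~left).any():
        out[~left] = models["right"].predict(x_new[~left])
    return out

print(predict([[2.0], [15.0]]))  # one prediction from each region's GP
```

Because each region gets its own length-scale and noise level, the combined predictor can adapt to locally different smoothness, which is the kind of nonstationarity a single stationary GP struggles to capture.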


Citations
Journal ArticleDOI

Survey of Multifidelity Methods in Uncertainty Propagation, Inference, and Optimization

TL;DR: In many situations across computational science and engineering, multiple computational models are available that describe a system of interest, and these models differ in their evaluation costs and in their fidelity to the system.
Journal ArticleDOI

Review of surrogate modeling in water resources

TL;DR: Two broad families of surrogates are detailed in this paper: response surface surrogates, which are statistical or empirical data-driven models emulating the high-fidelity model responses, and lower-fidelity physically based surrogates, which are simplified models of the original system.
Journal ArticleDOI

Choosing the Sample Size of a Computer Experiment: A Practical Guide

TL;DR: In this paper, the authors quantify two key characteristics of computer codes that affect the sample size required for a desired level of accuracy when approximating the code via a Gaussian process (GP) and provide reasons and evidence supporting the informal rule that the number of runs for an effective initial computer experiment should be about 10 times the input dimension.
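The "about 10 times the input dimension" rule of thumb is straightforward to operationalize. Below is a minimal sketch assuming SciPy's quasi-Monte Carlo module (scipy.stats.qmc, SciPy 1.7+); the dimension and input ranges are made up for the example.

```python
# Size an initial computer experiment with the informal n = 10 * d rule and
# generate the runs with a Latin hypercube design.
from scipy.stats import qmc

d = 4                      # input dimension of the simulator (assumed for the example)
n = 10 * d                 # rule-of-thumb number of initial runs
sampler = qmc.LatinHypercube(d=d, seed=1)
design = sampler.random(n)                 # n points in the unit hypercube [0, 1]^d
lower = [0.0, 0.0, 1.0, -5.0]              # illustrative input ranges
upper = [1.0, 10.0, 2.0, 5.0]
design = qmc.scale(design, lower, upper)   # rescale to the simulator's input ranges
print(design.shape)                        # (40, 4)
```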
Journal ArticleDOI

A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions

TL;DR: This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore, and exploit unknown functions, and describes a setting of risk-averse exploration in which an additional constraint must be accounted for.
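As a rough illustration of using GP regression to balance exploration and exploitation, the sketch below applies an upper-confidence-bound rule to a toy maximization problem; the objective, kernel, and the weight on the posterior standard deviation are assumptions for the example, not choices taken from the tutorial.

```python
# GP-based sequential search: pick the next input where (posterior mean + 2 * std)
# is largest, trading off exploitation (mean) against exploration (uncertainty).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    return -np.sin(3 * x) - x**2 + 0.7 * x   # unknown function to maximize

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 2.0, size=(5, 1))      # a few initial evaluations
y = objective(X).ravel()

grid = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)
for _ in range(10):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    ucb = mu + 2.0 * sigma                   # upper confidence bound
    x_next = grid[np.argmax(ucb)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best input found:", X[np.argmax(y), 0])
```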
Journal ArticleDOI

Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.

TL;DR: A class of highly scalable nearest-neighbor Gaussian process (NNGP) models is developed to provide fully model-based inference for large geostatistical datasets, and it is established that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices.
References
BookDOI

Markov Chain Monte Carlo in Practice

TL;DR: The book covers MCMC implementation, results, and discussion; an application to medical monitoring, including modelling, computing posterior distributions, forecasting, model criticism, and an illustrative application; and MCMC for nonlinear hierarchical models.
Journal ArticleDOI

The design and analysis of computer experiments

TL;DR: This paper presents a meta-modelling framework for computer experiments, covering prediction of output from training data and criteria-based designs for computer experiments.
Journal ArticleDOI

Bayesian Calibration of Computer Models

TL;DR: A Bayesian calibration technique is presented that improves on the traditional approach in two respects and attempts to correct for any inadequacy of the model revealed by a discrepancy between the observed data and the model predictions, even at the best-fitting parameter values.
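The discrepancy idea summarized above is commonly written as a small additive model; the following is a sketch in standard notation (symbols assumed here for illustration, not quoted from the paper): field data are modelled as a scaled simulator output at the calibration parameters, plus a systematic model-inadequacy term, plus observation noise.

```latex
% Sketch of the calibration-with-discrepancy model (notation assumed):
% z(x): field observation, \eta(x, \theta): simulator run at calibration inputs \theta,
% \delta(x): model-inadequacy (discrepancy) term, \varepsilon: observation noise.
z(x) = \rho\,\eta(x, \theta) + \delta(x) + \varepsilon,
\qquad \varepsilon \sim \mathcal{N}(0, \lambda^2)
```

Estimating the discrepancy term jointly with the calibration parameters is what lets the procedure correct for model inadequacy rather than forcing the simulator alone to explain the data.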