Bayesian Treed Gaussian Process Models With an Application to Computer Modeling
TLDR
A nonstationary modeling methodology that couples stationary Gaussian processes with treed partitioning is presented, motivated by a computer experiment for the design of a rocket booster.

Abstract:
Motivated by a computer experiment for the design of a rocket booster, this article explores nonstationary modeling methodologies that couple stationary Gaussian processes with treed partitioning. Partitioning is a simple but effective method for dealing with nonstationarity. The methodological developments and statistical computing details that make this approach efficient are described in detail. In addition to providing an analysis of the rocket booster simulator, we show that our approach is effective in other arenas as well.
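To illustrate the core idea, the sketch below partitions a one-dimensional input space and fits an independent stationary GP in each region. This is a simplified illustration with a single fixed split, not the paper's Bayesian treatment, which averages over a posterior distribution of tree partitions via MCMC; all function names and parameter values here are assumptions for the example.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (stationary) covariance between 1-D input arrays."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_predict(x_train, y_train, x_test, lengthscale=1.0, noise=1e-4):
    """Posterior mean of a zero-mean GP conditioned on (x_train, y_train)."""
    K = rbf_kernel(x_train, x_train, lengthscale) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_test, x_train, lengthscale)
    return Ks @ np.linalg.solve(K, y_train)

def treed_gp_predict(x_train, y_train, x_test, split):
    """Fixed-split treed GP: fit an independent stationary GP per region,
    so the combined model can track nonstationary (e.g. discontinuous) responses.
    Assumes each region contains at least one training point."""
    pred = np.empty_like(x_test)
    for mask_tr, mask_te in [(x_train < split, x_test < split),
                             (x_train >= split, x_test >= split)]:
        pred[mask_te] = gp_predict(x_train[mask_tr], y_train[mask_tr],
                                   x_test[mask_te])
    return pred
```

Because the two regional GPs never see each other's data, a jump at the split is modeled exactly where a single stationary GP would have to smooth it over.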
Citations
Journal ArticleDOI
Survey of Multifidelity Methods in Uncertainty Propagation, Inference, and Optimization
TL;DR: In many situations across computational science and engineering, multiple computational models are available that describe a system of interest, and these models differ in evaluation cost and fidelity.
Journal ArticleDOI
Review of surrogate modeling in water resources
TL;DR: Two broad families of surrogates are detailed in this paper: response surface surrogates, which are statistical or empirical data‐driven models emulating the high‐fidelity model responses, and lower‐fidelity physically based surrogates, which are simplified models of the original system.
Journal ArticleDOI
Choosing the Sample Size of a Computer Experiment: A Practical Guide
TL;DR: In this paper, the authors quantify two key characteristics of computer codes that affect the sample size required for a desired level of accuracy when approximating the code via a Gaussian process (GP) and provide reasons and evidence supporting the informal rule that the number of runs for an effective initial computer experiment should be about 10 times the input dimension.
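The informal "10 times the input dimension" rule cited above is easy to operationalize. As a hypothetical illustration (the function name and the simple stratified Latin hypercube construction are assumptions, not from the cited paper), one can size and generate a space-filling initial design as follows:

```python
import numpy as np

def lhs_design(d, runs_per_dim=10, rng=None):
    """Latin hypercube design with n = runs_per_dim * d points in [0, 1)^d,
    following the informal n ~ 10*d rule for initial computer experiments."""
    n = runs_per_dim * d
    rng = np.random.default_rng(rng)
    # one sample per stratum in each dimension, with strata independently
    # permuted per dimension and a uniform jitter inside each stratum
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.random((n, d))) / n
```

For a 6-dimensional simulator input, this yields a 60-run design in which every one-dimensional projection is evenly stratified.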
Journal ArticleDOI
A tutorial on Gaussian process regression: Modelling, exploring, and exploiting functions
TL;DR: This tutorial introduces the reader to Gaussian process regression as an expressive tool to model, actively explore and exploit unknown functions and describes a situation modelling risk-averse exploration in which an additional constraint needs to be accounted for.
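The explore/exploit use of GP regression described in that tutorial rests on the closed-form posterior mean and variance. A minimal numpy sketch (an illustration under assumed kernel and noise settings, not the tutorial's own code) computes both and picks the next query point where posterior variance is largest, i.e. pure exploration:

```python
import numpy as np

def gp_posterior(x_tr, y_tr, x_te, ls=0.2, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)
    K = k(x_tr, x_tr) + noise * np.eye(len(x_tr))
    Ks, Kss = k(x_te, x_tr), k(x_te, x_te)
    mean = Ks @ np.linalg.solve(K, y_tr)
    var = np.diag(Kss - Ks @ np.linalg.solve(K, Ks.T))
    return mean, var

# active exploration: query wherever the posterior variance is largest
x_tr = np.array([0.1, 0.5, 0.9])
y_tr = np.sin(2 * np.pi * x_tr)
x_grid = np.linspace(0, 1, 101)
mean, var = gp_posterior(x_tr, y_tr, x_grid)
next_x = x_grid[np.argmax(var)]
```

Risk-averse variants of the kind the tutorial discusses replace the acquisition rule (the `argmax` of variance) with one that penalizes uncertain regions instead of seeking them.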
Journal ArticleDOI
Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.
TL;DR: A class of highly scalable nearest-neighbor Gaussian process (NNGP) models to provide fully model-based inference for large geostatistical datasets are developed and it is established that the NNGP is a well-defined spatial process providing legitimate finite-dimensional Gaussian densities with sparse precision matrices.
References
Book
Introduction to Algorithms
TL;DR: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures and presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers.
Book
Classification and regression trees
TL;DR: This monograph focuses on the methodology used to construct tree-structured rules, covering the use of trees as a data analysis method and, in a more mathematical framework, proving some of their fundamental properties.
BookDOI
Markov Chain Monte Carlo in Practice
TL;DR: Covers Markov chain Monte Carlo implementation in practice, including a medical monitoring case study (modelling, computing posterior distributions, forecasting, and model criticism, with an illustrative application) and MCMC for nonlinear hierarchical models.
Journal ArticleDOI
The design and analysis of computer experiments
TL;DR: This paper presents a framework for the design and analysis of computer experiments, covering the prediction of simulator output from training data and criteria-based experimental designs.
Journal ArticleDOI
Bayesian calibration of computer models
Marc C. Kennedy, Anthony O'Hagan
TL;DR: A Bayesian calibration technique which improves on this traditional approach in two respects and attempts to correct for any inadequacy of the model which is revealed by a discrepancy between the observed data and the model predictions from even the best‐fitting parameter values is presented.