Open Access Book

Uncertainty Quantification: Theory, Implementation, and Applications

TLDR
Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines.
Abstract
The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers can find data used in the exercises and other supplementary material.

The book includes a large number of definitions and examples that use a suite of relatively simple models to illustrate concepts; numerous references to current and open research issues; and exercises that illustrate basic concepts and guide readers through the numerical implementation of algorithms for prototypical problems. It also features a wide range of applications, including weather and climate models, subsurface hydrology and geology models, nuclear power plant design, and models for biological phenomena, along with recent advances and topics that have appeared in the research literature within the last 15 years, including aspects of Bayesian model calibration, surrogate model development, parameter selection techniques, and global sensitivity analysis.

Audience: The text is intended for advanced undergraduates, graduate students, and researchers in mathematics, statistics, operations research, computer science, biology, science, and engineering. It can be used as a textbook for one- or two-semester courses on uncertainty quantification or as a resource for researchers in a wide array of disciplines. A basic knowledge of probability, linear algebra, ordinary and partial differential equations, and introductory numerical analysis techniques is assumed.

Contents:
Chapter 1: Introduction
Chapter 2: Large-Scale Applications
Chapter 3: Prototypical Models
Chapter 4: Fundamentals of Probability, Random Processes, and Statistics
Chapter 5: Representation of Random Inputs
Chapter 6: Parameter Selection Techniques
Chapter 7: Frequentist Techniques for Parameter Estimation
Chapter 8: Bayesian Techniques for Parameter Estimation
Chapter 9: Uncertainty Propagation in Models
Chapter 10: Stochastic Spectral Methods
Chapter 11: Sparse Grid Quadrature and Interpolation Techniques
Chapter 12: Prediction in the Presence of Model Discrepancy
Chapter 13: Surrogate Models
Chapter 14: Local Sensitivity Analysis
Chapter 15: Global Sensitivity Analysis
Appendix A: Concepts from Functional Analysis
Bibliography
Index
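Among the topics listed above, uncertainty propagation is the simplest to illustrate concretely. The sketch below is not taken from the book; the model and parameter distributions are illustrative assumptions. It shows the basic Monte Carlo approach: sample the uncertain inputs, push each sample through the simulation model, and summarize the resulting response distribution.

```python
# Minimal sketch (not from the book): Monte Carlo propagation of input
# uncertainty through a simple simulation model. The model and parameter
# distributions here are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def model(k, c, t):
    """Damped-oscillator-like response; stands in for a simulation model."""
    return np.exp(-c * t) * np.cos(np.sqrt(k) * t)

# Hypothetical distributions for uncertain stiffness k and damping c.
n_samples = 10_000
k = rng.normal(loc=20.0, scale=1.0, size=n_samples)   # stiffness samples
c = rng.uniform(low=0.2, high=0.4, size=n_samples)    # damping samples
t = 2.0                                               # fixed observation time

# Propagate the input samples through the model and summarize the response.
y = model(k, c, t)
print(f"mean response: {y.mean():.4f}")
print(f"response std:  {y.std(ddof=1):.4f}")
print(f"95% interval:  [{np.percentile(y, 2.5):.4f}, {np.percentile(y, 97.5):.4f}]")
```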


Citations
Journal Article

A Non-Stationary 1981–2012 AVHRR NDVI3g Time Series

TL;DR: The NDVI3g time series is an improved 8 km normalized difference vegetation index (NDVI) data set produced from Advanced Very High Resolution Radiometer (AVHRR) instruments, extending from 1981 to the present.
Journal Article

Digital Twin: Values, Challenges and Enablers From a Modeling Perspective

TL;DR: This work reviews the current state of methodologies and techniques for constructing digital twins, mostly from a modeling perspective, and provides detailed coverage of the main challenges and enabling technologies along with recommendations and reflections for various stakeholders.
Journal Article

A paradigm for data-driven predictive modeling using field inversion and machine learning

TL;DR: In contrast to inferring model parameters, this work uses inverse modeling to obtain corrective, spatially distributed functional terms, offering a route to directly address model-form errors.
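As a rough illustration of the inversion step described in this TL;DR, the sketch below infers a spatially distributed corrective factor for a baseline model from noisy data using Tikhonov-style regularization. The baseline model, synthetic data, and regularization weight are illustrative assumptions, and the machine-learning step that generalizes the inferred correction to new conditions is omitted.

```python
# Minimal sketch (illustrative assumptions only) of the inversion step in a
# field-inversion workflow: infer a spatially distributed corrective term
# beta(x) that multiplies a baseline model so predictions match data, with
# regularization pulling beta toward the uncorrected value 1.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)

def baseline_model(x):
    """Hypothetical low-fidelity model of some field quantity."""
    return 1.0 + 0.5 * x

# Synthetic "truth" and noisy observations (stand-ins for experimental data).
beta_true = 1.0 + 0.3 * np.sin(2 * np.pi * x)
data = beta_true * baseline_model(x) + rng.normal(scale=0.02, size=x.size)

# Pointwise regularized least squares for beta:
#   minimize (beta_i * g_i - d_i)^2 + lam * (beta_i - 1)^2
g = baseline_model(x)
lam = 0.05
beta_hat = (g * data + lam) / (g**2 + lam)

print("max abs error in recovered correction:",
      np.max(np.abs(beta_hat - beta_true)).round(3))
```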
Journal Article

Deep UQ: Learning deep neural network surrogate models for high dimensional uncertainty quantification

TL;DR: Deep neural networks (DNNs) are used to construct surrogate models for numerical simulators in a manner that lets the DNN surrogate be interpreted as recovering a low-dimensional nonlinear manifold.
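The sketch below gives a rough sense of the surrogate idea in this TL;DR: fit a neural network to a limited budget of simulator runs, then use the cheap surrogate for Monte Carlo uncertainty propagation. A single-hidden-layer NumPy network stands in for the deep architectures of the paper; the simulator, sample sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions only): train a small neural-network
# surrogate of an "expensive" simulator on a modest set of runs, then use the
# cheap surrogate for Monte Carlo uncertainty propagation.
import numpy as np

rng = np.random.default_rng(2)

def simulator(x):
    """Stand-in for an expensive model with a 5-dimensional input."""
    return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2:].sum(axis=1)

# Training data: a limited budget of simulator runs.
X = rng.uniform(-1.0, 1.0, size=(200, 5))
y = simulator(X)
n = len(y)

# One-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(5, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)              # hidden activations
    pred = (H @ W2 + b2).ravel()          # surrogate prediction
    err = pred - y
    # Backpropagation for a squared-error loss (constant factors absorbed
    # into the learning rate).
    dW2 = H.T @ err[:, None] / n; db2 = err.mean(keepdims=True)
    dH = err[:, None] @ W2.T * (1 - H ** 2)
    dW1 = X.T @ dH / n; db1 = dH.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

# Cheap uncertainty propagation using the trained surrogate.
X_mc = rng.uniform(-1.0, 1.0, size=(100_000, 5))
y_mc = (np.tanh(X_mc @ W1 + b1) @ W2 + b2).ravel()
print(f"surrogate mean {y_mc.mean():.3f}, std {y_mc.std():.3f}")
```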
Book

Active Subspaces: Emerging Ideas for Dimension Reduction in Parameter Studies

TL;DR: Active subspaces are an emerging set of dimension-reduction tools that identify important directions in the parameter space; they enable parameter studies when the model is expensive to evaluate and has many inputs.
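The basic active-subspace construction can be sketched in a few lines: estimate the matrix C = E[∇f ∇fᵀ] from gradient samples, eigendecompose it, and take the leading eigenvectors as the important directions. The test function and sampling choices below are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions only) of the basic active-subspace
# construction: estimate C = E[grad f grad f^T] from gradient samples,
# eigendecompose it, and read off the leading directions.
import numpy as np

rng = np.random.default_rng(3)

def grad_f(x):
    """Gradient of a hypothetical model f(x) = exp(0.7*x0 + 0.3*x1)."""
    return np.exp(0.7 * x[0] + 0.3 * x[1]) * np.array([0.7, 0.3, 0.0, 0.0])

# Sample the 4-dimensional input space and average the gradient outer products.
M, dim = 500, 4
C = np.zeros((dim, dim))
for _ in range(M):
    g = grad_f(rng.uniform(-1.0, 1.0, size=dim))
    C += np.outer(g, g) / M

# The eigenvalue decay reveals how many directions the model responds to;
# the leading eigenvectors span the active subspace.
eigvals, eigvecs = np.linalg.eigh(C)
idx = np.argsort(eigvals)[::-1]
print("eigenvalues:", np.round(eigvals[idx], 4))
print("leading active direction:", np.round(eigvecs[:, idx[0]], 3))
```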