
Showing papers on "Uncertainty quantification published in 2006"


Journal ArticleDOI
TL;DR: A new validation metric is developed that is based on the statistical concept of confidence intervals, and two specific metrics are constructed: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data.

424 citations


Journal ArticleDOI
TL;DR: These procedures are applied to select and adjust ground-motion models for the analysis of seismic hazard at rock sites in West Central Europe, chosen for illustrative purposes particularly because it highlights the issue of using ground-motion models derived from small magnitude earthquakes in the analysis of hazard due to much larger events.
Abstract: A vital component of any seismic hazard analysis is a model for predicting the expected distribution of ground motions at a site due to possible earthquake scenarios. The limited nature of the datasets from which such models are derived gives rise to epistemic uncertainty in both the median estimates and the associated aleatory variability of these predictive equations. In order to capture this epistemic uncertainty in a seismic hazard analysis, more than one ground-motion prediction equation must be used, and the tool that is currently employed to combine multiple models is the logic tree. Candidate ground-motion models for a logic tree should be selected in order to obtain the smallest possible suite of equations that can capture the expected range of possible ground motions in the target region. This is achieved by starting from a comprehensive list of available equations and then applying criteria for rejecting those considered inappropriate in terms of quality, derivation or applicability. Once the final list of candidate models is established, adjustments must be applied to achieve parameter compatibility. Additional adjustments can also be applied to remove the effect of systematic differences between host and target regions. These procedures are applied to select and adjust ground-motion models for the analysis of seismic hazard at rock sites in West Central Europe. This region is chosen for illustrative purposes particularly because it highlights the issue of using ground-motion models derived from small magnitude earthquakes in the analysis of hazard due to much larger events. Some of the pitfalls of extrapolating ground-motion models from small to large magnitude earthquakes in low seismicity regions are discussed for the selected target region.
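
The logic-tree mechanics described above can be sketched in a few lines. The hedged example below combines three hypothetical ground-motion prediction equations (the functional forms, coefficients, and branch weights are invented for illustration, not taken from the paper) into a weighted central estimate, with the branch-to-branch spread as a rough indicator of epistemic uncertainty.

```python
import numpy as np

# Hypothetical ground-motion prediction equations (GMPEs): each returns the
# natural log of the median peak ground acceleration (g) for magnitude M and
# distance R (km). Coefficients are invented purely for illustration.
def gmpe_a(M, R):
    return -3.5 + 1.0 * M - 1.2 * np.log(R + 10.0)

def gmpe_b(M, R):
    return -3.2 + 0.9 * M - 1.1 * np.log(R + 8.0)

def gmpe_c(M, R):
    return -4.0 + 1.1 * M - 1.3 * np.log(R + 12.0)

# Logic-tree branches: candidate models with subjective weights summing to 1.
branches = [(gmpe_a, 0.5), (gmpe_b, 0.3), (gmpe_c, 0.2)]

M, R = 6.5, 30.0
ln_medians = np.array([g(M, R) for g, _ in branches])
weights = np.array([w for _, w in branches])

# Weighted mean over branches gives the central estimate; the spread of the
# branch predictions is a crude indicator of the epistemic uncertainty.
mean_ln = np.sum(weights * ln_medians)
spread = ln_medians.max() - ln_medians.min()
print(f"weighted mean ln(PGA): {mean_ln:.3f} (median PGA ~ {np.exp(mean_ln):.4f} g)")
print(f"branch-to-branch spread in ln units: {spread:.3f}")
```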

346 citations


Journal ArticleDOI
TL;DR: This paper reviews recent applications of polynomial chaos (PC) methods for uncertainty representation and propagation in CFD computations, focusing exclusively on applications involving the unreduced Navier–Stokes equations.

257 citations


ReportDOI
01 Oct 2006
TL;DR: This report serves as a user’s manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
Abstract: The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user’s manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
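
The sampling-based UQ workflow that Dakota automates can be mimicked in a few lines of Python. The sketch below is not Dakota input syntax; it propagates an assumed stand-in model through a Latin hypercube sample using SciPy and reports the kind of output statistics such a study produces.

```python
import numpy as np
from scipy.stats import qmc

# Stand-in for an expensive simulation code: a simple nonlinear response.
def simulation(x1, x2):
    return x1 ** 2 + 3.0 * np.sin(x2)

# Latin hypercube sample of the two uncertain inputs over assumed bounds.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit_samples = sampler.random(n=200)
lower, upper = [0.0, -1.0], [2.0, 1.0]
samples = qmc.scale(unit_samples, lower, upper)

outputs = simulation(samples[:, 0], samples[:, 1])

# Output statistics of the kind a sampling-based UQ study reports.
print(f"mean     = {outputs.mean():.4f}")
print(f"std dev  = {outputs.std(ddof=1):.4f}")
print(f"95th pct = {np.percentile(outputs, 95):.4f}")
```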

230 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed an end-to-end uncertainty analysis framework that can quantify satellite-based precipitation estimation error characteristics and assess the influence of the error propagation into hydrological simulation.
Abstract: The aim of this paper is to foster the development of an end-to-end uncertainty analysis framework that can quantify satellite-based precipitation estimation error characteristics and to assess the influence of the error propagation into hydrological simulation. First, the error associated with the satellite-based precipitation estimates is assumed as a nonlinear function of rainfall space-time integration scale, rain intensity, and sampling frequency. Parameters of this function are determined by using high-resolution satellite-based precipitation estimates and gauge-corrected radar rainfall data over the southwestern United States. Parameter sensitivity analysis at 16 selected 5° × 5° latitude-longitude grids shows about 12–16% of variance of each parameter with respect to its mean value. Afterward, the influence of precipitation estimation error on the uncertainty of hydrological response is further examined with Monte Carlo simulation. By this approach, 100 ensemble members of precipitation data are generated, as forcing input to a conceptual rainfall-runoff hydrologic model, and the resulting uncertainty in the streamflow prediction is quantified. Case studies are demonstrated over the Leaf River basin in Mississippi. Compared with conventional procedure, i.e., precipitation estimation error as fixed ratio of rain rates, the proposed framework provides more realistic quantification of precipitation estimation error and offers improved uncertainty assessment of the error propagation into hydrologic simulation. Further study shows that the radar rainfall-generated streamflow sequences are consistently contained by the uncertainty bound of satellite rainfall generated streamflow at the 95% confidence interval.
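
A hedged sketch of the ensemble-generation step: a nominal rainfall series is perturbed with a multiplicative error whose spread decreases with rain intensity (a stand-in for the paper's calibrated nonlinear error function), and each member is routed through a toy linear-reservoir runoff model. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Nominal "satellite" rainfall series (mm/h) over 48 hours.
rain = rng.gamma(shape=0.6, scale=4.0, size=48)

def error_std(intensity):
    # Stand-in for the calibrated error function: relative error shrinks
    # with rain intensity; the constants are purely illustrative.
    return 0.5 / np.sqrt(1.0 + intensity)

def linear_reservoir(p, k=0.3):
    # Toy rainfall-runoff model: a single store drains at rate k per step.
    s, q = 0.0, []
    for pt in p:
        s += pt
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

# Generate 100 perturbed precipitation members and route each to streamflow.
members = []
for _ in range(100):
    noise = rng.normal(0.0, error_std(rain))
    perturbed = np.clip(rain * (1.0 + noise), 0.0, None)
    members.append(linear_reservoir(perturbed))
flows = np.array(members)

# 95% uncertainty bound on the simulated peak flow.
peaks = flows.max(axis=1)
lo, hi = np.percentile(peaks, [2.5, 97.5])
print(f"peak flow 95% bound: [{lo:.2f}, {hi:.2f}] (nominal {linear_reservoir(rain).max():.2f})")
```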

208 citations


Journal ArticleDOI
TL;DR: A formulation is presented for the impact of data limitations associated with the calibration of model parameters on overall predictive accuracy, and a new method is obtained for the characterization of stochastic processes from corresponding experimental observations.

202 citations


Journal ArticleDOI
TL;DR: Capabilities of the ESSE system are illustrated in three data-assimilative applications: estimation of uncertainties for physical-biogeochemical fields, transfers of ocean physics uncertainties to acoustics, and real-time stochastic ensemble predictions with assimilation of a wide range of data types.

176 citations


Journal ArticleDOI
TL;DR: Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.

166 citations


Journal ArticleDOI
TL;DR: In this article, the collective experience of scientists and engineers in the assessment of uncertainty associated with TMDL models is described, and the collective study concludes that a more scientific method to account for uncertainty would be to develop uncertainty probability distribution functions and transfer such uncertainties to TMDL load allocation through the margin of safety component.
Abstract: Although the U.S. Congress established the Total Maximum Daily Load (TMDL) program in the original Clean Water Act of 1972, Section 303(d), it did not receive attention until the 1990s. Currently, two methods are available for tracking pollution in the environment and assessing the effectiveness of the TMDL process on improving the quality of impaired water bodies: field monitoring and mathematical/computer modeling. Field monitoring may be the most appropriate method, but its use is limited due to high costs and extreme spatial and temporal ecosystem variability. Mathematical models provide an alternative to field monitoring that can potentially save time, reduce cost, and minimize the need for testing management alternatives. However, the uncertainty of the model results is a major concern. Uncertainty is defined as the estimated amount by which an observed or calculated value may depart from the true value, and it has important policy, regulatory, and management implications. The source and magnitude of uncertainty and its impact on TMDL assessment has not been studied in depth. This article describes the collective experience of scientists and engineers in the assessment of uncertainty associated with TMDL models. It reviews sources of uncertainty (e.g., input variability, model algorithms, model calibration data, and scale), methods of uncertainty evaluation (e.g., first-order approximation, mean value first-order reliability method, Monte Carlo, Latin hypercube sampling with constrained Monte Carlo, and generalized likelihood uncertainty estimation), and strategies for communicating uncertainty in TMDL models to users. Four case studies are presented to highlight uncertainty quantification in TMDL models. Results indicate that uncertainty in TMDL models is a real issue and should be taken into consideration not only during the TMDL assessment phase, but also in the design of BMPs during the TMDL implementation phase. First-order error (FOE) analysis and Monte Carlo simulation (MCS) or any modified versions of these two basic methods may be used to assess uncertainty. This collective study concludes that a more scientific method to account for uncertainty would be to develop uncertainty probability distribution functions and transfer such uncertainties to TMDL load allocation through the margin of safety component, which is selected arbitrarily at the present time. It is proposed that explicit quantification of uncertainty be made an integral part of the TMDL process. This will benefit private industry, the scientific community, regulatory agencies, and action agencies involved with TMDL development and implementation.
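
To make the FOE-versus-Monte-Carlo comparison concrete, the sketch below propagates uncertainty through a simple daily-load relation L = c·q·f (concentration times flow times a unit factor). The distributions are assumed normal and independent and are not drawn from the paper's case studies.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simple daily load model: L = c * q * f, with c = concentration (mg/L),
# q = flow (m^3/s), f = unit-conversion factor. Values are illustrative.
f = 86.4  # converts mg/L * m^3/s to kg/day
c_mean, c_std = 2.0, 0.4
q_mean, q_std = 5.0, 1.0

# First-order error (FOE) analysis: linearize L about the input means.
dL_dc = q_mean * f
dL_dq = c_mean * f
L_mean_foe = c_mean * q_mean * f
L_std_foe = np.sqrt((dL_dc * c_std) ** 2 + (dL_dq * q_std) ** 2)

# Monte Carlo simulation with the same (assumed independent, normal) inputs.
c = rng.normal(c_mean, c_std, 100_000)
q = rng.normal(q_mean, q_std, 100_000)
L_mc = c * q * f

print(f"FOE : mean = {L_mean_foe:.1f} kg/d, std = {L_std_foe:.1f} kg/d")
print(f"MC  : mean = {L_mc.mean():.1f} kg/d, std = {L_mc.std(ddof=1):.1f} kg/d")
```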

143 citations


Journal ArticleDOI
TL;DR: A probabilistic approach based on high-order accurate expansions of general stochastic processes is explored, enabling the computation of global sensitivities of measures of interest, e.g., radar-cross-sections in scattering applications, for a variety of types of uncertainties.
Abstract: We discuss computationally efficient ways of accounting for the impact of uncertainty, e.g., lack of detailed knowledge about sources, materials, shapes, etc., in computational time-domain electromagnetics. In contrast to classic statistical Monte Carlo--based methods, we explore a probabilistic approach based on high-order accurate expansions of general stochastic processes. We show this to be highly efficient and accurate on both one- and two-dimensional examples, enabling the computation of global sensitivities of measures of interest, e.g., radar-cross-sections (RCS) in scattering applications, for a variety of types of uncertainties.
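
A minimal non-intrusive sketch of the polynomial chaos idea (not the authors' time-domain electromagnetic solver): a scalar output depending on one standard-normal parameter is expanded in probabilists' Hermite polynomials, with coefficients computed by Gauss quadrature and the mean and variance read off the expansion. The model function, expansion order, and quadrature level are all arbitrary choices for illustration.

```python
import math

import numpy as np
from numpy.polynomial import hermite_e as He

# Model output as a function of one standard-normal uncertain parameter xi;
# the functional form below is a placeholder for an expensive simulation.
def model(xi):
    return np.exp(0.3 * xi) + 0.5 * xi ** 2

order = 6
nodes, weights = He.hermegauss(30)      # Gauss quadrature for weight exp(-x^2/2)
norm = np.sqrt(2.0 * np.pi)             # so E[g(xi)] = sum(w * g(nodes)) / norm

# Non-intrusive projection: a_k = E[model(xi) * He_k(xi)] / k!
coeffs = []
for k in range(order + 1):
    He_k = He.hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * model(nodes) * He_k) / norm / math.factorial(k))

# Mean is the zeroth coefficient; variance is sum over k >= 1 of k! * a_k^2.
mean = coeffs[0]
var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(f"PC  : mean = {mean:.5f}, std = {np.sqrt(var):.5f}")

# Cross-check with plain Monte Carlo sampling.
xi = np.random.default_rng(0).standard_normal(200_000)
print(f"MC  : mean = {model(xi).mean():.5f}, std = {model(xi).std():.5f}")
```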

132 citations


Journal ArticleDOI
TL;DR: Different computational methodologies have been developed to quantify the uncertain response of a relatively simple aeroelastic system in limit-cycle oscillation, subject to parametric variability, and are compared in terms of computational cost, convergence properties, ease of implementation, and potential for application to complex aeroelastic systems.

Journal ArticleDOI
TL;DR: This paper employs both a generalized polynomial chaos and Monte Carlo simulations to solve the transformed stochastic problem of transport of a passive scalar in Stokes' flow and to quantify the corresponding predictive uncertainty.

Journal ArticleDOI
TL;DR: A Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed data is examined.

Journal ArticleDOI
TL;DR: In this paper, the authors assess the sensitivity of free surface velocity to variations in the uncertain inputs, constrain the values of these inputs to be consistent with experiment, and predict free surface velocity based on the constrained inputs.
Abstract: A flyer plate experiment involves forcing a plane shock wave through stationary test samples of material and measuring the free surface velocity of the target as a function of time. These experiments are conducted to learn about the behavior of materials subjected to high strain rate environments. Computer simulations of flyer plate experiments are conducted with a two-dimensional hydrodynamic code developed under the Advanced Strategic Computing (ASC) program at Los Alamos National Laboratory. This code incorporates physical models that contain parameters having uncertain values. The objectives of the analyses presented in this paper are to assess the sensitivity of free surface velocity to variations in the uncertain inputs, to constrain the values of these inputs to be consistent with experiment, and to predict free surface velocity based on the constrained inputs. We implement a Bayesian approach that combines detailed physics simulations with experimental data for the desired statistical inference (Kennedy and O'Hagan 2001; Higdon, Kennedy, Cavendish, Cafeo, and Ryne 2004). The approach given here allows for: uncertainty regarding model inputs (i.e., calibration); accounting for uncertainty due to limitations on the number of simulations that can be carried out; discrepancy between the simulation code and the actual physical system; and uncertainty in the observation process that yields the actual field data on the true physical system.
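
A heavily simplified sketch of the calibration step, in the spirit of Kennedy and O'Hagan but without the emulator and discrepancy terms used in the paper: a random-walk Metropolis sampler constrains one uncertain simulator parameter against noisy synthetic observations. The simulator, prior, and noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Cheap stand-in simulator: free-surface velocity vs time for parameter theta.
times = np.linspace(0.0, 1.0, 20)
def simulator(theta):
    return theta * np.exp(-2.0 * times) + 1.0

# Synthetic "experimental" data generated with a known true parameter.
theta_true, sigma_obs = 1.8, 0.05
data = simulator(theta_true) + rng.normal(0.0, sigma_obs, times.size)

def log_posterior(theta):
    if not (0.0 < theta < 5.0):           # uniform prior on (0, 5)
        return -np.inf
    resid = data - simulator(theta)
    return -0.5 * np.sum((resid / sigma_obs) ** 2)

# Random-walk Metropolis sampler.
n_steps, step = 20_000, 0.1
chain = np.empty(n_steps)
theta, logp = 1.0, log_posterior(1.0)
for i in range(n_steps):
    prop = theta + rng.normal(0.0, step)
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    chain[i] = theta

posterior = chain[5_000:]                  # discard burn-in
print(f"posterior mean = {posterior.mean():.3f}, 95% CI = "
      f"[{np.percentile(posterior, 2.5):.3f}, {np.percentile(posterior, 97.5):.3f}]")
```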

Journal ArticleDOI
TL;DR: In this paper, the experimental identification and validation of a non-parametric probabilistic approach, which allows model uncertainties and data uncertainties to be taken into account in numerical models used to predict the low- and medium-frequency dynamics of structures, is performed for a composite sandwich panel representing a complex dynamical system.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a simple approach to augment the Flash Flood Guidance System (FFGS) with uncertainty propagation components to enable informed decision making by those responsible for operation and management of natural hazard protection systems.
Abstract: Quantifying uncertainty associated with flash flood warning or forecast systems is required to enable informed decision making by those responsible for operation and management of natural hazard protection systems. The current system used by the U.S. National Weather Service (NWS) to issue flash-flood warnings and watches over the United States is a purely deterministic system. The authors propose a simple approach to augment the Flash Flood Guidance System (FFGS) with uncertainty propagation components. The authors briefly discuss the main components of the system, propose changes to improve it, and allow accounting for several sources of uncertainty. They illustrate their discussion with examples of uncertainty quantification procedures for several small basins of the Illinois River basin in Oklahoma. As the current FFGS is tightly coupled with two technologies, that is, threshold-runoff mapping and the Sacramento Soil Moisture Accounting Hydrologic Model, the authors discuss both as sources of...
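
One way to read the proposed augmentation is as replacing a deterministic exceed/no-exceed comparison with a probability. The sketch below does this with an assumed lognormal rainfall forecast and a normally distributed guidance threshold; both distributions are illustrative and are not the actual FFGS components.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Forecast 3-h rainfall (mm): lognormal spread about the deterministic value.
rain = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=n)

# Flash flood guidance threshold (mm): uncertain because of threshold-runoff
# mapping and soil-moisture model error; treated here as normally distributed.
guidance = rng.normal(loc=45.0, scale=6.0, size=n)

# Probabilistic statement replacing the deterministic exceed / not-exceed call.
p_exceed = np.mean(rain > guidance)
print(f"probability that rainfall exceeds guidance: {p_exceed:.2f}")
```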

Proceedings ArticleDOI
01 Jan 2006
TL;DR: In this article, the authors describe the collective experience of scientists and engineers in the assessment of uncertainty associated with TMDL models, and highlight the importance of uncertainty quantification in TMDL models.
Abstract: Although the U.S. Congress established the Total Maximum Daily Load (TMDL) program in the original Clean Water Act of 1972, Section 303(d), it did not receive attention until the 1990s. Currently, two methods are available for tracking pollution in the environment and assessing the effectiveness of the TMDL process on improving the quality of impaired water bodies: field monitoring and mathematical/computer modeling. Field monitoring may be the most appropriate method, but its use is limited due to high cost and extreme spatial and temporal ecosystem variability. Mathematical models provide an alternative to field monitoring that can potentially save time, reduce cost, and minimize the need for testing management alternatives. However, the uncertainty of the model results is a major concern. The issue of uncertainty has important policy, regulatory, and management implications, but understanding the source and magnitude of uncertainty and its impact on TMDL assessment has not been studied in depth. This paper describes collective experience of scientists/engineers in the assessment of uncertainty associated with TMDL models. It reviews sources of uncertainty (e.g., input variability, model algorithms, model calibration data, and scale), methods of uncertainty evaluation (e.g., First Order Approximation, Mean Value First Order Reliability Method, Monte Carlo, Latin Hypercube Sampling with Constrained Monte Carlo, and Generalized Likelihood Uncertainty Estimation), and strategy for communicating uncertainty in TMDL models to users. Four case studies are presented to highlight uncertainty quantification in TMDL models. Results indicate that uncertainty in TMDL models is a real issue and should be taken into consideration not only during the TMDL assessment phase, but also in the design of BMPs during the TMDL implementation phase. First Order Error (FOE) analysis and Monte Carlo Simulation (MCS) or any modified versions of these two basic methods may be used to assess uncertainty. This collective study concludes that the best method to account for uncertainty would be to develop uncertainty probability distribution functions and transfer such uncertainties to TMDL load allocation through the margin of safety component, which is selected arbitrarily at the present time. It is proposed that explicit quantification of uncertainty be made an integral part of the TMDL process. This will benefit private industry, scientific community, regulatory agencies, and action agencies involved with TMDL development and implementation.

Journal ArticleDOI
TL;DR: In this paper, the authors describe an initial investigation into response surface based uncertainty quantification using both kriging and multivariate adaptive regression spline surface approximation methods, and the impact of two different data sampling methods, Latin hypercube sampling and orthogonal array sampling, is also examined.
Abstract: Conventional sampling-based uncertainty quantification (UQ) methods involve generating large numbers of random samples on input variables and calculating output statistics by evaluating the computational model for each set of samples. For real world applications, this method can be computationally prohibitive due to the cost of the model and the time required for each simulation run. Using response surface approximations may allow for the output statistics to be estimated more accurately when only a limited number of simulation runs are available. This paper describes an initial investigation into response surface based UQ using both kriging and multivariate adaptive regression spline surface approximation methods. In addition, the impact of two different data sampling methods, Latin hypercube sampling and orthogonal array sampling, is also examined. The data obtained from this study indicate that caution should be exercised when implementing response surface based methods for UQ using very low sample sizes...
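
An illustrative sketch of surrogate-based UQ with a small training budget: a Gaussian-process (kriging) surrogate is fit to a handful of Latin hypercube runs of a stand-in model, and output statistics are then estimated by sampling the cheap surrogate. The model, sample sizes, and kernel settings are invented for demonstration.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    # Stand-in for a costly simulation with two inputs on [0, 1]^2.
    return np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2

# Small Latin hypercube training design (the "limited simulation budget").
train_x = qmc.LatinHypercube(d=2, seed=0).random(16)
train_y = expensive_model(train_x)

# Kriging surrogate.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(train_x, train_y)

# Estimate output statistics by sampling the cheap surrogate densely.
test_x = np.random.default_rng(1).uniform(size=(50_000, 2))
pred = gp.predict(test_x)
true = expensive_model(test_x)
print(f"surrogate mean/std: {pred.mean():.3f} / {pred.std():.3f}")
print(f"true      mean/std: {true.mean():.3f} / {true.std():.3f}")
```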

Journal ArticleDOI
TL;DR: The main goal of this paper is to design an efficient sampling technique for dynamic data integration using the Langevin algorithms based on a coarse-scale model of the problem, and it is proved that the modified Markov chain converges to the correct posterior distribution.

Journal ArticleDOI
TL;DR: In this article, both parametric and non-parametric probabilistic approaches are used on a complex system of aerospace engineering constituted of a satellite coupled with its launch vehicle to quantify the sensitivity of the structure to data uncertainties as well as model uncertainties.

Journal ArticleDOI
TL;DR: Sensitivity analysis for quantified uncertainty in evidence theory is developed in this article, where interval information is assumed for the best representation of imprecise information, and the sensitivity analysis of plausibility is analytically derived with respect to expert opinions and structural parameters.
Abstract: Sensitivity analysis for the quantified uncertainty in evidence theory is developed. In reliability quantification, classical probabilistic analysis has been a popular approach in many engineering disciplines. However, when we cannot obtain sufficient data to construct probability distributions in a large-complex system, the classical probability methodology may not be appropriate to quantify the uncertainty. Evidence theory, also called Dempster–Shafer Theory, has the potential to quantify aleatory (random) and epistemic (subjective) uncertainties because it can directly handle insufficient data and incomplete knowledge situations. In this paper, interval information is assumed for the best representation of imprecise information, and the sensitivity analysis of plausibility in evidence theory is analytically derived with respect to expert opinions and structural parameters. The results from the sensitivity analysis are expected to be very useful in finding the major contributors for quantified uncertainty and also in redesigning the structural system for risk minimization.
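
A minimal numerical sketch of the quantities involved (the paper derives the sensitivities analytically): given interval focal elements with expert-assigned basic probability assignments, compute belief and plausibility of a failure event and a finite-difference sensitivity of plausibility with respect to each mass. The intervals, masses, and limit state are invented.

```python
import numpy as np

# Interval focal elements for an uncertain input x with expert-assigned
# basic probability assignments (BPA); values are invented for illustration.
focal = [((0.0, 2.0), 0.40), ((1.5, 3.0), 0.35), ((2.5, 4.0), 0.25)]

def response(x):
    return 2.0 * x + 1.0            # monotone stand-in limit-state function

y_crit = 5.5                        # "failure" when the response exceeds this

def bel_pl(elements):
    bel = pl = 0.0
    for (lo, hi), m in elements:
        y_lo, y_hi = response(lo), response(hi)   # extremes on the interval
        if min(y_lo, y_hi) > y_crit:              # interval wholly in failure set
            bel += m
        if max(y_lo, y_hi) > y_crit:              # interval touches failure set
            pl += m
    return bel, pl

bel, pl = bel_pl(focal)
print(f"Bel(failure) = {bel:.3f}, Pl(failure) = {pl:.3f}")

# Finite-difference sensitivity of plausibility to each BPA value
# (masses are renormalized after each perturbation).
eps = 1e-3
for i in range(len(focal)):
    perturbed = [(iv, m + eps if j == i else m) for j, (iv, m) in enumerate(focal)]
    total = sum(m for _, m in perturbed)
    perturbed = [(iv, m / total) for iv, m in perturbed]
    _, pl_i = bel_pl(perturbed)
    print(f"dPl/dm_{i} ~ {(pl_i - pl) / eps:+.3f}")
```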

ReportDOI
01 Jun 2006
TL;DR: In this paper, a case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs.
Abstract: Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
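
A hedged sketch of the pinching idea using plain Monte Carlo intervals rather than Dempster-Shafer structures or probability boxes: the spread of the output is compared before and after one input is hypothetically fixed at a nominal value. The model and input distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 200_000
nominal = {"a": 1.0, "b": 3.0, "c": 0.5}

def model(a, b, c):
    return a * b + np.sqrt(np.abs(c))        # stand-in response function

def sample(pinch=None):
    # Draw all inputs from their (illustrative) distributions, except the
    # pinched one, which is hypothetically fixed at its nominal value.
    a = np.full(N, nominal["a"]) if pinch == "a" else rng.normal(1.0, 0.2, N)
    b = np.full(N, nominal["b"]) if pinch == "b" else rng.uniform(2.0, 4.0, N)
    c = np.full(N, nominal["c"]) if pinch == "c" else rng.normal(0.5, 0.3, N)
    return model(a, b, c)

# Baseline output spread (95% interval width) with every input uncertain.
base = sample()
base_width = np.subtract(*np.percentile(base, [97.5, 2.5]))

# Pinch each input in turn and report the fractional reduction in spread.
for name in ("a", "b", "c"):
    width = np.subtract(*np.percentile(sample(pinch=name), [97.5, 2.5]))
    print(f"pinching {name}: output uncertainty reduced by "
          f"{100.0 * (1.0 - width / base_width):.1f}%")
```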

Proceedings ArticleDOI
01 Jan 2006
TL;DR: In this paper, a unified uncertainty analysis that deals with both aleatory and epistemic uncertainties is investigated, where the input parameters with aleatory uncertainty are modeled with probability distributions by probability theory, and the inputs with epistemic uncertainty are model with basic probability assignment by evidence theory.
Abstract: Both aleatory and epistemic uncertainties exist in engineering applications. Aleatory uncertainty (objective or stochastic uncertainty) describes the inherent variation associated with a physical system or environment. Epistemic uncertainty, on the other hand, is derived from some level of ignorance or incomplete information about a physical system or environment. Aleatory uncertainty associated with parameters is usually modeled by probability theory and has been widely researched and applied by industry, academia, and government. The study of epistemic uncertainty in engineering has recently started. The feasibility of the unified uncertainty analysis that deals with both types of uncertainties is investigated in this paper. The input parameters with aleatory uncertainty are modeled with probability distributions by probability theory, and the input parameters with epistemic uncertainty are modeled with basic probability assignment by evidence theory. The effect of the mixture of both aleatory and epistemic uncertainties on the model output is modeled with belief and plausibility measures (or the lower and upper probability bounds). It is shown that the calculation of belief measure or plausibility measure can be converted to the calculation of the minimum or maximum probability of failure over each of the mutually exclusive subsets of the input parameters with epistemic uncertainty. A First Order Reliability Method (FORM) based algorithm is proposed to conduct the unified uncertainty analysis. Two examples are given for the demonstration. Future research directions are derived from the discussions in this paper.© 2006 ASME
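
A brute-force sketch of the unified analysis (sampling and a grid search in place of the FORM-based algorithm proposed in the paper): for each focal element of the epistemic variable, the minimum and maximum probability of failure over that interval are estimated and weighted by the basic probability assignment to bound the failure probability. The limit state, distributions, and focal elements are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Limit state g(x, d) < 0 denotes failure; x is aleatory (normal), d epistemic.
def limit_state(x, d):
    return 3.0 - x - 0.8 * d                 # illustrative form only

def prob_failure(d, n=100_000):
    x = rng.normal(1.0, 0.5, n)              # aleatory input
    return np.mean(limit_state(x, d) < 0.0)

# Focal elements (intervals) and BPA for the epistemic variable d.
focal = [((0.0, 1.0), 0.5), ((1.0, 2.0), 0.3), ((1.5, 3.0), 0.2)]

lower = upper = 0.0
for (lo, hi), m in focal:
    # Minimum and maximum probability of failure over the interval, found by
    # a coarse grid search (the paper proposes a FORM-based search instead).
    pf = np.array([prob_failure(d) for d in np.linspace(lo, hi, 11)])
    lower += m * pf.min()
    upper += m * pf.max()

print(f"bounds on the probability of failure: [{lower:.4f}, {upper:.4f}]")
```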

Journal ArticleDOI
TL;DR: This paper suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort while yielding results comparable to the full two-dimensional approach.

Journal ArticleDOI
TL;DR: In this paper, the authors used Monte Carlo simulation to assess the reliability of complex structural assemblies, even for extremely high levels of reliability, using a refined finite element model (120,000 DOF).

Journal ArticleDOI
TL;DR: In this article, an approach to uncertainty quantification for nuclear applications is presented that combines the covariance evaluation of differential cross-section data and the error propagation from a matching process.
Abstract: We present an approach to uncertainty quantification for nuclear applications that combines the covariance evaluation of differential cross-section data and the error propagation from matching a cr...

Proceedings ArticleDOI
01 May 2006
TL;DR: The purpose of this study was to address the question: Does a particular RSA type perform better (in terms of a better fit) if a particular sampling method is used?
Abstract: Response surface approximations (RSA) are often used as inexpensive replacements for computationally expensive computer simulations. Once a RSA has been computed, it is cheap to evaluate this “meta-model” or surrogate, and thus the RSA is often used in a variety of contexts, including optimization and uncertainty quantification. Usually, some method of sampling points over the input domain is used to generate samples of the input variables. These samples are run through the computer simulation. A response surface approximation is then generated based on the sample points. This report presents a study investigating the dependency of the response surface method on the sampling type. The purpose of this study was to address the question: Does a particular RSA type perform better (in terms of a better fit) if a particular sampling method is used? The RSA types examined were kriging, polynomial regression, and multivariate adaptive regression splines (MARS). The sampling types examined were Latin Hypercube, Halton, Hammersley, Centroidal Voronoi Tessellation (CVT), and standard Monte Carlo sampling. The example problems were a 5-dimensional version of Rosenbrock’s function and the Paviani function. RSAs of the three response surface types were developed based on the five sampling methods. Performance was compared using ANOVA techniques.
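
A scaled-down sketch in the spirit of the study, with one RSA type (quadratic polynomial regression), two sampling schemes, and a 2-D Rosenbrock-like test function in place of the paper's full matrix of methods: surrogates are fit to LHS and Monte Carlo designs and compared by root-mean-square error on an independent test set. Sample sizes and bounds are illustrative.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)

def rosenbrock(x):
    # 2-D Rosenbrock test function (the paper uses a 5-D version).
    return 100.0 * (x[:, 1] - x[:, 0] ** 2) ** 2 + (1.0 - x[:, 0]) ** 2

def quad_features(x):
    # Full quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2.
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

def to_domain(u):
    return qmc.scale(u, [-2.0, -1.0], [2.0, 3.0])

def fit_and_score(train):
    # Least-squares quadratic response surface, scored on an independent set.
    coef, *_ = np.linalg.lstsq(quad_features(train), rosenbrock(train), rcond=None)
    test = to_domain(rng.uniform(size=(2_000, 2)))
    resid = quad_features(test) @ coef - rosenbrock(test)
    return np.sqrt(np.mean(resid ** 2))

n = 40
lhs_pts = to_domain(qmc.LatinHypercube(d=2, seed=1).random(n))
mc_pts = to_domain(rng.uniform(size=(n, 2)))

print(f"RMSE with LHS design        : {fit_and_score(lhs_pts):.1f}")
print(f"RMSE with Monte Carlo design: {fit_and_score(mc_pts):.1f}")
```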

Journal ArticleDOI
TL;DR: Stable statistics for the solution and for the error are obtained after suitable averaging procedures for chaotic flow simulations.

Journal ArticleDOI
01 Mar 2006-Area
TL;DR: In this paper, an off-the-shelf conceptual rainfall runoff model, HYSIM, is applied to a suite of catchments throughout Ireland in preparation for use in climate impact assessment.
Abstract: Much uncertainty is derived from the application of conceptual rainfall runoff models. In this paper, HYSIM, an ‘off-the-shelf’ conceptual rainfall runoff model, is applied to a suite of catchments throughout Ireland in preparation for use in climate impact assessment. Parameter uncertainty is assessed using the GLUE methodology. Given the lack of source code available for the model, parameter sampling is carried out using Latin hypercube sampling. Uncertainty bounds are constructed for model output. These bounds will be used to quantify uncertainty in future simulations as they include error derived from data measurement, model structure and parameterization.
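
A compact sketch of the GLUE procedure described above, with a toy linear-store model standing in for HYSIM: parameters are Latin hypercube sampled, each set is scored with Nash-Sutcliffe efficiency, sets above a subjective behavioural threshold are retained, and percentile bounds are formed from the retained simulations. All ranges and thresholds are illustrative.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(2)

# Synthetic rainfall and "observed" flow from a toy linear-store model that
# stands in for HYSIM; all values and ranges below are illustrative.
rain = rng.gamma(0.7, 3.0, 120)

def toy_model(p, k, c):
    s, q = 0.0, []
    for pt in p:
        s += c * pt                    # c: effective-rainfall coefficient
        out = k * s                    # k: storage outflow coefficient
        s -= out
        q.append(out)
    return np.array(q)

obs = toy_model(rain, 0.25, 0.6) + rng.normal(0.0, 0.05, 120)

# Latin hypercube sample of the two parameters over broad prior ranges.
params = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(2_000),
                   [0.05, 0.2], [0.6, 1.0])

def nse(sim):
    # Nash-Sutcliffe efficiency used as the informal GLUE likelihood measure.
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

sims = np.array([toy_model(rain, k, c) for k, c in params])
scores = np.array([nse(s) for s in sims])

# Behavioural parameter sets: NSE above a subjective threshold.
behavioural = sims[scores > 0.7]
print(f"behavioural sets: {len(behavioural)} of {len(params)}")

# 5-95% uncertainty bounds across the behavioural simulations (GLUE often
# uses likelihood-weighted quantiles; plain percentiles keep this sketch short).
lo, hi = np.percentile(behavioural, [5, 95], axis=0)
print(f"mean width of the uncertainty bound: {np.mean(hi - lo):.3f}")
```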

Journal ArticleDOI
TL;DR: In this paper, a non-parametric probabilistic method is applied to construct the random matrix model allowing model errors and data errors to be taken into account, and an extension of the nonparametric method for the case of nonhomogeneous model errors through the structure is presented.