
Showing papers on "Uncertainty quantification published in 2001"


Journal ArticleDOI
TL;DR: In this paper, the authors describe the construction and implementation of a stochastic Navier-Stokes solver, which combines a spectral uncertainty representation scheme with a finite difference projection method for flow simulation.

413 citations
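
The "spectral uncertainty representation scheme" in solvers of this kind is a polynomial chaos expansion of the uncertain quantities. As a rough, self-contained illustration of that idea, and not the paper's Navier-Stokes solver, the sketch below propagates a single standard normal input through a hypothetical scalar model using a Hermite chaos expansion, with Gauss-Hermite quadrature for the projection; the model function and chaos order are invented for the example.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def model(xi):
    # Hypothetical scalar response to a standard normal input xi
    # (a stand-in for, e.g., a flow quantity driven by an uncertain viscosity).
    return np.exp(0.3 * xi) + 0.1 * xi**2

P = 6                                   # chaos order (illustrative)
nodes, weights = hermegauss(20)         # probabilists' Gauss-Hermite quadrature

# Spectral projection: a_k = E[f(xi) He_k(xi)] / k!
coeffs = []
for k in range(P + 1):
    He_k = hermeval(nodes, [0] * k + [1])                  # He_k at quadrature nodes
    integral = np.sum(weights * model(nodes) * He_k) / sqrt(2 * pi)
    coeffs.append(integral / factorial(k))
coeffs = np.array(coeffs)

mean = coeffs[0]
variance = sum(coeffs[k]**2 * factorial(k) for k in range(1, P + 1))
print(f"PC mean ≈ {mean:.4f}, PC std ≈ {np.sqrt(variance):.4f}")

# Cross-check against plain Monte Carlo
xi = np.random.default_rng(0).standard_normal(200_000)
print(f"MC mean ≈ {model(xi).mean():.4f}, MC std ≈ {model(xi).std():.4f}")
```

The first coefficient gives the mean and the factorial-weighted sum of squared higher coefficients gives the variance, which is what makes the spectral representation convenient for extracting flow statistics.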


Journal ArticleDOI
TL;DR: An explicit construction and representation of the probability model are obtained; they are well suited to algebraic manipulation and to Monte Carlo simulation for computing the transient responses of structures subjected to impulsive loads.
Abstract: A new approach is presented for analyzing random uncertainties in dynamical systems. It consists of modeling random uncertainties by a nonparametric model that allows the transient responses of mechanical systems subjected to impulsive loads to be predicted in the context of linear structural dynamics. The information used does not require a description of the local parameters of the mechanical model. The probability model is deduced from the entropy optimization principle, the available information consisting of the algebraic properties of the generalized mass, damping, and stiffness matrices, which must be symmetric positive-definite, and the knowledge of these matrices for the mean reduced matrix model. An explicit construction and representation of the probability model are obtained; they are well suited to algebraic manipulation and to Monte Carlo simulation for computing the transient responses of structures subjected to impulsive loads. The fundamental properties related to the convergence of the stochastic solution with respect to the dimension of the random reduced matrix model are analyzed. Finally, an example is presented.

336 citations
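
A crude Monte Carlo sketch in the spirit of the nonparametric approach, though not the paper's maximum-entropy construction: the mean reduced stiffness matrix of a small spring-mass model is replaced by Wishart-distributed random matrices centred on it (symmetric positive-definite by construction), and the spread of the transient response to an impulsive load is sampled. The 3-DOF model, damping level, Wishart degrees of freedom, and integrator settings are all illustrative choices.

```python
import numpy as np
from scipy.stats import wishart
from scipy.integrate import solve_ivp

# Mean reduced model: 3-DOF spring-mass chain (illustrative numbers)
n = 3
K_mean = np.array([[ 2., -1.,  0.],
                   [-1.,  2., -1.],
                   [ 0., -1.,  2.]]) * 1.0e4
M = np.eye(n)                        # unit masses
C = 1.0e-4 * K_mean                  # light proportional damping

df = 30                              # Wishart dof: larger df -> smaller dispersion
t_eval = np.linspace(0.0, 0.5, 500)

def transient(K):
    """Displacement of the last DOF after an impulsive load (unit initial velocity on DOF 0)."""
    def rhs(t, y):
        x, v = y[:n], y[n:]
        return np.concatenate([v, -C @ v - K @ x])   # M = I
    y0 = np.zeros(2 * n)
    y0[n] = 1.0
    sol = solve_ivp(rhs, (0.0, 0.5), y0, t_eval=t_eval, max_step=1e-3)
    return sol.y[2]

# Each sample has E[K] = K_mean and is symmetric positive-definite by construction
K_samples = wishart.rvs(df=df, scale=K_mean / df, size=200, random_state=1)
responses = np.array([transient(K) for K in K_samples])

print("response percentiles (5/50/95) at t ≈ 0.25 s:",
      np.percentile(responses[:, 250], [5, 50, 95]))
```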


Proceedings ArticleDOI
11 Jun 2001
TL;DR: Strengths and weaknesses of evidence theory are discussed, and several important open issues are identified that must be addressed before evidence theory can be used successfully in engineering applications.
Abstract: As widely done in the risk assessment community, a distinction is made between aleatory (random) and epistemic (subjective) uncertainty in the modeling and simulation process. The nature of epistemic uncertainty is discussed, including (1) occurrence in parameters contained in mathematical models of a system and its environment, (2) limited knowledge or understanding of a physical process or interactions of processes in a system, and (3) limited knowledge for the estimation of the likelihood of event scenarios of a system. To clarify the options available for representation of epistemic uncertainty, an overview is presented of a hierarchy of theories of uncertainty. Modern theories of uncertainty can represent much weaker statements of knowledge and more diverse types of uncertainty than traditional probability theory. A promising new theory, evidence (Dempster-Shafer) theory, is discussed and applied to a simple system given by an algebraic equation with two uncertain parameters. Multiple sources of information are provided for each parameter, but each source only provides an interval value for each parameter. The uncertainty in the system response is estimated using probability theory and evidence theory. The resultant solutions are compared with regard to their assessment of the likelihood that the system response exceeds a specified failure level. In this example, a traditional application of probability theory results in a significantly lower estimate of risk of failure as compared to evidence theory. Strengths and weaknesses of evidence theory are discussed, and several important open issues are identified that must be addressed before evidence theory can be used successfully in engineering applications.

261 citations
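
A minimal sketch of the comparison described in the abstract: two parameters known only through interval estimates from multiple sources, a hypothetical algebraic response, and a failure level. Evidence theory yields belief/plausibility bounds on the exceedance probability, while a conventional probabilistic treatment (here, uniform distributions over the pooled intervals) yields a single point estimate. The response function, intervals, and failure level are made up for illustration and are not taken from the paper.

```python
import itertools
import numpy as np

def response(a, b):
    # Hypothetical algebraic system response with two uncertain parameters
    return a + b**2

FAIL = 7.0   # illustrative failure level

# Each source supplies only an interval; equal basic probability assignment per source
intervals_a = [(1.0, 2.0), (1.5, 3.0)]
intervals_b = [(1.0, 2.5), (2.0, 3.0)]

belief = plausibility = 0.0
for (a_lo, a_hi), (b_lo, b_hi) in itertools.product(intervals_a, intervals_b):
    m = (1 / len(intervals_a)) * (1 / len(intervals_b))   # joint BPA of this focal cell
    # The response is monotone in both parameters, so extremes occur at cell corners
    corners = [response(a, b) for a in (a_lo, a_hi) for b in (b_lo, b_hi)]
    r_lo, r_hi = min(corners), max(corners)
    if r_lo > FAIL:          # the whole cell exceeds the failure level
        belief += m
    if r_hi > FAIL:          # some part of the cell can exceed it
        plausibility += m

print(f"Evidence theory:  Bel(R > {FAIL}) = {belief:.3f}, Pl(R > {FAIL}) = {plausibility:.3f}")

# Conventional probabilistic treatment: uniform over the pooled interval of each parameter
rng = np.random.default_rng(0)
a = rng.uniform(min(i[0] for i in intervals_a), max(i[1] for i in intervals_a), 100_000)
b = rng.uniform(min(i[0] for i in intervals_b), max(i[1] for i in intervals_b), 100_000)
print(f"Probability theory (uniform assumption): P(R > {FAIL}) ≈ {np.mean(response(a, b) > FAIL):.3f}")
```

With intervals this wide the probabilistic point estimate falls between the belief and plausibility bounds, which is the kind of contrast the paper draws between the two treatments.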


Journal ArticleDOI
20 Jul 2001-Science
TL;DR: The authors argue that the IPCC's uncertainty assessment process should be improved by adopting a consistent approach to quantifying uncertainty and by focusing the quantification on the few key results most important for policy making.
Abstract: Clear and quantitative discussion of uncertainties is critical for public policy making on climate change. The recently completed report of the Intergovernmental Panel on Climate Change assessed the uncertainty in its findings and forecasts. The uncertainty assessment process of the IPCC should be improved in the future by using a consistent approach to quantifying uncertainty, focusing the quantification on the few key results most important for policy making. The uncertainty quantification procedure should be fully documented, and if expert judgment is used, a specific list of the experts consulted should be included.

190 citations


Proceedings ArticleDOI
01 Jan 2001
TL;DR: In this paper, the uncertainties associated with well placement were addressed within the utility-theory framework using numerical simulation as the evaluation tool, and a hybrid genetic algorithm (HGA) was used for optimization.
Abstract: Determining the best location for new wells is a complex problem that depends on reservoir and fluid properties, well and surface-equipment specifications, and economic criteria. Numerical simulation is often the most appropriate tool to evaluate the feasibility of well configurations. However, because the data used to establish numerical models have uncertainty, so do the model forecasts. The uncertainties in the model translate into uncertainties in the outcomes of well-configuration decisions. We never possess the true and deterministic information about the reservoir, but we may have geostatistical realizations of the truth constructed from the information available. An approach that can translate the uncertainty in the data into uncertainty in the well-placement decision in terms of monetary value was developed in this study. The uncertainties associated with well placement were addressed within the utility-theory framework using numerical simulation as the evaluation tool. The methodology was evaluated by use of the Production forecasting with UNcertainty Quantification (PUNQ)-S3 model, a standard test case that was based on a real field. Experiments were carried out on 23 history-matched realizations, and a truth case was also available. The results were verified by comparison to exhaustive simulations. Utility theory not only offered the framework to quantify the influence of uncertainties in the reservoir description in terms of monetary value, but it also provided the tools to quantify the otherwise arbitrary notion of the risk attitude of the decision maker. A hybrid genetic algorithm (HGA) was used for optimization. In addition, a computationally cheaper alternative was also investigated. The well-placement problem was formulated as the optimization of a random function, with the genetic algorithm (GA) used as the optimization tool. Each time a well configuration was to be evaluated, a different realization of the reservoir properties was selected randomly from the set of realizations, all of which honored the geologic and dynamic data available from the reservoir. Numerical simulation was then carried out with this randomly selected realization to calculate the objective function value. This approach has the potential to incorporate the risk attitudes of the decision maker and was observed to be approximate but computationally feasible.

99 citations
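
A heavily simplified sketch of the optimization-under-uncertainty loop described above: a genetic algorithm searches over candidate well locations on a grid, and each fitness evaluation uses a realization drawn at random from the ensemble, so the geostatistical uncertainty is folded into the search. The reservoir "simulator" below is a toy productivity map per realization, not PUNQ-S3 or a flow simulator, and the GA is a bare-bones implementation rather than the paper's hybrid GA.

```python
import numpy as np

rng = np.random.default_rng(7)
NX = NY = 20
N_REAL = 23                       # size of the (toy) ensemble of history-matched realizations

# Toy "realizations": random productivity maps sharing a common trend
base = np.add.outer(np.linspace(0, 1, NX), np.linspace(0, 1, NY))
realizations = [base + 0.3 * rng.standard_normal((NX, NY)) for _ in range(N_REAL)]

def npv(well, realization):
    """Toy objective standing in for a simulated NPV of one well location."""
    i, j = well
    return realization[i, j]

def random_well():
    return (int(rng.integers(NX)), int(rng.integers(NY)))

def mutate(well):
    i, j = well
    return (int(np.clip(i + rng.integers(-2, 3), 0, NX - 1)),
            int(np.clip(j + rng.integers(-2, 3), 0, NY - 1)))

def crossover(w1, w2):
    return (w1[0], w2[1])

# Bare-bones GA; each fitness evaluation draws a realization at random,
# which is how the geostatistical uncertainty enters the optimization.
pop = [random_well() for _ in range(30)]
for generation in range(40):
    fitness = [npv(w, realizations[rng.integers(N_REAL)]) for w in pop]
    ranked = [w for _, w in sorted(zip(fitness, pop), reverse=True)]
    parents = ranked[:10]
    children = [crossover(parents[k], parents[-1 - k]) for k in range(10)]
    mutants = [mutate(p) for p in parents]
    pop = parents + children + mutants

# Assess the winning location across the whole ensemble (spread = decision uncertainty)
best = pop[0]
values = [npv(best, r) for r in realizations]
print("best well:", best, "NPV over ensemble:",
      round(float(np.mean(values)), 3), "±", round(float(np.std(values)), 3))
```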


ReportDOI
01 Apr 2001
TL;DR: This report serves as a reference manual for the commands specification of the Dakota software, providing input overviews, option descriptions, and example specifications.
Abstract: The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification of the Dakota software, providing input overviews, option descriptions, and example specifications.

80 citations
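
The report itself is Dakota's input-specification manual. As a rough orientation only, the kind of sampling-based UQ study that Dakota's sampling methods drive can be mimicked in a few lines of plain Python (Latin hypercube sampling of two uncertain inputs through a user-supplied simulation driver); the driver function and variable ranges below are placeholders, and this is not Dakota input syntax.

```python
import numpy as np
from scipy.stats import qmc

def simulation_driver(x1, x2):
    """Placeholder for the user's simulation code (Dakota would call it through an interface)."""
    return x1**2 + np.sin(3.0 * x2)

# Two uniform uncertain variables on illustrative ranges
lower, upper = np.array([0.0, -1.0]), np.array([2.0, 1.0])

sampler = qmc.LatinHypercube(d=2, seed=0)
samples = qmc.scale(sampler.random(n=200), lower, upper)

responses = np.array([simulation_driver(x1, x2) for x1, x2 in samples])

# The kind of summary statistics a sampling study reports
print("mean           :", responses.mean())
print("std dev        :", responses.std(ddof=1))
print("95th percentile:", np.percentile(responses, 95))
```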


Journal ArticleDOI
TL;DR: In this article, a nonparametric probability model for random uncertainties, constructed by entropy optimization, is proposed; it allows the transient responses of mechanical systems subjected to impulsive loads to be predicted in the context of linear structural dynamics.

42 citations


Journal ArticleDOI
TL;DR: A general fuzzy-stochastic, risk-based decision analysis methodology is provided, fuzzy-set quantification of subjective MOB information is demonstrated, and fuzzy-Bayesian updating of the state of knowledge of MOB cost and schedule information as it is piecewise accumulated is discussed.

26 citations


ReportDOI
01 Aug 2001
TL;DR: This publication overviews a project at Los Alamos National Laboratory that aims at developing a methodology for quantifying uncertainty and assessing the total predictability of structural dynamics simulations.
Abstract: Uncertainty quantification is an emergent field in engineering mechanics that makes use of statistical sampling, hypothesis testing and input-output effect analysis to characterize the effect that parametric and non-parametric uncertainty has on physical experiment or numerical simulation output. This publication overviews a project at Los Alamos National Laboratory that aims at developing a methodology for quantifying uncertainty and assessing the total predictability of structural dynamics simulations. The propagation of parametric variability through numerical simulations is discussed. Uncertainty assessment is also a critical component of model validation, where the total error between physical observation and model prediction must be characterized. The purpose of model validation is to assess the extent to which a model is an appropriate representation of reality, given the purpose intended for the numerical simulation and its domain of applicability. The discussion is illustrated with component-level and system-level validation experiments that feature the response of nonlinear models to impulse excitation sources. This publication is unclassified; it has been approved for unlimited, public release (number LA-UR-01-3828).

22 citations
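
A compact sketch of the three ingredients named in the abstract, applied to a toy single-degree-of-freedom impact problem: statistical sampling of the uncertain parameters, propagation through the simulation, and an input-output effect analysis via rank correlations. The oscillator, parameter ranges, and output feature (peak response) are illustrative choices, not the project's models.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
N = 500

# Statistical sampling of uncertain inputs (illustrative ranges)
k    = rng.uniform(8.0e3, 1.2e4, N)     # stiffness [N/m]
zeta = rng.uniform(0.01, 0.05, N)       # damping ratio [-]
v0   = rng.normal(1.0, 0.1, N)          # impulse-induced initial velocity [m/s]

def peak_response(k_i, zeta_i, v0_i, m=1.0):
    """Analytical peak displacement of a damped SDOF oscillator after an impulse."""
    wn = np.sqrt(k_i / m)
    wd = wn * np.sqrt(1.0 - zeta_i**2)
    t = np.linspace(0.0, 4.0 * np.pi / wd, 2000)
    x = (v0_i / wd) * np.exp(-zeta_i * wn * t) * np.sin(wd * t)
    return np.max(np.abs(x))

peaks = np.array([peak_response(*p) for p in zip(k, zeta, v0)])

# Propagated output statistics
print(f"peak displacement: mean = {peaks.mean():.4e}, std = {peaks.std(ddof=1):.4e}")

# Input-output effect analysis: rank correlation of each input with the output
for name, x in (("k", k), ("zeta", zeta), ("v0", v0)):
    rho, _ = spearmanr(x, peaks)
    print(f"Spearman rho({name}, peak) = {rho:+.2f}")
```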


01 Jan 2001
TL;DR: In this article, a method based on Bayesian statistics is presented, where the prior distribution representing information about the uncertainty of the statistical parameters can be updated to the posterior distribution as soon as data becomes available.
Abstract: Probabilistic design of structures is usually based on estimates of a design load with a high average return period. Design loads are often estimated using classical statistical methods. A shortcoming of this approach is that statistical uncertainties are not taken into account. In this paper, a method based on Bayesian statistics is presented. Using Bayes' theorem, the prior distribution representing information about the uncertainty of the statistical parameters can be updated to the posterior distribution as soon as data becomes available. Seven predictive probability distributions are considered for determining extreme quantiles of loads: the exponential, Rayleigh, normal, lognormal, gamma, Weibull, and Gumbel. The Bayesian method has been successfully applied to estimate the design discharge of the river Rhine while taking account of the statistical uncertainties involved. The non-informative Jeffreys prior was chosen as the prior. The Bayes estimates are compared to the classical maximum-likelihood estimates. Furthermore, so-called Bayes factors are used to determine weights corresponding to how well a probability distribution fits the observed data; that is, the better the fit, the higher the weighting.
2 STATISTICAL UNCERTAINTIES. According to (amongst others) Slijkhuis et al. (1999) and Siu & Kelly (1998), uncertainties in risk analysis can primarily be divided into two categories: inherent uncertainties and epistemic uncertainties. Inherent uncertainties represent randomness or variability in nature. For example, even in the event of sufficient data, one cannot predict the maximum discharge that will occur next year. The two main types of inherent uncertainty are inherent uncertainty in time (e.g., fluctuation of the discharge in time) and inherent uncertainty in space (e.g., fluctuation of a dike height in space). It is not possible to reduce inherent uncertainty in time. Epistemic uncertainties represent the lack of knowledge about a (physical) system. The two main types of epistemic uncertainty are statistical uncertainty (due to lack of sufficient data) and model uncertainty (due to lack of understanding of the physics). Statistical uncertainty can be parameter uncertainty (when the parameters of the distribution are unknown) or distribution-type uncertainty (when the type of distribution is unknown). In principle, epistemic uncertainties can be reduced as knowledge increases and more data becomes available.
3 BAYESIAN ESTIMATION. The only statistical theory which combines modelling of inherent uncertainty and statistical uncertainty is Bayesian statistics. The theorem of Bayes (1763) provides a solution to the problem of how to learn from data. In the framework of estimating the parameters θ = (θ1, ..., θd) of a probability distribution with density l(x | θ), Bayes' theorem can be written as π(θ | x) ∝ l(x | θ) π(θ), i.e., the posterior density of θ given the data x is proportional to the likelihood times the prior density.

10 citations
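
The paper works with seven candidate distributions, the Jeffreys prior, and Bayes factors; the mechanics of "update the prior with data, then take extreme quantiles from the predictive distribution" can be shown with the simplest conjugate case, an exponential model for annual maxima with a gamma prior. The data and prior parameters below are invented for illustration and are not the paper's Rhine analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented "annual maximum discharge" data (arbitrary units)
data = rng.exponential(scale=2500.0, size=30)
n, s = len(data), data.sum()

# Exponential likelihood with rate lam; conjugate Gamma(a, b) prior on lam
a0, b0 = 1e-3, 1e-3              # weak prior (illustrative; the paper uses the Jeffreys prior)
a_post, b_post = a0 + n, b0 + s

# Posterior predictive quantile by Monte Carlo: integrate out the parameter uncertainty
lam = rng.gamma(a_post, 1.0 / b_post, size=200_000)   # numpy's gamma takes a scale parameter
x_pred = rng.exponential(scale=1.0 / lam)
p = 1.0 - 1.0 / 1250.0                                # e.g. a 1250-year return level
bayes_quantile = np.quantile(x_pred, p)

# Classical plug-in estimate ignores the statistical (parameter) uncertainty
lam_ml = n / s
ml_quantile = -np.log(1.0 - p) / lam_ml

print(f"Bayesian predictive 1250-yr level : {bayes_quantile:,.0f}")
print(f"Maximum-likelihood plug-in level  : {ml_quantile:,.0f}")
```

The predictive quantile sits above the plug-in estimate because the parameter uncertainty fattens the upper tail, which is exactly the effect the paper argues should be carried into design loads.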



01 May 2001
TL;DR: The intimate relationship between modeling and uncertainty is explored by defining uncertainty as an integral part of the model, not just parametric variability or the lack of knowledge about the physical system being investigated.
Abstract: This publication addresses the issues of modeling, uncertainty quantification, model validation, and numerical predictability. With the increasing role of numerical simulation in science, technology, and everyday decision-making, assessing the predictive accuracy of computer models becomes essential. Conventional approaches such as finite element model updating or Bayesian inference are undeniably useful tools, but they do not fully answer the question: how accurately does the model represent reality? First, the evolution of scientific computing and its consequences in terms of modeling and analysis practices are discussed. The intimate relationship between modeling and uncertainty is explored by defining uncertainty as an integral part of the model, not just parametric variability or the lack of knowledge about the physical system being investigated. Examples from nuclear physics and structural dynamics are provided to illustrate issues related to uncertainty, validation, and predictability. Finally, feature extraction, or the characterization of the dynamics of interest from time series, is discussed.


01 Mar 2001
TL;DR: A paradigm from the field of statistical pattern recognition has been adopted, which generalises the extraction of corresponding “features” from the experimental data and the simulated data, and treats the comparison of these sets of features as a statistical test.
Abstract: The field of computational structural dynamics is on the threshold of revolutionary change. The ever-increasing costs of physical experiments coupled with advances in massively parallel computer architecture are steering the engineering analyst to be more and more reliant on numerical calculations with little to no data available for experimental confirmation. New areas of research in engineering analysis have come about as a result of the changing roles of computations and experiments. Whereas in the past the primary function of physical experiments has been to confirm or “prove” the accuracy of a computational simulation, the new environment of engineering is forcing engineers to allocate precious experimental resources differently. Rather than trying to “prove” whether a calculation is correct, the focus is on learning how to use experimental data to “improve” the accuracy of computational simulations. This process of improving the accuracy of calculations through the use of experimental data is termed “model validation.” Model validation emphasises the need for quantitative techniques of assessing the accuracy of a computational prediction with respect to experimental measurements, taking into account that both the prediction and the measurement have uncertainties associated with them. The “vugraph norm,” where one overlays transparencies of simulated data and experimental data in an attempt to show consistency, is no longer an adequate means of demonstrating validity of predictions. To approach this problem, a paradigm from the field of statistical pattern recognition has been adopted [1]. This paradigm generalises the extraction of corresponding “features” from the experimental data and the simulated data, and treats the comparison of these sets of features as a statistical test. The parameters that influence the output of the simulation (such as equation parameters, initial and boundary conditions, etc.) can then be adjusted to minimise the distance between the data sets as measured via the statistical test. However, the simple adjustment of parameters to calibrate the simulation to the test data does not fully accomplish the goal of “improving” the ability to model effectively, as there is no indication that the model will maintain accuracy at any other experimental data points. Effective model validation requires “uncertainty quantification” to ensure that the adequate agreement achieved between the numerical prediction and the experimental measurement is robust to changes in the experimental conditions. Uncertainty quantification refers to the exploration and understanding of the sources of uncertainty in a simulation: • Solution uncertainties, such as errors introduced by spatial and temporal discretization, as well as model form errors
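
A toy rendering of the paradigm sketched above: extract the same features (here, dominant frequency and RMS level) from "experimental" and simulated time histories, measure their disagreement with a statistical distance, and adjust a model parameter to minimize it. Everything here, the signal model, the features, the normalized distance, and scipy's scalar minimizer, is an illustrative choice rather than the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
t = np.linspace(0.0, 2.0, 2000)

def simulate(freq_hz, noise=0.0):
    """Toy 'structure': a decaying sinusoid plus optional measurement noise."""
    y = np.exp(-1.5 * t) * np.sin(2.0 * np.pi * freq_hz * t)
    return y + noise * rng.standard_normal(t.size)

def features(y):
    """Feature vector: dominant frequency [Hz] and RMS level."""
    spec = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
    return np.array([freqs[np.argmax(spec[1:]) + 1], np.sqrt(np.mean(y**2))])

# Repeated "experiments" at an unknown true frequency of 7.3 Hz
exp_features = np.array([features(simulate(7.3, noise=0.05)) for _ in range(20)])
mu = exp_features.mean(axis=0)
# Floor the scatter so a nearly constant feature does not cause division by ~zero
sigma = np.maximum(exp_features.std(axis=0, ddof=1), 0.05 * np.abs(mu))

def distance(freq_hz):
    """Normalized distance between simulated and experimental feature sets."""
    return float(np.sum(((features(simulate(freq_hz)) - mu) / sigma) ** 2))

result = minimize_scalar(distance, bounds=(5.0, 10.0), method="bounded")
print(f"calibrated frequency ≈ {result.x:.1f} Hz (true 7.3 Hz), feature distance = {result.fun:.2f}")
```

As the abstract cautions, calibrating parameters to one data set does not by itself validate the model; the calibrated feature distance still has to be checked against new experimental conditions.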

Journal ArticleDOI
TL;DR: The neighbourhood algorithm (NA) is a recently proposed direct search (i.e. derivative-free) approach to nonlinear inversion which is finding an increasing number of applications, from problems in earthquake seismology to production uncertainty quantification in oil reservoirs.
Abstract: The neighbourhood algorithm (NA) is a recently proposed direct search (i.e. derivative-free) approach to nonlinear inversion which is finding an increasing number of applications, from problems in earthquake seismology to production uncertainty quantification in oil reservoirs. An inverse problem occurs whenever data only indirectly constrain some physical, chemical, or other parameters of interest, for example when seismic data collected at the Earth's surface are used to constrain structure at depth. Inverse problems occur in many areas of the physical and mathematical sciences. The NA is applicable in cases where the relationship between unknowns and observations is highly nonlinear and simple derivative calculations are undesirable, or impossible. NA is in the same class of techniques as genetic algorithms (GA) and simulated annealing (SA), which are often associated with global optimization. The NA makes use of simple geometrical concepts and requires just two tuning parameters, yet it has been shown to produce a sophisticated 'self-adaptive' search behaviour in multi-dimensional parameter spaces. It also allows a fully nonlinear estimation of the uncertainty in the unknowns arising from noise, or other uncertainties, in the data.
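
A heavily simplified, rejection-based sketch of an NA-flavoured search, keeping only the two tuning parameters the abstract mentions (ns new samples per iteration, nr cells to resample): candidates are drawn around the nr current best models and accepted only if their nearest neighbour in the full ensemble is one of those best models, which crudely mimics sampling inside their Voronoi cells. Sambridge's algorithm uses an exact Gibbs walk inside the cells; this is only an illustration on a made-up two-parameter misfit function.

```python
import numpy as np

rng = np.random.default_rng(11)
ns, nr = 20, 5                               # the NA's two tuning parameters
lo, hi = np.array([-3.0, -3.0]), np.array([3.0, 3.0])

def misfit(m):
    """Made-up nonlinear misfit with its minimum near (1.0, -0.5)."""
    return (m[0] - 1.0) ** 2 + 3.0 * (m[1] + 0.5) ** 2 + 0.3 * np.sin(4.0 * m[0]) ** 2

models = rng.uniform(lo, hi, size=(ns, 2))   # initial random ensemble
values = np.array([misfit(m) for m in models])

for iteration in range(40):
    best = models[np.argsort(values)[:nr]]   # the nr best models so far
    step = 0.5 * 0.9 ** iteration            # proposal scale shrinks as the search focuses
    new = []
    while len(new) < ns:
        centre = best[rng.integers(nr)]
        for _ in range(100):                 # crude Voronoi test with a retry guard
            cand = np.clip(centre + step * rng.standard_normal(2), lo, hi)
            d_best = np.linalg.norm(best - cand, axis=1).min()
            d_all = np.linalg.norm(models - cand, axis=1).min()
            if d_best <= d_all:              # nearest existing model is one of the best
                break
        new.append(cand)
    models = np.vstack([models, new])
    values = np.concatenate([values, [misfit(m) for m in new]])

i_best = int(np.argmin(values))
print("best model:", np.round(models[i_best], 3), "misfit:", round(float(values[i_best]), 4))
```

In the full algorithm, the ensemble retained at the end (models plus misfits) is what feeds the nonlinear appraisal of uncertainty in the unknowns.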

01 Apr 2001
TL;DR: A previously developed model updating technique for nonlinear structural dynamics models is applied to data from repeated experimental trials to estimate the distributions of four key input parameters for a transient impact event.
Abstract: Finite element model validation is a topic of current interest to many researchers in the field of linear and nonlinear structural dynamics. Model validation refers to "substantiation that a model, within its domain of applicability, possesses a satisfactory range of accuracy consistent with the intended application of the model" [1]. Validation is accomplished primarily by comparison of simulation results to experimental results to confirm the accuracy of the mechanics models in the simulation and the values of the parameters employed in the simulation, and to explore how the simulation might be improved. The assessment of uncertainties in the simulation mechanics models and their associated parameters plays a critical role in the credible validation of nonlinear structural dynamics models. The study of the effects that these uncertainties produce is termed uncertainty quantification (UQ). A major issue in UQ is the determination of how the distributions of the model parameters (which essentially form a set of inputs to the simulation) should be represented in order to accurately reflect the real-world response of the structure. In the case of repeated experiments, it is sometimes adequate to monitor the values of the input variables (e.g., forces, temperatures, velocities) and estimate a distribution from these observations. However, in many structural dynamics experiments, there can be significant input variables that are either unmeasurable (such as the actual orientation of parts during an impact event) or unmeasured (such as the level of torque applied to an interface during assembly). In these cases, it is necessary to estimate the distributions of the key input variables by indirect means. In this paper, a previously developed model updating technique for nonlinear structural dynamics models is applied to data from repeated experimental trials to estimate the distributions of four key input parameters for a transient impact event. The model updating technique itself, along with the selection of the key simulation parameters, is not the focus of this paper, and so these issues are only addressed in summary form.
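
A schematic version of the idea: for each repeated experimental trial, calibrate the unmeasured simulation input so the simulation matches that trial, then pool the per-trial estimates into an empirical distribution. The one-parameter "simulation", the synthetic trials, and the least-squares updater below are stand-ins; the paper applies a previously developed updating technique to four parameters of a transient impact model.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(9)
t = np.linspace(0.0, 0.1, 400)

def simulate(decay):
    """Toy transient response governed by one unmeasured input (a decay rate)."""
    return np.exp(-decay * t) * np.sin(2.0 * np.pi * 80.0 * t)

# Synthetic repeated trials: the unmeasured input varies from test to test
true_decays = rng.normal(35.0, 4.0, size=15)
trials = [simulate(d) + 0.02 * rng.standard_normal(t.size) for d in true_decays]

# Model updating per trial: calibrate the unmeasured input to that trial's data
estimates = []
for measured in trials:
    fit = least_squares(lambda p: simulate(p[0]) - measured, x0=[20.0], bounds=(1.0, 100.0))
    estimates.append(fit.x[0])
estimates = np.array(estimates)

# The pooled per-trial estimates form the empirical input distribution used for forward UQ
print(f"estimated decay-rate distribution: mean = {estimates.mean():.1f}, "
      f"std = {estimates.std(ddof=1):.1f} (truth: 35.0, 4.0)")
```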