
Showing papers on "Uncertainty quantification published in 1999"


Journal ArticleDOI
TL;DR: The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems, such as those involved in the interactions between human activity, climate and the ecosystem.
Abstract: The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems such as those involved in the interactions between human activity, climate and the ecosystem. Uncertainty about model structure may become as, or more important than, uncertainty about parameter values. When uncertainty grows so large that prediction or optimization no longer makes sense, it may still be possible to use the model as a "behavioral test bed" to examine the relative robustness of alternative observational and behavioral strategies. When models must be run into portions of their phase space that are not well understood, different submodels may become unreliable at different rates. A common example involves running a time-stepped model far into the future. Several strategies can be used to deal with such situations. The probability of model failure can be reported as a function of time. Possible alternative "surprises" can be assigned probabilities, modeled separately, and combined. Finally, through the use of subjective judgments, one may be able to combine models, and over time shift between them, moving from more detailed to progressively simpler order-of-magnitude models, and perhaps ultimately, on to simple bounding analysis.

76 citations


Proceedings ArticleDOI
01 Jan 1999
TL;DR: In this article, the authors introduce the concept of a quality map, which is a two-dimensional representation of the reservoir responses and their uncertainties, and apply it to compare reservoirs, to rank stochastic realizations and to incorporate reservoir characterization uncertainty into decision making, such as choosing well locations, with fewer full field simulation runs.
Abstract: The parameters that govern fluid flow through heterogeneous reservoirs are numerous and uncertain. Even when it is possible to visualize all the parameters together, the complex and non-linear interaction between them makes it difficult to predict the dynamic reservoir responses to production. A flow simulator may be used to evaluate the responses and make reservoir management decisions, but normally only one deterministic set of parameters is considered and no uncertainty is associated with the responses or taken into account for the decisions. This paper introduces the concept of a quality map, which is a two-dimensional representation of the reservoir responses and their uncertainties. The quality concept may be applied to compare reservoirs, to rank stochastic realizations and to incorporate reservoir characterization uncertainty into decision making, such as choosing well locations, with fewer full field simulation runs. The data points necessary to generate the quality map are obtained by running a flow simulator with a single well and varying the location of the well in each run to obtain good coverage of the entire horizontal grid. The quality for each position of the well is the cumulative oil production after a long production period. The geological model uncertainty is captured by multiple stochastic realizations. One quality map is generated for each realization and the difference between the realization maps is a measure of the uncertainty in the flow responses. For each cell, the lower quartile of the local distribution of quality is extracted to build a map, which can be used for decision making that accounts for uncertainty. The methodology for building the quality map is presented in detail and the applications of the map are demonstrated with fifty realistic reservoir models.
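The lower-quartile map described above can be sketched in a few lines. This is an illustrative stand-in, not the paper's workflow: the array of quality maps is random here, whereas in the paper each map comes from single-well flow-simulation runs over one stochastic realization.

```python
import numpy as np

# Hypothetical illustration: each realization yields a 2-D "quality map" of
# cumulative oil production per candidate well cell. Stacking the maps from
# all stochastic realizations gives a local distribution of quality per cell;
# the decision map keeps the lower quartile of that distribution.
rng = np.random.default_rng(0)
n_realizations, nx, ny = 50, 20, 30

# Stand-in for 50 simulator-derived quality maps (one per realization).
quality = rng.lognormal(mean=10.0, sigma=0.5, size=(n_realizations, nx, ny))

# Per-cell lower quartile: a pessimistic quality map that accounts for
# geological uncertainty across realizations.
q25_map = np.percentile(quality, 25, axis=0)

# Per-cell spread between realizations: a measure of response uncertainty.
spread_map = quality.max(axis=0) - quality.min(axis=0)

# A simple decision rule: propose the well location with the best
# lower-quartile quality.
best_cell = np.unravel_index(np.argmax(q25_map), q25_map.shape)
print(q25_map.shape, best_cell)
```

Using the lower quartile rather than the mean makes the well-placement decision conservative with respect to geological uncertainty, which is the point of the quality-map approach.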

62 citations


Journal ArticleDOI
O. J. Lepine, R. C. Bissell, S. I. Aanonsen, I. Pallister, J. W. Barker
TL;DR: In this article, the authors demonstrate how gradient-based techniques can be used to estimate uncertainty in predictive reservoir simulations made following history-matching, and discuss how the gradient calculations can help in finding the parameters that make the largest contributions to the uncertainty.
Abstract: We demonstrate how gradient-based techniques can be used to estimate uncertainty in predictive reservoir simulations made following history-matching. We discuss how the gradient calculations can help in finding the parameters that make the largest contributions to the uncertainty, estimating the uncertainty on these parameters, and estimating the uncertainty in predicted quantities by use of a linear analysis. The methodology is illustrated on two field examples, and compared with other methods of uncertainty quantification.
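The linear analysis mentioned above can be illustrated with a first-order propagation formula: the prediction variance is g^T C g, where g holds the gradients of the predicted quantity with respect to the history-match parameters and C is the parameter covariance. The numbers below are invented for illustration and are not from the paper.

```python
import numpy as np

# Hypothetical first-order (linearized) uncertainty analysis: gradients of a
# predicted quantity with respect to history-match parameters, combined with
# an assumed parameter covariance from the match.
g = np.array([120.0, -35.0, 8.0])   # d(prediction)/d(parameter), invented
C = np.diag([0.04, 0.10, 0.25])     # parameter covariance, invented

pred_var = g @ C @ g                # linearized prediction variance g^T C g
pred_std = np.sqrt(pred_var)

# With a diagonal C, each parameter's contribution to the variance separates,
# which identifies the parameters that matter most for the uncertainty.
contrib = g**2 * np.diag(C)
ranking = np.argsort(contrib)[::-1]
print(pred_std, ranking)
```

Ranking parameters by their variance contribution is what focuses subsequent effort (more data, finer matching) on the inputs that drive predictive uncertainty.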

37 citations


Journal ArticleDOI
TL;DR: Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model, and the Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
Abstract: Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the “ensemble forecasting” technique, none of which can alone produce a probabilistic forecast that quantifies the total uncertainty, but each can serve as a component of the BFS.
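The BPF idea can be sketched in its simplest conjugate-Gaussian form: the deterministic model outputs an estimate y of the predictand w, and a prior on w plus a calibrated likelihood of y given w combine into a posterior. This is a minimal sketch of the general idea, not the paper's formulation, and all numbers are invented.

```python
import numpy as np

# Minimal Gaussian sketch of a Bayesian Processor of Forecast: combine a
# prior on the predictand with the deterministic model's output, treated as
# a noisy observation whose error spread comes from calibration.
mu0, s0 = 5.0, 2.0   # prior mean and std of the predictand (assumed)
y, s = 6.5, 1.0      # model output and its error std (assumed calibrated)

# Conjugate Gaussian update: posterior precision is the sum of precisions.
post_prec = 1 / s0**2 + 1 / s**2
post_var = 1 / post_prec
post_mean = post_var * (mu0 / s0**2 + y / s**2)
print(post_mean, np.sqrt(post_var))
```

The posterior spread is smaller than either the prior's or the model error's alone, which is how the deterministic output is turned into a probabilistic forecast that quantifies total uncertainty.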

29 citations



Journal ArticleDOI
TL;DR: A methodology is presented for estimating the risk due to the phenomenon of boiling liquid expanding vapour explosion (BLEVE) in the presence of uncertainties in both the models and their parameters. One conclusion is that the uncertainty in the probability of loss of life is due mainly to the model of the physical phenomenon rather than to the uncertainties of the dose-response model.

23 citations


Journal ArticleDOI
TL;DR: In this article, a small-break loss-of-coolant accident with the break in the cold leg of a Westinghouse-type two-loop pressurized water reactor was selected for the analysis, and the CSAU methodology was used for uncertainty quantification.
Abstract: When best-estimate calculations are performed, the uncertainties need to be quantified. Worldwide, various methods have been proposed for this quantification. Rather than proposing a new uncertainty methodology, a contribution is made to the existing code scaling, applicability, and uncertainty (CSAU) method. A small-break loss-of-coolant accident with the break in the cold leg of a Westinghouse-type two-loop pressurized water reactor was selected for the analysis, and the CSAU methodology was used for uncertainty quantification. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. Some tools suggested by the uncertainty methodology based on accuracy extrapolation (UMAE) method were successfully applied to improve the CSAU methodology, particularly for nodalization qualification. A critical scenario with core uncovery was selected for the analysis, which showed that when uncertainty is added to the peak cladding temperature, the safety margin is sufficient. The tools developed by the UMAE method showed that the structure of the CSAU method is universal because it does not prescribe tools for the analysis.

23 citations


Journal ArticleDOI
TL;DR: A component importance ranking is proposed on the basis of the expected value of perfect information about the reliability of each component to determine the optimum component testing scheme prior to deciding on the system's certification.
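A loose illustration of ranking components by the value of information about their reliabilities: here a variance-based proxy (how much of the system-reliability variance each component's uncertain reliability explains) stands in for the paper's decision-theoretic expected value of perfect information, whose details are not reproduced. The system, distributions, and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical three-component series system; each component's reliability is
# itself uncertain (Beta-distributed, as from limited test data).
p = np.column_stack([
    rng.beta(50, 2, n),   # component 0: well tested, little uncertainty
    rng.beta(8, 2, n),    # component 1: poorly tested, large uncertainty
    rng.beta(20, 2, n),   # component 2: in between
])
r_sys = p.prod(axis=1)    # series-system reliability per Monte Carlo sample

def main_effect(x, y, bins=50):
    # Var(E[y | x]) estimated by binning x into equal-count bins: a proxy for
    # how much learning x (perfect information) would reduce Var(y).
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    idx = np.digitize(x, edges)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return (counts * (cond_means - y.mean()) ** 2).sum() / len(y)

voi = np.array([main_effect(p[:, i], r_sys) for i in range(3)])
ranking = np.argsort(voi)[::-1]   # test the most informative component first
print(ranking)
```

Under this proxy, testing effort goes first to the component whose reliability uncertainty dominates the system-level uncertainty, which mirrors the paper's goal of choosing a testing scheme before certification.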

15 citations


Journal ArticleDOI
TL;DR: In this article, an optimal statistical estimator (OSE) algorithm is adapted, extended, and used for response surface generation to demonstrate the algorithm's applicability to evaluating uncertainties in single-value or time-dependent parameters.
Abstract: When best-estimate calculations are performed, uncertainty needs to be quantified. An optimal statistical estimator (OSE) algorithm is adapted, extended, and used for response surface generation to demonstrate the algorithm's applicability to evaluating uncertainties in single-value or time-dependent parameters. A small-break loss-of-coolant accident with the break in the cold leg of a two-loop pressurized water reactor is selected for analysis. The code scaling, applicability, and uncertainty (CSAU) method was used for uncertainty quantification. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The study shows that an OSE can be efficiently used instead of regression analysis for response surface generation. With the OSE, optimal information obtained from the code calculation is used for response surface generation. This finding indicates that by increasing the number of code calculations, one increases the confidence level of the uncertainty bounds. Increasing the number of calculations also results in convergence of the peak cladding temperature. As uncertainty can be evaluated for time-dependent parameters, the OSE tool makes the CSAU method universal for evaluating uncertainties of transients other than those of a loss-of-coolant accident.
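The response-surface role in such a workflow can be sketched generically: a handful of expensive "code calculations" map uncertain inputs to a response, a cheap surface is fitted to those runs, and the surface is sampled to estimate uncertainty bounds. A Gaussian kernel smoother stands in here for the paper's OSE, and the inputs and response are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for 30 expensive code runs: 2 uncertain inputs -> a scalar
# response (e.g. a peak-cladding-temperature-like quantity, invented).
x_runs = rng.uniform(0, 1, size=(30, 2))
y_runs = 900 + 150 * x_runs[:, 0] - 60 * x_runs[:, 1]
y_runs += rng.normal(0, 5, 30)          # calculation noise, invented

def surface(x, h=0.15):
    # Nadaraya-Watson kernel smoother: a generic response surface fitted to
    # the code runs (NOT the paper's optimal statistical estimator).
    d2 = ((x[:, None, :] - x_runs[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * h**2))
    return (w * y_runs).sum(1) / w.sum(1)

# Propagate input uncertainty through the cheap surface instead of the code.
x_mc = rng.uniform(0, 1, size=(5000, 2))
y_mc = surface(x_mc)
lo, hi = np.percentile(y_mc, [5, 95])
print(lo, hi)
```

The point of the surrogate is that the 5000 Monte Carlo evaluations cost almost nothing once the 30 code runs are done; adding code runs tightens the surface and hence the confidence in the bounds.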

10 citations


Book ChapterDOI
TL;DR: In this paper, a structured approach for uncertainty quantification that uses a generic description of the life prediction process is presented, based on an approximate error propagation theory combined with a unified treatment of random and systematic errors.
Abstract: A failure in an aircraft jet engine can have severe consequences which cannot be accepted, and high requirements are therefore placed on engine reliability. Consequently, assessing the reliability of the life predictions used in design and maintenance is important. To assess the validity of the predicted life, a method is developed to quantify the contribution to the total uncertainty in the life prediction from different uncertainty sources. The method is a structured approach for uncertainty quantification that uses a generic description of the life prediction process. It is based on an approximate error propagation theory combined with a unified treatment of random and systematic errors. The result is an approximate statistical distribution for the predicted life. The method is applied to life predictions for three different jet engine components. The total uncertainty was of a reasonable order of magnitude, and a good qualitative picture of the distribution of the uncertainty contributions from the different sources was obtained. The relative importance of the uncertainty sources differs between the three components and is also highly dependent on the methods and assumptions used in the life prediction. Advantages and disadvantages of the method are discussed.
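A first-order propagation with a unified treatment of random and systematic errors can be sketched as follows: each source contributes sensitivity² × (σ_random² + σ_systematic²) to the variance of the (log) predicted life. This is a generic illustration of that idea, not the paper's method; the sources, sensitivities, and sigmas are invented.

```python
# Hedged sketch of approximate error propagation with random and systematic
# errors treated uniformly. Each entry: (sensitivity of log-life to the
# source, random-error sigma, systematic-error sigma) -- all invented.
sources = {
    "stress":      (-4.0, 0.03, 0.02),
    "temperature": (-1.5, 0.05, 0.05),
    "material":    ( 1.0, 0.10, 0.08),
}

# First-order variance contribution per source: s^2 * (sr^2 + ss^2).
contrib = {
    name: s**2 * (sr**2 + ss**2)
    for name, (s, sr, ss) in sources.items()
}
total_var = sum(contrib.values())
total_std = total_var**0.5

# Relative importance of each source: the qualitative picture the method
# aims at, showing where to invest in better data or models.
share = {name: c / total_var for name, c in contrib.items()}
print(total_std, max(share, key=share.get))
```

Note how a high sensitivity can make a well-characterized source (small sigmas) dominate the total uncertainty, which is why the ranking depends strongly on the life-prediction method and assumptions.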

2 citations