
Showing papers on "Uncertainty quantification" published in 2003


01 Apr 2003
TL;DR: In this paper, the authors developed a stochastic flood risk model consisting of simplified model components associated with the components of the process chain and used them for the risk and uncertainty analysis in a Monte Carlo framework.
Abstract: Flood disaster mitigation strategies should be based on a comprehensive assessment of the flood risk combined with a thorough investigation of the uncertainties associated with the risk assessment procedure. Within the "German Research Network of Natural Disasters" (DFNK) the working group "Flood Risk Analysis" investigated the flood process chain from precipitation, runoff generation and concentration in the catchment, flood routing in the river network, possible failure of flood protection measures, inundation to economic damage. The working group represented each of these processes by deterministic, spatially distributed models at different scales. While these models provide the necessary understanding of the flood process chain, they are not suitable for risk and uncertainty analyses due to their complex nature and high CPU-time demand. We have therefore developed a stochastic flood risk model consisting of simplified model components associated with the components of the process chain. We parameterised these model components based on the results of the complex deterministic models and used them for the risk and uncertainty analysis in a Monte Carlo framework. The Monte Carlo framework is hierarchically structured in two layers representing two different sources of uncertainty, aleatory uncertainty (due to natural and anthropogenic variability) and epistemic uncertainty (due to incomplete knowledge of the system). The model allows us to calculate probabilities of occurrence for events of different magnitudes along with the expected economic damage in a target area in the first layer of the Monte Carlo framework, i.e. to assess the economic risks, and to derive uncertainty bounds associated with these risks in the second layer. It is also possible to identify the contributions of individual sources of uncertainty to the overall uncertainty. It could be shown that the uncertainty caused by epistemic sources significantly alters the results obtained with aleatory uncertainty alone. The model was applied to reaches of the river Rhine downstream of Cologne.

379 citations
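The two-layer Monte Carlo structure described above separates the two sources of uncertainty by nesting sampling loops: an outer loop over epistemic (knowledge) uncertainty, an inner loop over aleatory (natural) variability. Below is a minimal Python sketch of that idea; the damage relation, parameter names, and distributions are invented stand-ins, not the DFNK model components.

```python
# Nested Monte Carlo: outer loop = epistemic uncertainty,
# inner loop = aleatory variability. All numbers are toy values.
import numpy as np

rng = np.random.default_rng(0)

def damage(peak_discharge, vuln_coeff):
    """Toy damage model: quadratic growth above a protection threshold."""
    excess = np.maximum(peak_discharge - 100.0, 0.0)
    return vuln_coeff * excess**2

n_epistemic, n_aleatory = 200, 2000
expected_damage = np.empty(n_epistemic)

for i in range(n_epistemic):
    # Epistemic layer: imperfect knowledge of the vulnerability
    # coefficient and of the flood-peak distribution's scale.
    vuln = rng.uniform(0.5, 1.5)
    scale = rng.uniform(15.0, 25.0)
    # Aleatory layer: year-to-year variability of the annual flood peak.
    peaks = rng.gumbel(loc=100.0, scale=scale, size=n_aleatory)
    expected_damage[i] = damage(peaks, vuln).mean()

print("expected annual damage: median %.1f, 90%% band [%.1f, %.1f]"
      % (np.median(expected_damage),
         np.percentile(expected_damage, 5),
         np.percentile(expected_damage, 95)))
```

The spread across the outer loop is the epistemic uncertainty band around the aleatory expected annual damage, which is how the paper's second layer yields uncertainty bounds on the risk.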


Journal ArticleDOI
TL;DR: In this paper, a spectral formalism has been developed for the non-intrusive analysis of parametric uncertainty in reacting-flow systems, which quantifies the extent, dependence and propagation of uncertainty through the model system and allows the correlation of uncertainties in specific parameters to the resulting uncertainty in detailed flame structure.

334 citations


01 Jan 2003
TL;DR: A survey of uncertainty definitions and classifications from various fields can be found in this paper, where a classification of uncertainty for the design and development of complex systems is presented, which delineates ambiguity, epistemic, aleatory, and interaction uncertainty.
Abstract: Uncertainty plays a critical role in the analysis for a wide and diverse set of fields from economics to engineering. The term ‘uncertainty’ has come to encompass a multiplicity of concepts. This paper begins with a literature survey of uncertainty definitions and classifications from various fields. A classification of uncertainty for the design and development of complex systems follows. The purposes of the various classifications are more practical than theoretical: to distinguish the techniques used to address each type of uncertainty and to demonstrate the effects of each type of uncertainty in each field. The classification for the design and development of complex systems delineates ambiguity, epistemic, aleatory, and interaction uncertainty. Epistemic uncertainty is further subdivided into model-form, phenomenological, and behavioral uncertainty, each of which is described in detail. The uncertainty taxonomy presented is an integral part of ongoing research into propagating and mitigating the effect of all types of uncertainty in the design and development of complex multidisciplinary engineering systems.

99 citations


Journal ArticleDOI
TL;DR: Evidence theory is proposed to handle the epistemic uncertainty that stems from lack of knowledge about a structural system and an intermediate complexity wing example is used to evaluate the relevance of evidence theory to an uncertainty quantification problem for the preliminary design of airframe structures.
Abstract: Over the past decade, classical probabilistic analysis has been a popular approach among the uncertainty quantification methods. As the complexity and performance requirements of a structural system are increased, the quantification of uncertainty becomes more complicated, and various forms of uncertainties should be taken into consideration. Because of the need to characterize the distribution of probability, classical probability theory may not be suitable for a large complex system such as an aircraft, in that our information is never complete because of lack of knowledge and statistical data. Evidence theory, also known as Dempster-Shafer theory, is proposed to handle the epistemic uncertainty that stems from lack of knowledge about a structural system. Evidence theory provides us with a useful tool for aleatory (random) and epistemic (subjective) uncertainties. An intermediate complexity wing example is used to evaluate the relevance of evidence theory to an uncertainty quantification problem for the preliminary design of airframe structures. Also, methods for efficient calculations in large-scale problems are discussed.

92 citations
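As a concrete illustration of the belief and plausibility measures named in the abstract, here is a minimal Dempster-Shafer computation in Python; the focal elements and masses are invented for illustration and are not taken from the wing example.

```python
# Belief and plausibility of a failure event from a basic probability
# assignment (BPA) over intervals of an uncertain stress variable.
# Intervals and masses are illustrative only.

focal_elements = [            # (interval, BPA mass)
    ((80.0, 120.0), 0.5),
    ((100.0, 150.0), 0.3),
    ((90.0, 200.0), 0.2),
]

def bel_pl(event_lo, event_hi, elements):
    """Bel sums masses of focal elements contained in the event;
    Pl sums masses of those that merely intersect it."""
    bel = sum(m for (lo, hi), m in elements
              if event_lo <= lo and hi <= event_hi)
    pl = sum(m for (lo, hi), m in elements
             if hi >= event_lo and lo <= event_hi)
    return bel, pl

# Event: stress exceeds an allowable of 140, i.e. lies in [140, inf).
bel, pl = bel_pl(140.0, float("inf"), focal_elements)
print(f"failure: belief={bel:.2f}, plausibility={pl:.2f}")
```

The gap between belief (here 0.0) and plausibility (here 0.5) is exactly the imprecision that evidence theory preserves and that a single probability value would hide.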


Journal ArticleDOI
TL;DR: In this paper, a nonparametric model of random uncertainties is proposed for dynamic substructuring, which does not require identifying uncertain parameters in the reduced matrix model of each substructure as is usually done for the parametric approach.
Abstract: This paper presents a new approach, called a nonparametric approach, for constructing a model of random uncertainties in dynamic substructuring in order to predict the matrix-valued frequency response functions of complex structures. Such an approach allows nonhomogeneous uncertainties to be modeled. The Craig-Bampton dynamic substructuring method is used. For each substructure, a nonparametric model of random uncertainties is introduced. This nonparametric model does not require identifying uncertain parameters in the reduced matrix model of each substructure as is usually done for the parametric approach. This nonparametric model of random uncertainties is based on the use of a probability model for symmetric positive-definite real random matrices using the entropy optimization principle. The theory and a numerical example are presented in the context of the finite-element method. The numerical results obtained show the efficiency of the model proposed.

52 citations
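The key construction is sampling a reduced matrix as a whole, subject only to symmetry, positive-definiteness, and a prescribed mean, rather than perturbing individual parameters. The sketch below uses a normalized Wishart ensemble as a simple stand-in for the maximum-entropy ensemble of the paper; the actual construction in the paper parameterizes dispersion differently, so treat this only as an illustration of the idea.

```python
# Randomize a reduced stiffness matrix as a whole, preserving symmetry,
# positive-definiteness, and a given mean. Wishart stand-in, toy matrix.
import numpy as np

rng = np.random.default_rng(1)

K_mean = np.array([[4.0, -1.0, 0.0],
                   [-1.0, 4.0, -1.0],
                   [0.0, -1.0, 4.0]])   # toy reduced stiffness matrix
L = np.linalg.cholesky(K_mean)

def random_spd_germ(n_dof, n_disp, rng):
    """Random SPD matrix with mean identity; larger n_disp gives a
    tighter ensemble (less statistical scatter)."""
    A = rng.normal(0.0, np.sqrt(1.0 / n_disp), size=(n_dof, n_disp))
    return A @ A.T

# Each realization K = L G L^T is symmetric positive-definite with
# ensemble mean K_mean, whatever the scatter level.
samples = [L @ random_spd_germ(3, 50, rng) @ L.T for _ in range(1000)]
print("ensemble mean:\n", np.mean(samples, axis=0).round(2))
print("min eigenvalue over ensemble: %.3f"
      % min(np.linalg.eigvalsh(K).min() for K in samples))
```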


Journal ArticleDOI
TL;DR: The use of surrogate models to improve the efficiency of prediction is presented in this article and the fuzzy-arithmetic-based method is suitable to estimate the possibility of failure.
Abstract: Based on the nature and extent of uncertainty existing in an engineering system, different approaches can be used for uncertainty propagation. If the uncertainty of the system is due to imprecise information and lack of statistical data, possibilistic theory can be used. During preliminary design, uncertainties need to be accounted for and, due to lack of sufficient information, assigning a probability distribution may not be possible. Moreover, the flight conditions (loads, control surface settings, etc.) during a mission could take values within certain bounds, which do not follow any pattern. The uncertain information in these cases is available as intervals with lower and upper limits. In this case, the fuzzy-arithmetic-based method is suitable to estimate the possibility of failure. The use of surrogate models to improve the efficiency of prediction is presented in this article. Various numerical examples are presented to demonstrate the applicability of the method to practical problems.

37 citations
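A minimal sketch of the possibilistic propagation the abstract describes, using triangular fuzzy numbers and alpha-cut interval arithmetic; the load/area relation and every number here are invented for illustration.

```python
# Fuzzy (possibilistic) propagation by alpha-cuts: each uncertain input
# is a triangular fuzzy number; at each alpha level the response range
# follows from the corresponding input intervals.
from itertools import product

def tri_cut(lo, peak, hi, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

def stress(load, area):
    return load / area

for alpha in (0.0, 0.5, 1.0):
    load_iv = tri_cut(90.0, 100.0, 120.0, alpha)   # kN
    area_iv = tri_cut(4.5, 5.0, 5.2, alpha)        # cm^2
    # stress is monotone in both inputs, so vertex enumeration suffices
    vals = [stress(p, a) for p, a in product(load_iv, area_iv)]
    print(f"alpha={alpha:.1f}: stress in "
          f"[{min(vals):.1f}, {max(vals):.1f}] kN/cm^2")
```

Comparing the alpha-level response intervals against an allowable stress yields the possibility of failure the abstract refers to.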


01 Jan 2003
TL;DR: It is shown that the generalized polynomial chaos can be orders of magnitude more efficient than Monte Carlo simulations when the dimensionality of random input is low, e.g. for correlated noise.
Abstract: In this paper we review some applications of generalized polynomial chaos expansion for uncertainty quantification. The mathematical framework is presented and the convergence of the method is demonstrated for model problems. In particular, we solve the first-order and second-order ordinary differential equations with random parameters, and examine the efficiency of generalized polynomial chaos compared to Monte Carlo simulations. It is shown that the generalized polynomial chaos can be orders of magnitude more efficient than Monte Carlo simulations when the dimensionality of random input is low, e.g. for correlated noise.

26 citations
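A small worked version of the first-order random ODE experiment: an intrusive Galerkin gPC solve of du/dt = -k u with k uniformly distributed, using a Legendre (generalized) chaos basis. The chaos order, coefficients, and time stepping below are illustrative choices, not the paper's.

```python
# Intrusive Galerkin gPC for du/dt = -k(xi) u, k = k0 + k1*xi,
# xi ~ U(-1, 1), with a Legendre chaos basis.
import numpy as np
from numpy.polynomial import legendre as Leg

P = 5                       # chaos order
k0, k1 = 1.0, 0.3

# Gauss-Legendre quadrature; weights normalized for the density 1/2
x, w = Leg.leggauss(2 * P + 2)
w = w / 2.0

def pleg(i, x):
    """Legendre polynomial P_i evaluated at x."""
    return Leg.legval(x, [0.0] * i + [1.0])

norms = np.array([np.sum(w * pleg(i, x) ** 2) for i in range(P + 1)])
# Galerkin coupling matrix: A[i, j] = <k(xi) P_j P_i> / <P_i^2>
A = np.array([[np.sum(w * (k0 + k1 * x) * pleg(j, x) * pleg(i, x)) / norms[i]
               for j in range(P + 1)] for i in range(P + 1)])

# March the coupled mode-strength ODEs du/dt = -A u, u(0) = (1, 0, ..., 0)
dt, T = 1e-4, 1.0
u = np.zeros(P + 1)
u[0] = 1.0
for _ in range(int(T / dt)):
    u = u - dt * (A @ u)    # forward Euler on the mode strengths

# Exact mean of exp(-k T) for k ~ U(k0 - k1, k0 + k1)
exact = (np.exp(-(k0 - k1) * T) - np.exp(-(k0 + k1) * T)) / (2 * k1 * T)
print(f"gPC mean {u[0]:.5f} vs exact {exact:.5f}")  # agree to O(dt)
```

Six deterministic mode equations stand in for the thousands of realizations a Monte Carlo estimate of comparable accuracy would need, which is the efficiency gain the abstract refers to for low-dimensional random inputs.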


Book ChapterDOI
02 Jun 2003
TL;DR: In this paper, generalized polynomial chaos expansion for uncertainty quantification is presented and the convergence of the method is demonstrated for model problems, in particular for first-order and second-order ordinary differential equations with random parameters.
Abstract: In this paper we review some applications of generalized polynomial chaos expansion for uncertainty quantification. The mathematical framework is presented and the convergence of the method is demonstrated for model problems. In particular, we solve the first-order and second-order ordinary differential equations with random parameters, and examine the efficiency of generalized polynomial chaos compared to Monte Carlo simulations. It is shown that the generalized polynomial chaos can be orders of magnitude more efficient than Monte Carlo simulations when the dimensionality of random input is low, e.g. for correlated noise.

26 citations


Proceedings ArticleDOI
06 Jan 2003
TL;DR: An approach for assessing the uncertainties in simulation code outputs in which one focuses on the physics submodels incorporated into the code through a Bayesian analysis of a hierarchy of experiments that explore various aspects of the physics submodels.
Abstract: We present an approach for assessing the uncertainties in simulation code outputs in which one focuses on the physics submodels incorporated into the code. Through a Bayesian analysis of a hierarchy of experiments that explore various aspects of the physics submodels, one can infer the sources of uncertainty and quantify them. As an example of this approach, we describe an effort to describe the plastic-flow characteristics of a high-strength steel by combining data from basic material tests with an analysis of Taylor impact experiments. A thorough analysis of the material-characterization experiments is described, which necessarily includes the systematic uncertainties that arise from sample-to-sample variations in the plastic behaviour of the specimens. The Taylor experiments can only be understood by means of a simulation code. We describe how this analysis can be done and how the results can be combined with the results of analyses of data from simpler material-characterization experiments.

24 citations
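The hierarchy-of-experiments idea rests on standard Bayesian updating of submodel parameters from data. Here is a toy sketch of that step; the power-law hardening model, priors, and synthetic data are invented stand-ins for the material-characterization experiments.

```python
# Toy Bayesian calibration: infer one plastic-flow parameter from
# noisy synthetic "experiments" via random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(2)

def forward(theta, strain):
    """Stand-in hardening model: sigma = theta * strain**0.2."""
    return theta * strain**0.2

strain = np.linspace(0.01, 0.2, 10)
theta_true, noise_sd = 500.0, 5.0
data = forward(theta_true, strain) + rng.normal(0, noise_sd, strain.size)

def log_post(theta):
    if not 100.0 < theta < 1000.0:        # flat prior on (100, 1000)
        return -np.inf
    resid = data - forward(theta, strain)
    return -0.5 * np.sum((resid / noise_sd) ** 2)

theta, lp, samples = 300.0, log_post(300.0), []
for _ in range(20000):
    prop = theta + rng.normal(0, 5.0)     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])           # discard burn-in
print(f"posterior: mean={post.mean():.1f}, sd={post.std():.1f}")
```

The posterior spread is the quantified parameter uncertainty that would then be propagated through the simulation code in the paper's approach.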


Proceedings ArticleDOI
01 Jan 2003
TL;DR: In this article, a range method and a fuzzy sets approach are proposed to achieve designs that are robust to both epistemic uncertainty and random uncertainty, and the proposed models are combined with existing models for stochastic uncertainty in a two-step process.
Abstract: There are two sorts of uncertainty inherent in engineering design, random uncertainty and epistemic uncertainty. Random, or stochastic, uncertainty deals with the randomness or predictability of an event. It is well understood, easily modelled using classical probability, and ideal for such uncertainties as variations in manufacturing processes or material properties. Epistemic uncertainty deals with our lack of knowledge, our lack of information, and our own and others’ subjectivity concerning design parameters. While there are many methods to incorporate random uncertainty in a design process, there are fewer that consider epistemic uncertainty. There are fewer still that attempt to incorporate both sorts of uncertainty, and those that do usually attempt to model both sorts using the same uncertainty model. Two methods, a range method and a fuzzy sets approach, are proposed to achieve designs that are robust to both epistemic uncertainty and random uncertainty. Both methods incorporate preference aggregation methods to achieve more appropriate trade-offs between performance and variability when considering both sorts of uncertainty. The proposed models for epistemic uncertainty are combined with existing models for stochastic uncertainty in a two-step process. An illustrative example incorporating subjectivity concerning design parameters is presented.

22 citations


Proceedings ArticleDOI
07 Apr 2003
TL;DR: This research focuses on how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters.
Abstract: In the last few decades, the exponential growth in computational performance capability has led to the development of large-scale simulation tools for design. Systems designed using such simulation tools can fail in service if the uncertainty of the simulation tool’s performance predictions is not accounted for. This research focuses on how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. After the uncertainty has been quantified mathematically, the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such nonsmooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research, surrogate models are used to represent the uncertain measures as continuous functions. A formal trust region managed sequential approximate optimization approach is used to drive the optimization process. The trust region is managed by a trust region ratio based on the performance of the Lagrangian. The Lagrangian is a penalty function of the objective and the constraints. The methodology is illustrated in application to multidisciplinary test problems.

01 Jul 2003
TL;DR: In this paper, the authors reviewed the developments in the direction of best estimate approaches with uncertainty quantification and discussed the problems in practical applications of BEPU methods, and indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach.
Abstract: In 1988 the United States Nuclear Regulatory Commission approved the revised rule on the acceptance of emergency core cooling system (ECCS) performance. Since then, there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. Several new best estimate plus uncertainty (BEPU) methods were developed worldwide. The purpose of the paper is to review the developments in the direction of best estimate approaches with uncertainty quantification and to discuss the problems in practical applications of BEPU methods. In general, the licensee methods follow the original methods. The study indicated that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted and mature approach.
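The order-statistics approach referred to here is usually traced to Wilks' formula: with N random code runs, the largest output bounds the p-quantile of the output with confidence 1 - p^N. A short calculation recovers the run counts familiar from BEPU practice (the 95/95 value of 59 runs in particular).

```python
# Smallest N such that the maximum of N runs is a one-sided tolerance
# limit covering the p-quantile with confidence c: 1 - p**N >= c.
import math

def wilks_one_sided(p=0.95, c=0.95):
    return math.ceil(math.log(1.0 - c) / math.log(p))

print(wilks_one_sided())             # 59: the classic 95/95 run count
print(wilks_one_sided(0.95, 0.99))   # 90 runs for a 95/99 limit
```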

Proceedings ArticleDOI
06 Jan 2003
TL;DR: The critical nature of quantified Verification and Validation with Uncertainty Quantification at specified Confidence levels in evaluating system certification status is described, along with how this can lead to a Value Engineering methodology for investment strategy.
Abstract: This paper presents a summary of our methodology for Verification and Validation and Uncertainty Quantification. A graded scale methodology is presented and related to other concepts in the literature. We describe the critical nature of quantified Verification and Validation with Uncertainty Quantification at specified Confidence levels in evaluating system certification status. Only after Verification and Validation has contributed to Uncertainty Quantification at specified confidence can rational tradeoffs of various scenarios be made. Verification and Validation methods for various scenarios and issues are applied in assessments of Quantified Reliability at Confidence, and we summarize briefly how this can lead to a Value Engineering methodology for investment strategy.


01 May 2003
TL;DR: A method where random variables comprising equivalence classes constrained by the available information are approximated using polynomial chaos expansions (PCEs) is proposed, which should be applicable to a broad range of engineering problems that are characterized by both irreducible and reducible uncertainty.
Abstract: Many safety assessments depend upon models that rely on probabilistic characterizations about which there is incomplete knowledge. For example, a system model may depend upon the time to failure of a piece of equipment for which no failures have actually been observed. The analysts in this case are faced with the task of developing a failure model for the equipment in question, having very limited knowledge about either the correct form of the failure distribution or the statistical parameters that characterize the distribution. They may assume that the process conforms to a Weibull or log-normal distribution or that it can be characterized by a particular mean or variance, but those assumptions impart more knowledge to the analysis than is actually available. To address this challenge, we propose a method where random variables comprising equivalence classes constrained by the available information are approximated using polynomial chaos expansions (PCEs). The PCE approximations are based on rigorous mathematical concepts developed from functional analysis and measure theory. The method has been codified in a computational tool, AVOCET, and has been applied successfully to example problems. Results indicate that it should be applicable to a broad range of engineering problems that are characterized by both irreducible and reducible uncertainty.

Book ChapterDOI
01 Jan 2003
TL;DR: The Dempster–Shafer theory, also known as the evidence theory, is proposed to handle limited data situations as an alternative to classical probability theory for the mathematical representation of uncertainty.
Abstract: In this chapter, the Dempster–Shafer theory, also known as the evidence theory, is proposed to handle limited data situations as an alternative to classical probability theory for the mathematical representation of uncertainty. Uncertainty quantification in engineering systems has been performed by the popular framework of probability theory. Many scientific and engineering communities have identified that there are limitations in using only one framework for quantifying the uncertainty experienced in specific practical applications. For example, in modern air vehicle structural design, the available information for some parameters may not be sufficient to use classical probability theory to quantify the uncertainty unless broad assumptions are added to make the given information complete. The representation of uncertainty in a structural system and a cost-effective methodology of uncertainty quantification are presented for an intermediate complexity wing representing a fighter aircraft.

Proceedings ArticleDOI
01 Jan 2003
TL;DR: In this paper, a nonparametric probabilistic model of random uncertainties adapted to the problem of blade mistuning is presented, which allows all the uncertainties yielding mistuning (manufacturing tolerances, dispersion of materials) to be taken into account and also includes the uncertainties due to modeling errors.
Abstract: It is known that the forced response of mistuned bladed disks can be strongly amplified in comparison with the forced response of the tuned system. The random character of mistuning thus requires the construction of probabilistic models of random uncertainties. This paper presents a nonparametric probabilistic model of random uncertainties which is adapted to the problem of blade mistuning. This nonparametric approach allows all the uncertainties yielding mistuning (manufacturing tolerances, dispersion of materials) to be taken into account and also includes the uncertainties due to modeling errors. This new probabilistic model takes into account both the mistuning of the blade eigenfrequencies and the blade modal shapes. The first point concerns the construction of this nonparametric approach in order to perform a mistuning analysis. The second part is devoted to the inverse problem associated with the manufacturing tolerances. A relationship between the manufacturing tolerances and the level of mistuning is also constructed.

Book ChapterDOI
01 Jan 2003
TL;DR: It is shown that the polynomial chaos approach efficiently handles uncertainty propagation and establishes confidence intervals for the simulation results.
Abstract: The purpose of this work is to study the propagation of uncertainty in numerical simulations. For this purpose, we employ a methodology that couples the polynomial chaos technique with Karhunen-Loeve decomposition. This method is applied to the problem of a quasi-one-dimensional nozzle flow with uncertainty in inlet conditions and nozzle shape. It is shown that the polynomial chaos approach efficiently handles uncertainty propagation and establishes confidence intervals for the simulation results. Furthermore, such an approach enables computation of statistical moments of arbitrary order in a much more effective way than other usual techniques such as the Monte Carlo method or the perturbation method.
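The Karhunen-Loeve step expands the uncertain input (e.g., the nozzle shape) over the eigenpairs of its covariance. Below is a minimal discrete sketch with an assumed exponential covariance; the grid size, correlation length, and variance are illustrative, not from the paper.

```python
# Discrete Karhunen-Loeve: represent a 1-D Gaussian random field by the
# dominant eigenpairs of its covariance matrix.
import numpy as np

n, corr_len, var = 200, 0.3, 1.0
x = np.linspace(0.0, 1.0, n)
C = var * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(C)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]     # descending order

m = np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.95) + 1
print(f"{m} KL modes capture 95% of the field variance")

# Sample the truncated field: f = sum_k sqrt(lambda_k) * xi_k * phi_k
rng = np.random.default_rng(3)
xi = rng.standard_normal(m)
field = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)
```

The m independent standard normals xi_k become the germ of the polynomial chaos expansion, which is what makes the coupling of the two techniques natural.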

Proceedings ArticleDOI
TL;DR: In this article, a review of recent applications of uncertainty quantification for airframe design and certification is presented, with the objective of highlighting promising research and transition applications, and several challenges and needs are explored to suggest future steps that must be completed to enable practical application of uncertainty quantification in the design and testing of airframes.
Abstract: Widespread interest in uncertainty quantification methods for airframe design and certification is driven by the desire to realize acquisition and operational cost savings through increased reliance on analysis, the goal being to design and produce more robust airframes. General sources of uncertainty are described that complicate airframe design and testing, thereby contributing to the high cost of airframes. Recent applications of uncertainty quantification are reviewed with the objective of highlighting promising research and transition applications. Finally, several challenges and needs are explored to suggest future steps that must be completed to enable practical application of uncertainty quantification in airframe design and certification.

Proceedings ArticleDOI
01 Jan 2003
TL;DR: A simplified notional finite element model of a gas turbine component is presented and it is shown that when the ranges of independent variables are large, kriging generally provides precision that is an order of magnitude better than RSE for the same DoE.
Abstract: Various sources of uncertainty greatly affect the life of structural components of gas turbines. Probabilistic approaches provide a means to evaluate these uncertainties; however, the accuracy of these approaches often remains unknown. Published quantitative studies of the effectiveness of various uncertainty quantification techniques are usually based on very simple examples. This is contrasted by the large-size finite element models that are used for complex geometries of critical structural parts such as turbine blades or nozzles. In such real-life applications the expenses of the “function calls” (runs of these models) preclude systematic studies of probabilistic methods. These expenses are attributed not only to the actual runs of the model, but to the difficulties in parametrically changing the model as well. Such a “complexity gap” leads to a justifiable concern over whether the trends identified in academic studies are relevant to these industrial applications. As a result, structural engineers end up with the number of function calls that they can afford rather than what would be needed for the required level of accuracy. The present effort intends to bridge this gap by studying a mid-level problem: a simplified notional finite element model of a gas turbine component is presented. Despite its simplicity, the model is designed to reflect the major features of more realistic models. The parametric changes of the model are fully automated, which allows for performing an extensive set of benchmark tests that help to determine the relative merits of various existing probabilistic techniques for component life assessment. Several meta-modeling techniques are investigated and their performance compared based on direct sampling methods. In this context, various Design of Experiments (DoE) methods are studied. The results are used to construct the Response Surface Equations (RSE) as well as the kriging models. It is emphasized that changes in the relative locations of the critical points induced by variation of independent parameters can critically affect the overall fidelity of the modeling; the means of remedying such a degradation in precision are discussed. Finally, it is shown that when the ranges of independent variables are large, kriging generally provides precision that is an order of magnitude better than RSE for the same DoE.
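A drastically reduced version of the benchmark idea: fit a quadratic response surface and a simple fixed-hyperparameter kriging interpolant to the same design of experiments and compare their errors. The response function and all settings are invented, and a real study would tune the kriging correlation lengths rather than fix them.

```python
# RSE vs kriging on a toy 1-D "life" function and a 9-point DoE.
import numpy as np

def f(x):
    return np.sin(3.0 * x) + 0.5 * x**2    # stand-in response

x_doe = np.linspace(-2.0, 2.0, 9)          # design of experiments
y_doe = f(x_doe)
x_test = np.linspace(-2.0, 2.0, 200)

# Quadratic response surface equation (RSE)
coeffs = np.polyfit(x_doe, y_doe, 2)
rse_pred = np.polyval(coeffs, x_test)

# Simple kriging with a fixed Gaussian correlation length
def kern(a, b, ell=0.7):
    return np.exp(-((a[:, None] - b[None, :]) / ell) ** 2)

K = kern(x_doe, x_doe) + 1e-10 * np.eye(x_doe.size)  # jitter for stability
weights = np.linalg.solve(K, y_doe)
krig_pred = kern(x_test, x_doe) @ weights

for name, pred in (("RSE", rse_pred), ("kriging", krig_pred)):
    print(f"{name}: max abs error = {np.max(np.abs(pred - f(x_test))):.4f}")
```

Over this wide input range the quadratic surface cannot follow the oscillatory response while the kriging interpolant can, which mirrors the order-of-magnitude precision gap the abstract reports.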

Proceedings ArticleDOI
21 Sep 2003
TL;DR: In this study, inference on the input has to be made from the output; Bayesian analysis is adopted to handle this inverse problem and is then combined with the forward simulation for prediction.
Abstract: Uncertainty quantification is essential in using numerical models for prediction. While many works have focused on how the uncertainty of the inputs propagates to the outputs, the modeling errors of the numerical model have often been overlooked. In our Bayesian framework, modeling errors play an essential role and were assessed through studying numerical solution errors. The main ideas and key concepts will be illustrated through an oil reservoir case study. In this study, inference on the input has to be made from the output. Bayesian analysis is adopted to handle this inverse problem and then combined with the forward simulation for prediction. The solution error models were established based on the scale-up solutions and fine-grid solutions. As the central piece of our framework, the robustness of these error models is fundamental. In addition to the oil reservoir computer codes, we also discuss the modeling of the solution error of shock wave physics. Although the framework itself is simple, there are many statistical challenges, including the optimal dimension of the error model and the trade-off between sample size and solution accuracy. These challenges are also discussed.

ReportDOI
01 Oct 2003
TL;DR: This work develops techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and applies these constructions in computations of reacting flow using both intrusive and non-intrusive spectral PC techniques.
Abstract: Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
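The non-intrusive construction amounts to evaluating the deterministic model at quadrature nodes and projecting the results onto the chaos basis. Here is a minimal sketch with a toy exponential model standing in for the reacting-flow code, using probabilists' Hermite chaos for a single Gaussian input.

```python
# Non-intrusive spectral projection onto probabilists' Hermite chaos.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

P = 4                                     # chaos order
nodes, weights = hermegauss(P + 1)        # weight function exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) density

def model(xi):
    """Stand-in deterministic solver, input xi ~ N(0, 1)."""
    return np.exp(0.3 * xi)

def He(k, x):
    return hermeval(x, [0.0] * k + [1.0])  # Hermite polynomial He_k

# Projection: u_k = E[model * He_k] / E[He_k^2], with E[He_k^2] = k!
u = np.array([np.sum(weights * model(nodes) * He(k, nodes)) / factorial(k)
              for k in range(P + 1)])

mean = u[0]
var = sum(u[k] ** 2 * factorial(k) for k in range(1, P + 1))
exact_mean = np.exp(0.045)                       # lognormal moments
exact_var = np.exp(0.09) * (np.exp(0.09) - 1.0)
print(f"mean {mean:.5f} (exact {exact_mean:.5f}), "
      f"var {var:.5f} (exact {exact_var:.5f})")
```

In the intrusive alternative the same mode strengths would instead be obtained by Galerkin-projecting the governing equations, which is the trade-off the report weighs: code reformulation versus repeated deterministic solves.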

Journal Article
TL;DR: In this paper, a method for inferring the material strength of existing structures based on small samples is proposed in terms of probability and statistics for both stochastic and epistemic uncertainty.
Abstract: In analyses of the reliability of existing structures, the uncertainty of material strength includes both stochastic and epistemic uncertainty. The latter is taken into account in this paper. Following the literature, a method for statistically inferring the material strength of existing structures, i.e., a Bayes method based on small samples, is put forward in terms of probability and statistics. Inasmuch as both the information of the samples and prior knowledge from design are taken into account, which makes the information more adequate, the method can produce a more rational and beneficial result.

11 Aug 2003
TL;DR: In this article, a probabilistic model of random uncertainties for complex dynamical systems in the medium frequency (MF) range is presented, and the random energy matrix relative to a given MF band is studied.
Abstract: This paper presents a novel probabilistic model of random uncertainties for complex dynamical systems in the medium frequency (MF) range. This approach combines a nonparametric probabilistic model of random uncertainties for the reduced matrix models in structural dynamics with a reduced matrix model method in the MF range. The theory is presented, the random energy matrix relative to a given MF band is studied and a simple numerical example is analyzed.

Proceedings ArticleDOI
07 Apr 2003
TL;DR: This research investigates and illustrates formal procedures for assessing the uncertainty in the probability that a safety system will fail to operate as intended in an accident environment, and suggests that evidence theory provides a potentially valuable representational tool for the display of the implications of significant epistemic uncertainty in inputs to complex analyses.
Abstract: Safety systems are important components of high consequence systems that are intended to prevent the unintended operation of the system and thus the potentially significant negative consequences that could result from such operation. This presentation investigates and illustrates formal procedures for assessing the uncertainty in the probability that a safety system will fail to operate as intended in an accident environment. Probability theory and evidence theory are introduced as possible mathematical structures for the representation of the epistemic uncertainty associated with the performance of safety systems, and a representation of this type is illustrated with a hypothetical safety system involving one weak link and one strong link that is exposed to a high temperature fire environment. Topics considered include: (i) the nature of diffuse uncertainty information involving a system and its environment, (ii) the conversion of diffuse uncertainty information into the mathematical structures associated with probability theory and evidence theory, and (iii) the propagation of these uncertainty structures through a model for a safety system to obtain representations in the context of probability theory and evidence theory of the uncertainty in the probability that the safety system will fail to operate as intended.

Proceedings ArticleDOI
01 Jan 2003
TL;DR: In this article, uncertainty quantification is applied to the modeling of fluid flow in an electrokinetically driven microchannel, allowing for detailed buffer electrochemistry and finite rate analyte reactions.
Abstract: Uncertainty quantification (UQ) in models of physical systems is a necessary tool for both model validation and engineering design optimization. We have applied UQ tools using stochastic spectral polynomial chaos techniques to the modeling of fluid flow in an electrokinetically driven microchannel, allowing for detailed buffer electrochemistry and finite rate analyte reactions. The model includes full coupling of wall electric double layer potential with variations in pH and local electric field. Allowing for uncertainties in species mobilities, buffer equilibrium constants, and wall properties, we have computed the resulting uncertainty in predicted model outputs, illustrating the impact of growth of uncertainty on confidence in model predictions. We present details of the computational UQ techniques with specific focus on their application in the electrochemical microfluidic context. We also present UQ results pertaining to model protein labeling in an electrokinetically pumped microchannel flow.

Proceedings ArticleDOI
21 Sep 2003
TL;DR: In this article, the authors presented the first attempt of applying the extended Kalman filter (EKF) method of data assimilation to shock-wave dynamics induced by a high-speed impact.
Abstract: Model assimilation of data strives to determine optimally the state of an evolving physical system from a limited number of observations. The present study represents the first attempt to apply the extended Kalman filter (EKF) method of data assimilation to shock-wave dynamics induced by a high-speed impact. EKF solves the full nonlinear state evolution and estimates its associated error-covariance matrix in time. The state variables obtained by the blending of past model evolution with currently available data, along with their associated minimized errors (or uncertainties), are then used as initial conditions for further prediction until the next time at which data become available. In this study, a one-dimensional (1-D) finite-difference code is used along with data measured from a 1-D flyer plate experiment. The results demonstrate that the EKF assimilation of a limited amount of pressure data, measured at the middle of the target plate alone, helps track the evolution of all the state variables with reduced errors.
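For readers unfamiliar with the EKF mechanics, the sketch below runs the predict/update cycle on an invented one-dimensional nonlinear decay problem with sparse measurements; it illustrates the filter structure only, not the paper's flyer-plate setup.

```python
# Scalar extended Kalman filter: nonlinear prediction with linearized
# covariance propagation, plus measurement updates every 10th step.
import numpy as np

rng = np.random.default_rng(5)

dt = 0.05
def f(x):   return x - 0.1 * x**2 * dt      # nonlinear decay dynamics
def F(x):   return 1.0 - 2 * 0.1 * x * dt   # Jacobian df/dx
def h(x):   return x                        # we measure the state directly
Q, R = 1e-4, 0.04                           # process / measurement variance

x_true, x_est, P = 4.0, 3.0, 1.0            # truth, initial guess, covariance
for k in range(100):
    x_true = f(x_true) + rng.normal(0, np.sqrt(Q))
    # EKF predict: propagate estimate and linearized covariance
    x_est, P = f(x_est), F(x_est)**2 * P + Q
    if k % 10 == 0:                          # data available only sparsely
        z = x_true + rng.normal(0, np.sqrt(R))
        K = P / (P + R)                      # Kalman gain (H = 1 here)
        x_est, P = x_est + K * (z - h(x_est)), (1 - K) * P

print(f"final truth {x_true:.3f}, estimate {x_est:.3f}, sd {np.sqrt(P):.3f}")
```

Between measurements the covariance P grows with the process noise; each update shrinks it, which is the "minimized errors used as initial conditions for further prediction" cycle the abstract describes.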

01 Mar 2003
TL;DR: The use of surrogate models to improve the efficiency of prediction is presented in this paper and various numerical examples are presented to demonstrate the applicability of the method to practical problems.
Abstract: Multiple configurations in various stages of aircraft design have to be experimentally tested and validated to study the performance of various systems subjected to non-deterministic design parameters. These tests are expensive and time consuming, increasing the acquisition cost and time for military aircraft/equipment. Therefore, analytical certification aims at reducing/eliminating the expensive prototype testing during these intermediate design stages by propagating the input variance through the design. Analytical certification involves modeling the variance/uncertainties in the design parameters and estimating the variance in the component/system performance. Based on the nature and extent of uncertainty existing in an engineering system, different approaches can be used for uncertainty propagation. If the uncertainty of the system is due to imprecise information and lack of statistical data, possibilistic theory can be used. During preliminary design, uncertainties need to be accounted for and, due to lack of sufficient information, assigning a probability distribution may not be possible. Moreover, the flight conditions (loads, control surface settings, etc.) during a mission could take values within certain bounds, which do not follow any particular pattern. The uncertain information in these cases is available as intervals with lower and upper limits. In this case, the fuzzy-arithmetic-based method is suitable to estimate the possibility of failure. The use of surrogate models to improve the efficiency of prediction is presented in this paper. Various numerical examples are presented to demonstrate the applicability of the method to practical problems.

Book ChapterDOI
01 Jan 2003
TL;DR: In this article, a technique of uncertainty quantification through spectral projection methods is used to examine the propagation of parametric uncertainty through 0D combustion chemistry and 1D premixed flame, with a focus on higher-order information and the correlations among parameters.
Abstract: This chapter illustrates the nonintrusive analysis of parametric uncertainty. Quantification of spectral modes adds a new level of understanding, identifying the significance of individual uncertain parameters and determining where within the evolution of the system each parameter has the most influence. In this study, a technique of uncertainty quantification through spectral projection methods is used to examine the propagation of parametric uncertainty through 0D combustion chemistry and 1D premixed flame, with a focus on higher-order information and the correlations among parameters. Compared to conventional Monte Carlo analysis, this method quantifies the extent, dependence, and propagation of uncertainty through the model system and allows correlation of uncertainties in specific parameters to the resulting total uncertainty in product concentrations and flame structure. Analysis of 0D and 1D hydrogen-oxygen combustion demonstrates that known empirical uncertainties in model parameters may result in large uncertainties in the final output, and nonlinearities in system response behavior make the order of the analysis an important consideration. Furthermore, this analysis is readily extendable to multiple dimensions and greater numbers of uncertain parameters, while preserving the integrity of the realization engine through the nonintrusive operation of the method.