Journal ArticleDOI

Probability, Reliability and Statistical Methods in Engineering Design

01 Jan 2001 - Journal of Structural Engineering-ASCE (American Society of Civil Engineers) - Vol. 127, Iss. 1, pp. 101-101
About: This article was published in the Journal of Structural Engineering-ASCE on 2001-01-01 and has received 451 citations to date. It focuses on the topics of probabilistic design and reliability (statistics).
Citations
Journal ArticleDOI
TL;DR: Practical guidelines for verification and validation of NMS models and simulations are established that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies.
Abstract: Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.

479 citations


Cites background from "Probability, Reliability and Statis..."

  • ...Several texts and review papers provide overviews of approaches to sensitivity analysis for engineering applications [25,26], including biomechanics [19]....


Journal ArticleDOI
TL;DR: An iterative strategy for building designs of experiments is proposed, based on an explicit trade-off between reduction of global uncertainty and exploration of the regions of interest; it achieves a substantial reduction of error in the crucial regions.
Abstract: This paper addresses the issue of designing experiments for a metamodel that needs to be accurate for a certain level of the response value. Such a situation is encountered in particular in constrained optimization and reliability analysis. Here, we propose an iterative strategy to build designs of experiments, which is based on an explicit trade-off between reduction of global uncertainty and exploration of the regions of interest. The method is illustrated on several test problems. It is shown that a substantial reduction of error can be achieved in the crucial regions, with reasonable loss of global accuracy. The method is finally applied to a reliability analysis problem; it is found that the adaptive designs significantly outperform classical space-filling designs.
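
The adaptive strategy described above can be illustrated with a small sketch. The snippet below is illustrative only and is not the authors' algorithm: the tiny Gaussian-process surrogate, the targeted-uncertainty criterion, the toy response f, and the target level T are all assumptions made for the example. It enriches a one-dimensional design by repeatedly adding the candidate point whose predictive uncertainty, weighted by closeness of the prediction to the target level, is largest.

```python
# Minimal sketch (illustrative assumptions only; not the authors' algorithm) of
# an adaptive design of experiments for a surrogate that must be accurate near
# a target response level T. A small Gaussian-process surrogate is refit after
# each new point; the next point maximizes the predictive standard deviation
# weighted by closeness of the prediction to T.
import numpy as np

def gp_fit(X, y, length=0.3, noise=1e-6):
    """Fit a zero-mean GP with a unit-variance RBF kernel; return a predictor."""
    K = np.exp(-0.5 * ((X[:, None] - X[None, :]) / length) ** 2)
    L = np.linalg.cholesky(K + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

    def predict(Xs):
        Ks = np.exp(-0.5 * ((Xs[:, None] - X[None, :]) / length) ** 2)
        mean = Ks @ alpha
        v = np.linalg.solve(L, Ks.T)
        var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
        return mean, np.sqrt(var)

    return predict

def targeted_uncertainty(mean, std, T):
    """Predictive uncertainty, down-weighted far from the target level T."""
    return std * np.exp(-0.5 * ((mean - T) / (std + 1e-9)) ** 2)

f = lambda x: np.sin(6.0 * x) + 0.5 * x       # stand-in for an expensive response
T = 0.8                                       # response level of interest
X = np.array([0.1, 0.5, 0.9])                 # initial design
y = f(X)
grid = np.linspace(0.0, 1.0, 201)             # candidate points

for _ in range(10):                           # adaptive enrichment loop
    mean, std = gp_fit(X, y)(grid)
    x_new = grid[np.argmax(targeted_uncertainty(mean, std, T))]
    X, y = np.append(X, x_new), np.append(y, f(x_new))

print("points added near the target level:", np.round(np.sort(X[3:]), 3))
```

In the reliability application mentioned in the abstract, the target level would correspond to the limit state, so the same trade-off concentrates new runs where failure is decided while still keeping some global accuracy.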

294 citations


Cites methods from "Probability, Reliability and Statis..."

  • ...There are many methods for calculating the failure probability of a system [22-24]....

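As a concrete point of reference for the excerpt above, the simplest of these methods is crude Monte Carlo sampling of the limit state. The sketch below is illustrative only: the limit-state function g, the input distributions, and the sample size are assumptions made for the example, not taken from the cited book.

```python
# Crude Monte Carlo estimate of a failure probability P_f = P[g(X) <= 0].
# The limit-state function and input distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def g(x1, x2):
    """Illustrative linear limit state: failure when g <= 0."""
    return 3.0 * x1 - 2.0 * x2

n = 1_000_000
x1 = rng.normal(loc=2.0, scale=0.3, size=n)   # assumed distribution of X1
x2 = rng.normal(loc=1.0, scale=0.4, size=n)   # assumed distribution of X2

failed = g(x1, x2) <= 0.0
pf = failed.mean()
se = np.sqrt(pf * (1.0 - pf) / n)             # standard error of the estimator

print(f"P_f ~ {pf:.3e} (+/- {1.96 * se:.1e} at ~95% confidence)")
```

Approximation methods such as FORM/SORM, and variance-reduction schemes such as importance sampling, exist largely because this direct estimator needs a very large sample size when the failure probability is small.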

Journal ArticleDOI
TL;DR: This article presents a methodology to generate explicit decision functions using support vector machines (SVM) and proposes an adaptive sampling scheme that updates the decision function.

203 citations


Cites methods from "Probability, Reliability and Statis..."

  • ...When considering reliability, discontinuities might also hamper the use of approximation methods such as first and second order reliability methods (FORM and SORM) [9], advanced mean value (AMV) [10], or Monte-Carlo simulations with response surfaces....


Journal ArticleDOI
TL;DR: The methodology includes uncertainty in the experimental measurement, and the posterior and prior distributions of the model output are used to compute a validation metric based on Bayesian hypothesis testing.

197 citations


Cites background from "Probability, Reliability and Statis..."

  • ...finite element models) or response surface approximations, for a large variety of engineering problems [16]....


Journal ArticleDOI
TL;DR: In this article, an adaptive probability analysis method is proposed to generate the probability distribution of the output performance function by identifying the propagation of input uncertainty to output uncertainty; the method is based on an enhanced hybrid mean value (HMV+) analysis in the performance measure approach.
Abstract: This paper proposes an adaptive probability analysis method that can effectively generate the probability distribution of the output performance function by identifying the propagation of input uncertainty to output uncertainty. The method is based on an enhanced hybrid mean value (HMV+) analysis in the performance measure approach (PMA) for numerical stability and efficiency in search of the most probable point (MPP). The HMV+ method improves numerical stability and efficiency especially for highly nonlinear output performance functions by providing steady convergent behavior in the MPP search. The proposed adaptive probability analysis method approximates the MPP locus, and then adaptively refines this locus using an a posteriori error estimator. Using the fact that probability levels can be easily set a priori in PMA, the MPP locus is approximated using the interpolated moving least-squares method. For refinement of the approximated MPP locus, additional probability levels are adaptively determined through an a posteriori error estimator. The adaptive probability analysis method will determine the minimum number of necessary probability levels, while ensuring accuracy of the approximated MPP locus. Several examples are used to show the effectiveness of the proposed adaptive probability analysis method using the enhanced HMV+ method.
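
To make the performance measure approach concrete, the sketch below shows the basic advanced mean value (AMV) iteration that methods like HMV+ build on: for a prescribed reliability index beta, the most probable point (MPP) is sought on the sphere ||u|| = beta in standard normal space. This is an illustration under one common sign convention, not the paper's HMV+ algorithm; the toy performance function and all names are assumptions.

```python
# Minimal sketch of the basic AMV iteration used in the performance measure
# approach: for a prescribed reliability index beta, search the MPP on the
# sphere ||u|| = beta by repeatedly rescaling the steepest-descent direction
# of G. Illustrative sign convention and toy performance function only.
import numpy as np

def amv_mpp_search(G, grad_G, beta, n_dim, tol=1e-8, max_iter=100):
    """Return (u_mpp, G(u_mpp)) for the prescribed reliability index beta."""
    u = np.zeros(n_dim)                          # start at the mean point
    for _ in range(max_iter):
        dG = grad_G(u)
        u_new = -beta * dG / np.linalg.norm(dG)  # step onto the beta-sphere
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return u, G(u)

# Toy performance function already expressed in standard normal space.
G = lambda u: 3.0 - u[0] ** 2 / 4.0 - u[1]
grad_G = lambda u: np.array([-u[0] / 2.0, -1.0])

u_star, g_star = amv_mpp_search(G, grad_G, beta=2.0, n_dim=2)
print("MPP:", np.round(u_star, 4), " performance measure G(u*):", round(g_star, 4))
```

The HMV+ method described in the abstract stabilizes this kind of MPP search for highly nonlinear performance functions, and by repeating it at several prescribed probability levels it builds the MPP locus that is then interpolated and adaptively refined.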

193 citations


Cites background or methods from "Probability, Reliability and Statis..."

  • ...Note that a relatively larger error of the adaptive probability analysis occurs near 10% and 90%, which is mainly due to the error of FORM....


  • ...Note that a relatively larger amount of error of the adaptive probability analysis occurs between 50% and 95%, which is again mainly due to the error of FORM....


  • ...The probability distribution function is obtained by identifying the propagation of input uncertainty to output uncertainty as (Madsen et al. 1986; Haldar and Mahadevan 2000) $F_G(g) = \int \cdots \int_{G(\mathbf{x}) \le g} f_{\mathbf{X}}(\mathbf{x}) \, dx_1 \ldots dx_n$ (1), where the uncertainty $F_{\mathbf{X}}(\mathbf{x})$ (or $f_{\mathbf{X}}(\mathbf{x})$) of input $\mathbf{X}$ is propagated to the…...


  • ...Using FORM, (2) or (3) can be solved by formulating an optimization with one equality constraint (Tu and Choi 1999; Youn et al. 2001, 2003; Lee and Kwak 1987–88) to obtain $g_i = g(\mathbf{u}^*_i)$: if $\beta_i > 0$, maximize $G(\mathbf{U})$ subject to $\|\mathbf{U}\| = \beta_i$, for $i = 1, \ldots, N_{PL}$; if $\beta_i < 0$, minimize $G(\mathbf{U})$ subject to $\|\mathbf{U}\| = \beta_i$, for $i = 1, \ldots, N_{PL}$ (5)....


  • ...This is referred to as the first-order reliability method (FORM) (Hasofer and Lind 1974; Madsen et al. 1986; Haldar and Mahadevan 2000), as shown in Fig....

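The excerpt above refers to the first-order reliability method. As a rough illustration, the sketch below implements the classical Hasofer-Lind / Rackwitz-Fiessler iteration that underlies FORM, computing the reliability index beta and the corresponding failure-probability estimate Phi(-beta). The limit-state function, starting point, and tolerances are assumptions made for the example.

```python
# Rough sketch of the Hasofer-Lind / Rackwitz-Fiessler iteration behind FORM:
# find the most probable point u* on the limit state G(u) = 0 in standard
# normal space, then take beta = ||u*|| and P_f ~ Phi(-beta).
# The limit-state function and settings are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def form_hlrf(G, grad_G, n_dim, tol=1e-8, max_iter=100):
    u = np.zeros(n_dim)                        # start at the mean point (origin)
    for _ in range(max_iter):
        g, dg = G(u), grad_G(u)
        u_new = (dg @ u - g) / (dg @ dg) * dg  # HL-RF update toward G(u) = 0
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta), u

# Toy limit-state function already expressed in standard normal space.
G = lambda u: 3.0 + 0.2 * u[0] ** 2 - u[1]
grad_G = lambda u: np.array([0.4 * u[0], -1.0])

beta, pf, u_star = form_hlrf(G, grad_G, n_dim=2)
print(f"beta = {beta:.3f}, P_f ~ {pf:.3e}, MPP = {np.round(u_star, 3)}")
```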
