Author

P.E. James T. P. Yao

Bio: P.E. James T. P. Yao is an academic researcher. The author has contributed to research on the topics of Probabilistic design and Applied probability, has an h-index of 1, and has co-authored 1 publication, which has received 417 citations.

Papers

Cited by
Journal ArticleDOI
TL;DR: Practical guidelines for verification and validation of NMS models and simulations are established that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies.
Abstract: Computational modeling and simulation of neuromusculoskeletal (NMS) systems enables researchers and clinicians to study the complex dynamics underlying human and animal movement. NMS models use equations derived from physical laws and biology to help solve challenging real-world problems, from designing prosthetics that maximize running speed to developing exoskeletal devices that enable walking after a stroke. NMS modeling and simulation has proliferated in the biomechanics research community over the past 25 years, but the lack of verification and validation standards remains a major barrier to wider adoption and impact. The goal of this paper is to establish practical guidelines for verification and validation of NMS models and simulations that researchers, clinicians, reviewers, and others can adopt to evaluate the accuracy and credibility of modeling studies. In particular, we review a general process for verification and validation applied to NMS models and simulations, including careful formulation of a research question and methods, traditional verification and validation steps, and documentation and sharing of results for use and testing by other researchers. Modeling the NMS system and simulating its motion involves methods to represent neural control, musculoskeletal geometry, muscle-tendon dynamics, contact forces, and multibody dynamics. For each of these components, we review modeling choices and software verification guidelines; discuss variability, errors, uncertainty, and sensitivity relationships; and provide recommendations for verification and validation by comparing experimental data and testing robustness. We present a series of case studies to illustrate key principles. In closing, we discuss challenges the community must overcome to ensure that modeling and simulation are successfully used to solve the broad spectrum of problems that limit human mobility.

479 citations

Journal ArticleDOI
TL;DR: An iterative strategy for building designs of experiments is proposed, based on an explicit trade-off between reducing global uncertainty and exploring the regions of interest; a substantial reduction of error is achieved in the crucial regions.
Abstract: This paper addresses the issue of designing experiments for a metamodel that needs to be accurate for a certain level of the response value. Such a situation is encountered in particular in constrained optimization and reliability analysis. Here, we propose an iterative strategy to build designs of experiments, which is based on an explicit trade-off between reduction of global uncertainty and exploration of the regions of interest. The method is illustrated on several test problems. It is shown that a substantial reduction of error can be achieved in the crucial regions, with reasonable loss on the global accuracy. The method is finally applied to a reliability analysis problem; it is found that the adaptive designs significantly outperform classical space-filling designs.

294 citations
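The trade-off described in this abstract can be sketched with a Gaussian-process surrogate: score each candidate point by its predictive standard deviation (global uncertainty) weighted by how close the predicted mean is to the target response level (region of interest), and evaluate the true response only at the best-scoring point. This is an illustrative sketch of the idea, not the authors' exact criterion; the response `f`, target level `T`, and all tuning values are made up for the demo.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    """Hypothetical expensive response whose T-level region matters."""
    return np.sin(3.0 * x) + 0.5 * x

T = 0.5                                        # target response level
X = np.linspace(0.0, 2.0, 5).reshape(-1, 1)    # initial space-filling design
y = f(X).ravel()
candidates = np.linspace(0.0, 2.0, 200).reshape(-1, 1)

for _ in range(10):                            # adaptive enrichment loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  alpha=1e-6).fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    # Trade-off: predictive std (global uncertainty) weighted by closeness
    # of the predicted mean to the target level (region of interest).
    score = sigma * np.exp(-0.5 * ((mu - T) / (sigma + 1e-9)) ** 2)
    x_new = candidates[np.argmax(score)].reshape(1, -1)
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new).ravel()[0])

print(X.shape)  # (15, 1): 5 initial points + 10 adaptively added
```

As the loop progresses, the variance term keeps some exploration alive while the Gaussian weight concentrates new points near the T-level contour.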

Journal ArticleDOI
TL;DR: This article presents a methodology to generate explicit decision functions using support vector machines (SVM) and proposes an adaptive sampling scheme that updates the decision function.

203 citations
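A minimal sketch of an explicit SVM decision function with adaptive sampling, using scikit-learn's `SVC`: fit a classifier to safe/failed samples, then spend new expensive evaluations only where the current boundary lies. The limit-state function and sample counts below are hypothetical, and selecting points with the smallest `|decision_function|` is a common stand-in for the paper's update scheme, not its exact criterion.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def limit_state(x):
    """Hypothetical performance function; failure when g(x) < 0."""
    return x[:, 0] ** 2 + x[:, 1] - 1.5

# Initial design labeled safe (0) / failed (1) by the expensive function.
X = rng.uniform(-2.0, 2.0, size=(40, 2))
labels = (limit_state(X) < 0).astype(int)
svm = SVC(kernel="rbf", C=100.0).fit(X, labels)  # explicit decision function

# Adaptive step: from a cheap candidate pool, evaluate the expensive
# function only where the boundary is (|decision_function| smallest).
pool = rng.uniform(-2.0, 2.0, size=(2000, 2))
closest = np.argsort(np.abs(svm.decision_function(pool)))[:5]
X = np.vstack([X, pool[closest]])
labels = np.append(labels, (limit_state(pool[closest]) < 0).astype(int))
svm = SVC(kernel="rbf", C=100.0).fit(X, labels)  # refit with new points

print(X.shape)  # (45, 2): 40 initial samples + 5 boundary refinements
```

Once trained, the SVM gives a cheap, explicit surrogate for the failure boundary that can be queried millions of times in a Monte Carlo reliability estimate.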

Journal ArticleDOI
TL;DR: The methodology includes uncertainty in the experimental measurement, and the posterior and prior distributions of the model output are used to compute a validation metric based on Bayesian hypothesis testing.

197 citations
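The Bayes-factor flavor of such a validation metric can be illustrated for a scalar model prediction checked against repeated measurements with known noise, where normal-normal conjugacy gives a closed-form marginal likelihood. All numbers below are hypothetical, and this is a generic Bayesian hypothesis test, not the paper's specific metric.

```python
import numpy as np
from scipy import stats

# Hypothetical data: one model prediction vs. repeated measurements.
y_model = 10.0
y_exp = np.array([10.3, 9.8, 10.1, 10.4])
sigma = 0.4   # measurement noise std (assumed known)
tau = 2.0     # prior std on the true mean under the alternative

n, ybar = len(y_exp), y_exp.mean()

def loglik(mu):
    """Log-likelihood of all measurements for a given true mean."""
    return stats.norm.logpdf(y_exp, loc=mu, scale=sigma).sum()

# H0 (model valid): true mean equals the model prediction.
log_p0 = loglik(y_model)

# H1: true mean ~ N(y_model, tau^2), integrated out analytically via
# the sufficient statistic ybar (normal-normal conjugacy).
log_p1 = (stats.norm.logpdf(ybar, y_model, np.sqrt(tau**2 + sigma**2 / n))
          + loglik(ybar)
          - stats.norm.logpdf(ybar, ybar, sigma / np.sqrt(n)))

bayes_factor = np.exp(log_p0 - log_p1)
print(bayes_factor > 1.0)  # True: the data favor "model is valid" here
```

A Bayes factor above 1 supports the null hypothesis that the experiment agrees with the model; values well below 1 flag a validation failure. The result depends on the prior spread `tau`, which is why such metrics report the prior alongside the factor.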

Journal ArticleDOI
TL;DR: In this article, an adaptive probability analysis method is proposed that generates the probability distribution of the output performance function by identifying how input uncertainty propagates to output uncertainty; the method is based on an enhanced hybrid mean value (HMV+) analysis within the performance measure approach.
Abstract: This paper proposes an adaptive probability analysis method that can effectively generate the probability distribution of the output performance function by identifying the propagation of input uncertainty to output uncertainty. The method is based on an enhanced hybrid mean value (HMV+) analysis in the performance measure approach (PMA) for numerical stability and efficiency in search of the most probable point (MPP). The HMV+ method improves numerical stability and efficiency especially for highly nonlinear output performance functions by providing steady convergent behavior in the MPP search. The proposed adaptive probability analysis method approximates the MPP locus, and then adaptively refines this locus using an a posteriori error estimator. Using the fact that probability levels can be easily set a priori in PMA, the MPP locus is approximated using the interpolated moving least-squares method. For refinement of the approximated MPP locus, additional probability levels are adaptively determined through an a posteriori error estimator. The adaptive probability analysis method will determine the minimum number of necessary probability levels, while ensuring accuracy of the approximated MPP locus. Several examples are used to show the effectiveness of the proposed adaptive probability analysis method using the enhanced HMV+ method.

193 citations
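The MPP search inside PMA can be illustrated with the plain advanced-mean-value (AMV) iteration that HMV+ stabilizes: repeatedly move to the steepest-descent point of g on the sphere ||u|| = β_t in standard normal space. The performance function below is hypothetical (and mild enough for plain AMV to converge), and this sketch omits the safeguards that make HMV+ robust for highly nonlinear g.

```python
import numpy as np

def g(u):
    """Hypothetical performance function in standard normal (u) space."""
    return u[0] + u[1] + 0.1 * u[0] ** 2 + 4.0

def grad_g(u, h=1e-6):
    """Central finite-difference gradient of g."""
    e = np.eye(len(u))
    return np.array([(g(u + h * e[i]) - g(u - h * e[i])) / (2.0 * h)
                     for i in range(len(u))])

def amv_pma(beta_t, u0, tol=1e-8, max_iter=200):
    """Plain AMV iteration for the PMA inverse problem: minimize g on the
    sphere ||u|| = beta_t. HMV+ adds convergence safeguards for highly
    nonlinear g that this basic scheme lacks."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        d = grad_g(u)
        u_new = -beta_t * d / np.linalg.norm(d)  # steepest-descent point
        if np.linalg.norm(u_new - u) < tol:      # fixed point reached
            return u_new
        u = u_new
    return u

beta_t = 3.0                          # target reliability index (prob. level)
u_mpp = amv_pma(beta_t, u0=[0.1, 0.1])
g_p = g(u_mpp)                        # performance measure at that level
print(np.linalg.norm(u_mpp))          # lies on the beta_t sphere by design
```

Repeating the search over several β_t values gives the MPP locus that the paper approximates and adaptively refines with moving least squares; oscillation of this plain iteration on concave g is exactly the failure mode the HMV+ enhancements address.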