Author

John McFarland

Bio: John McFarland is an academic researcher from Vanderbilt University. The author has contributed to research in the topics of Gaussian process and Calibration (statistics), has an h-index of 7, and has co-authored 7 publications receiving 876 citations.

Papers
Journal ArticleDOI
TL;DR: This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space and is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions.
Abstract: Many engineering applications are characterized by implicit response functions that are expensive to evaluate and sometimes nonlinear in their behavior, making reliability analysis difficult. This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space. The method begins with a Gaussian process model built from a very small number of samples, and then adaptively chooses where to generate subsequent samples to ensure that the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex and nonlinear limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems, including one that analyzes the reliability of a microelectromechanical system device that currently available methods have difficulty solving either accurately or efficiently.
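The abstract above describes an adaptive Gaussian process strategy that first refines a surrogate near the limit state and then estimates the failure probability from the surrogate. Below is a minimal, illustrative Python sketch of that general idea, not the authors' implementation: it uses scikit-learn's GaussianProcessRegressor, a simple uncertainty-weighted distance to the limit state as the refinement criterion, plain Monte Carlo on the surrogate in place of the paper's multimodal adaptive importance sampling, and a toy limit-state function g(x) chosen only for demonstration.

```python
# Illustrative sketch of adaptive GP-based reliability analysis (not the paper's code).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):
    """Toy limit-state function (assumption for illustration): failure when g(x) < 0."""
    return 0.5 * (x[:, 0] - 2.0) ** 2 - x[:, 1] + 2.5

# Small initial design over the standard-normal input space.
X = rng.normal(size=(10, 2))
y = g(X)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)

# Adaptive refinement: add points where the surrogate is uncertain near g = 0.
for _ in range(30):
    gp.fit(X, y)
    candidates = rng.normal(size=(2000, 2))
    mu, sigma = gp.predict(candidates, return_std=True)
    # Simple stand-in criterion: small |mu| (near the limit state), large sigma (uncertain).
    score = np.abs(mu) / np.maximum(sigma, 1e-12)
    x_new = candidates[np.argmin(score)][None, :]
    X = np.vstack([X, x_new])
    y = np.append(y, g(x_new))

# Probability of failure estimated by sampling the surrogate
# (plain Monte Carlo here; the paper uses multimodal adaptive importance sampling).
samples = rng.normal(size=(200_000, 2))
pf = np.mean(gp.predict(samples) < 0.0)
print(f"Estimated probability of failure: {pf:.4f} ({X.shape[0]} true evaluations)")
```

The point of the design is that all expensive evaluations of g are concentrated near g = 0, so the surrogate only needs to be accurate where the probability integration actually depends on it.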

804 citations

Journal ArticleDOI
TL;DR: In this paper, the authors describe an initial investigation into response-surface-based uncertainty quantification using both kriging and multivariate adaptive regression spline surface approximation methods; the impact of two data sampling methods, Latin hypercube sampling and orthogonal array sampling, is also examined.
Abstract: Conventional sampling-based uncertainty quantification (UQ) methods involve generating large numbers of random samples on input variables and calculating output statistics by evaluating the computational model for each set of samples. For real world applications, this method can be computationally prohibitive due to the cost of the model and the time required for each simulation run. Using response surface approximations may allow the output statistics to be estimated more accurately when only a limited number of simulation runs are available. This paper describes an initial investigation into response-surface-based UQ using both kriging and multivariate adaptive regression spline surface approximation methods. In addition, the impact of two different data sampling methods, Latin hypercube sampling and orthogonal array sampling, is also examined. The data obtained from this study indicate that caution should be exercised when implementing response-surface-based methods for UQ using very low sample sizes.
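As a concrete illustration of response-surface-based UQ with a kriging-type surrogate and Latin hypercube sampling, the hedged sketch below builds a Gaussian process from a small LHS design and estimates output statistics by sampling the surrogate instead of the expensive model. The test function, sample sizes, and use of scipy's LatinHypercube and scikit-learn's GaussianProcessRegressor are assumptions for illustration; the paper's own kriging and MARS implementations may differ.

```python
# Illustrative response-surface-based UQ sketch (kriging-type surrogate + LHS design).
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def model(x):
    """Stand-in for an expensive simulation (assumption for illustration)."""
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# Small Latin hypercube design over two uniform [0, 1] inputs.
sampler = qmc.LatinHypercube(d=2, seed=1)
X_train = sampler.random(n=20)          # only 20 "expensive" runs
y_train = model(X_train)

# Kriging-type surrogate (Gaussian process regression).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

# Cheap output statistics from the surrogate instead of the true model.
rng = np.random.default_rng(2)
X_mc = rng.uniform(size=(100_000, 2))
y_surrogate = gp.predict(X_mc)
print("surrogate mean/std:", y_surrogate.mean(), y_surrogate.std())

# Reference statistics from the (here cheap) true model, for comparison only.
y_true = model(X_mc)
print("true mean/std:     ", y_true.mean(), y_true.std())
```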

71 citations

Journal ArticleDOI
TL;DR: In this paper, the authors address the validation and calibration of computer simulations, using the thermal challenge problem developed at Sandia National Laboratories as an illustration, and demonstrate the use of Hotelling's T² statistic for multivariate significance testing, with emphasis on the formulation and interpretation of such an analysis for validation assessment.
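Since the TL;DR highlights Hotelling's T² statistic for multivariate significance testing, a minimal sketch of the standard one-sample form of that test is shown below: the model prediction is treated as the hypothesized mean vector and repeated experimental measurements as the multivariate sample. The data shapes, numbers, and the F-distribution conversion are generic textbook choices, not details taken from the paper.

```python
# One-sample Hotelling's T^2 test: do experimental data differ significantly
# from the model's predicted response vector? (Generic illustration.)
import numpy as np
from scipy.stats import f

def hotelling_t2(data, predicted):
    """data: (n, p) array of n repeated multivariate measurements;
    predicted: length-p model prediction treated as the hypothesized mean."""
    n, p = data.shape
    xbar = data.mean(axis=0)
    S = np.cov(data, rowvar=False)            # sample covariance (p x p)
    diff = xbar - predicted
    t2 = n * diff @ np.linalg.solve(S, diff)  # T^2 = n (xbar - mu)' S^{-1} (xbar - mu)
    # Convert to an F statistic: F = (n - p) / (p (n - 1)) * T^2 with (p, n - p) dof.
    f_stat = (n - p) / (p * (n - 1)) * t2
    p_value = f.sf(f_stat, p, n - p)
    return t2, f_stat, p_value

# Hypothetical example: 12 experiments measuring temperature at 3 locations,
# compared against the simulation's predicted temperatures.
rng = np.random.default_rng(3)
experiments = rng.normal(loc=[400.0, 420.0, 450.0], scale=5.0, size=(12, 3))
prediction = np.array([398.0, 425.0, 448.0])
print(hotelling_t2(experiments, prediction))
```

A small p-value would indicate a statistically significant discrepancy between the simulation and the experimental data, which is the kind of evidence a validation assessment weighs.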

68 citations

Proceedings ArticleDOI
23 Apr 2007
TL;DR: The application of efficient global optimization to reliability assessment is described, providing a method that efficiently characterizes the limit state throughout the uncertain space and is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions.
Abstract: As engineering applications become increasingly complex, they are often characterized by implicit response functions that are both expensive to evaluate and nonlinear in their behavior. Reliability assessment given this type of response is difficult with available methods. Current reliability methods focus on the discovery of a single most probable point of failure, and then build a low-order approximation to the limit state at this point. This creates inaccuracies when applied to engineering applications for which the limit state has a higher degree of nonlinearity or is multimodal. Sampling methods, on the other hand, do not rely on an approximation to the shape of the limit state and are therefore generally more accurate when applied to problems with nonlinear limit states. However, sampling methods typically require a large number of response function evaluations, which can make their application infeasible for computationally expensive problems. This paper describes the application of efficient global optimization to reliability assessment to provide a method that efficiently characterizes the limit state throughout the uncertain space. The method begins with a Gaussian process model built from a very small number of samples, and then intelligently chooses where to generate subsequent samples to ensure the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems that currently available methods have difficulty solving either accurately or efficiently.
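The refinement step described above, choosing new samples near the limit state via efficient global optimization, is commonly driven by an expected feasibility function. The sketch below implements the form usually quoted in the efficient-global-reliability-analysis literature (with ε = 2σ); whether this matches the paper's exact formulation is an assumption, and the function is shown in isolation rather than inside a full analysis loop.

```python
# Expected feasibility function (EFF) as commonly used in EGO-style reliability
# refinement; treat the exact form as an assumption about the paper's method.
import numpy as np
from scipy.stats import norm

def expected_feasibility(mu, sigma, z_bar=0.0):
    """mu, sigma: GP predictive mean and std at candidate points;
    z_bar: response threshold defining the limit state (default 0)."""
    eps = 2.0 * sigma
    z_lo, z_hi = z_bar - eps, z_bar + eps
    t0 = (z_bar - mu) / sigma
    t_lo = (z_lo - mu) / sigma
    t_hi = (z_hi - mu) / sigma
    eff = ((mu - z_bar) * (2.0 * norm.cdf(t0) - norm.cdf(t_lo) - norm.cdf(t_hi))
           - sigma * (2.0 * norm.pdf(t0) - norm.pdf(t_lo) - norm.pdf(t_hi))
           + eps * (norm.cdf(t_hi) - norm.cdf(t_lo)))
    return eff

# The candidate maximizing EFF is evaluated with the true response function
# and added to the training data before the Gaussian process is refit.
mu = np.array([0.1, 1.5, -0.05])
sigma = np.array([0.4, 0.1, 0.6])
print(expected_feasibility(mu, sigma))
```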

50 citations


Cited by
Journal ArticleDOI
TL;DR: An iterative approach based on Monte Carlo simulation and Kriging metamodels (AK-MCS) is proposed to assess the reliability of structures more efficiently; the method is shown to be very efficient, as the probability of failure obtained with AK-MCS is very accurate while requiring only a small number of calls to the performance function.
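The AK-MCS idea summarized above combines a fixed Monte Carlo population with a Kriging metamodel and a learning function that decides which population points to evaluate exactly. The sketch below shows the widely cited U learning function and its usual stopping rule (min U ≥ 2); the toy performance function and sample sizes are assumptions for illustration, and scikit-learn's Gaussian process regressor stands in for the Kriging model of the paper.

```python
# Illustrative AK-MCS-style loop: Kriging surrogate + Monte Carlo population
# + U learning function (assumed form; toy performance function for demo only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def performance(x):
    """Toy performance function; failure when performance(x) < 0."""
    return 3.0 - x[:, 0] ** 2 + x[:, 1]

population = rng.normal(size=(20_000, 2))   # fixed Monte Carlo population
idx = rng.choice(len(population), size=12, replace=False)
X, y = population[idx], performance(population[idx])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)

for _ in range(100):
    gp.fit(X, y)
    mu, sigma = gp.predict(population, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)   # learning function
    if U.min() >= 2.0:                          # common stopping criterion
        break
    best = np.argmin(U)                         # point most likely to be misclassified
    X = np.vstack([X, population[best]])
    y = np.append(y, performance(population[best][None, :]))

pf = np.mean(gp.predict(population) < 0.0)
print(f"P_f estimate: {pf:.4e} with {len(y)} performance-function calls")
```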

1,234 citations

Book
22 Nov 2010
TL;DR: A comprehensive and systematic development of the basic concepts, principles, and procedures for verification and validation of models and simulations, with emphasis on models described by partial differential and integral equations and the simulations that result from their numerical solution.
Abstract: Advances in scientific computing have made modelling and simulation an important part of the decision-making process in engineering, science, and public policy. This book provides a comprehensive and systematic development of the basic concepts, principles, and procedures for verification and validation of models and simulations. The emphasis is placed on models that are described by partial differential and integral equations and the simulations that result from their numerical solution. The methods described can be applied to a wide range of technical fields, from the physical sciences, engineering and technology and industry, through to environmental regulations and safety, product and plant safety, financial investing, and governmental regulations. This book will be genuinely welcomed by researchers, practitioners, and decision makers in a broad range of fields, who seek to improve the credibility and reliability of simulation results. It will also be appropriate either for university courses or for independent study.

966 citations

Journal ArticleDOI
TL;DR: This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space and is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions.
Abstract: Many engineering applications are characterized by implicit response functions that are expensive to evaluate and sometimes nonlinear in their behavior, making reliability analysis difficult. This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space. The method begins with a Gaussian process model built from a very small number of samples, and then adaptively chooses where to generate subsequent samples to ensure that the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex and nonlinear limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems, including one that analyzes the reliability of a microelectromechanical system device that currently available methods have difficulty solving either accurately or efficiently.

804 citations

ReportDOI
01 May 2010
TL;DR: This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
Abstract: The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications. DAKOTA Version 5.0 Reference Manual generated on May 7, 2010

757 citations

Journal ArticleDOI
TL;DR: An overview is given of a comprehensive framework for estimating the predictive uncertainty of scientific computing applications, which treats both types of uncertainty (aleatory and epistemic), incorporates uncertainty due to the mathematical form of the model, and provides a procedure for including estimates of numerical error in the predictive uncertainty.
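One common way to keep aleatory and epistemic uncertainty separate, in the spirit of the framework summarized above, is a nested (double-loop) sampling scheme: an outer loop samples the epistemic parameters and an inner loop propagates the aleatory variables, producing a family of output distributions rather than a single one. The sketch below illustrates only that generic pattern; the specific model, distributions, and the paper's treatment of model-form and numerical error are not reproduced here.

```python
# Generic double-loop (nested) sampling: outer loop over epistemic parameters,
# inner loop over aleatory variables. Illustrative only; not the paper's procedure.
import numpy as np

rng = np.random.default_rng(5)

def simulation(aleatory_x, epistemic_theta):
    """Toy response: aleatory input x, epistemically uncertain coefficient theta."""
    return epistemic_theta * aleatory_x ** 2 + 1.0

n_outer, n_inner = 50, 10_000
quantiles = []
for _ in range(n_outer):
    theta = rng.uniform(0.8, 1.2)                      # epistemic: known only to an interval
    x = rng.normal(loc=1.0, scale=0.2, size=n_inner)   # aleatory: inherent variability
    y = simulation(x, theta)
    quantiles.append(np.quantile(y, 0.99))             # e.g. a 99th-percentile requirement

# Epistemic uncertainty shows up as an interval on the aleatory statistic.
print(f"99th percentile ranges from {min(quantiles):.3f} to {max(quantiles):.3f}")
```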

649 citations