Journal ArticleDOI

Sequential Experiment Design for Contour Estimation From Complex Computer Codes

01 Nov 2008-Technometrics (Taylor & Francis)-Vol. 50, Iss: 4, pp 527-541
TL;DR: A sequential methodology for estimating a contour from a complex computer code, using a stochastic process model as a surrogate for the computer simulator, is developed and applied to exploration of a contour for a network queuing system.
Abstract: Computer simulation often is used to study complex physical and engineering processes. Although a computer simulator often can be viewed as an inexpensive way to gain insight into a system, it still can be computationally costly. Much of the recent work on the design and analysis of computer experiments has focused on scenarios where the goal is to fit a response surface or process optimization. In this article we develop a sequential methodology for estimating a contour from a complex computer code. The approach uses a stochastic process model as a surrogate for the computer simulator. The surrogate model and associated uncertainty are key components in a new criterion used to identify the computer trials aimed specifically at improving the contour estimate. The proposed approach is applied to exploration of a contour for a network queuing system. Issues related to practical implementation of the proposed approach also are addressed.
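The abstract sketches a loop: fit a Gaussian-process (stochastic process) surrogate to the runs performed so far, score untried inputs by how much they are expected to improve the contour estimate, run the simulator at the best-scoring input, and repeat. Below is a minimal illustration of such a loop in Python; the toy simulator, the random candidate-set search, the 1.96 band multiplier, and the Monte Carlo form of the improvement score are assumptions for the sketch, not the authors' implementation.

```python
# Minimal sketch: sequential design for contour estimation with a GP surrogate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):                           # stand-in for an expensive computer code
    return np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2

a = 0.5                                     # target contour level: find {x : y(x) = a}
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(10, 2))     # small initial design
y = simulator(X)
gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]), normalize_y=True)

for _ in range(20):                         # sequential design iterations
    gp.fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(2000, 2))        # candidate inputs
    mu, sd = gp.predict(cand, return_std=True)
    eps = 1.96 * sd                                      # credibility band half-width
    draws = rng.normal(mu, sd, size=(50, len(cand)))     # posterior draws at candidates
    # expected gain in contour accuracy: reward draws landing inside the band around a
    score = np.maximum(eps ** 2 - (draws - a) ** 2, 0.0).mean(axis=0)
    x_new = cand[np.argmax(score)][None, :]              # run the simulator where the gain is largest
    X, y = np.vstack([X, x_new]), np.append(y, simulator(x_new))
```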
Citations
Journal ArticleDOI
TL;DR: An iterative approach based on Monte Carlo simulation and a Kriging metamodel (AK-MCS) is proposed to assess the reliability of structures more efficiently; the probability of failure it obtains is very accurate while requiring only a small number of calls to the performance function.
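As a rough illustration of the idea summarized above, the sketch below classifies a fixed Monte Carlo population with a Kriging model, enriches the design at the sample whose sign is most ambiguous, and stops once every sample is classified with confidence. The toy limit state, the learning function U, and the stopping threshold of 2 are common choices in this literature, shown here as assumptions rather than the paper's exact algorithm.

```python
# Illustrative AK-MCS-style loop (toy limit state; not the paper's code).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                   # toy performance function, failure if g < 0
    return 3.0 - x[:, 0] - x[:, 1]

rng = np.random.default_rng(1)
pop = rng.normal(size=(10_000, 2))          # fixed Monte Carlo population
idx = rng.choice(len(pop), 12, replace=False)
X, y = pop[idx], g(pop[idx])                # small initial design of experiments

gp = GaussianProcessRegressor(RBF(1.0), normalize_y=True)
for _ in range(200):                        # enrichment iterations (capped for the sketch)
    gp.fit(X, y)
    mu, sd = gp.predict(pop, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)  # low U = sign of g is uncertain at that sample
    if U.min() >= 2.0:                      # common stopping rule in this literature
        break
    j = int(np.argmin(U))                   # enrich the design at the most ambiguous sample
    X, y = np.vstack([X, pop[j:j + 1]]), np.append(y, g(pop[j:j + 1]))

pf = np.mean(mu < 0)                        # failure probability from the surrogate signs
```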

1,234 citations

Journal ArticleDOI
TL;DR: This paper develops a reliability analysis method that accurately characterizes the limit state throughout the random-variable space, remaining accurate for arbitrarily shaped limit states and computationally efficient even for expensive response functions.
Abstract: Many engineering applications are characterized by implicit response functions that are expensive to evaluate and sometimes nonlinear in their behavior, making reliability analysis difficult. This paper develops an efficient reliability analysis method that accurately characterizes the limit state throughout the random variable space. The method begins with a Gaussian process model built from a very small number of samples, and then adaptively chooses where to generate subsequent samples to ensure that the model is accurate in the vicinity of the limit state. The resulting Gaussian process model is then sampled using multimodal adaptive importance sampling to calculate the probability of exceeding (or failing to exceed) the response level of interest. By locating multiple points on or near the limit state, more complex and nonlinear limit states can be modeled, leading to more accurate probability integration. By concentrating the samples in the area where accuracy is important (i.e., in the vicinity of the limit state), only a small number of true function evaluations are required to build a quality surrogate model. The resulting method is both accurate for any arbitrarily shaped limit state and computationally efficient even for expensive response functions. This new method is applied to a collection of example problems including one that analyzes the reliability of a microelectromechanical system device that current available methods have difficulty solving either accurately or efficiently.
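The adaptive sampling step described here can be read as maximizing an "expected feasibility" score under the Gaussian-process posterior: how much of the posterior mass at a candidate falls within a small band around the response level of interest. A hedged sketch follows, using a Monte Carlo estimate rather than the closed form the paper derives; the band width k * sd is an assumed, commonly used choice.

```python
# Monte Carlo version of an expected-feasibility acquisition for a GP surrogate.
# Illustrative only; the paper derives a closed-form expression instead.
import numpy as np

def expected_feasibility(mu, sd, z, k=2.0, n_draws=2000, rng=None):
    """mu, sd: GP posterior mean/std at candidate points; z: response level of interest."""
    rng = rng or np.random.default_rng(0)
    eps = k * sd                                    # search band around the level z
    draws = rng.normal(mu, sd, size=(n_draws, len(mu)))
    return np.maximum(eps - np.abs(draws - z), 0.0).mean(axis=0)

# usage sketch: score candidates, then evaluate the true response where the score is largest
# mu, sd = gp.predict(candidates, return_std=True)
# x_next = candidates[np.argmax(expected_feasibility(mu, sd, z=0.0))]
```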

804 citations


Cites methods from "Sequential Experiment Design for Co..."

  • ...Inspired by the contour estimation work in [28], this expectation can be calculated in a similar fashion as Eq....

    [...]

Journal ArticleDOI
TL;DR: An original and easily implementable method called AK-IS, for Active learning and Kriging-based Importance Sampling, is proposed; built on the AK-MCS algorithm, it enables the correction or validation of the FORM approximation with only very few mechanical model computations.
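A minimal sketch of the combination described here, importance sampling centred on a FORM design point with a Kriging surrogate classifying the samples, is given below; the standard-normal input space, the design point u_star, the trained surrogate gp, and the unit-variance sampling density are all assumptions of the sketch, not the paper's implementation.

```python
# Hedged sketch: importance sampling around a FORM design point, with a Kriging
# surrogate classifying the samples. Assumes u_star and gp exist; not the paper's code.
import numpy as np
from scipy.stats import multivariate_normal as mvn

def ak_is_estimate(gp, u_star, n=10_000, rng=None):
    rng = rng or np.random.default_rng(2)
    d = len(u_star)
    u = rng.normal(loc=u_star, scale=1.0, size=(n, d))           # sampling density centred at the MPP
    w = mvn.pdf(u, mean=np.zeros(d)) / mvn.pdf(u, mean=u_star)   # importance weights
    fail = gp.predict(u) < 0                                     # surrogate classifies each sample
    return np.mean(w * fail)                                     # weighted failure probability
```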

458 citations


Cites methods from "Sequential Experiment Design for Co..."

  • ...Inspired by EGO [19] and the Kriging contour estimation method [25], EGRA [5] (Efficient Global Reliability Analysis) is based on a learning function called the Expected Feasibility Function (EFF) which provides an indication of how well the true value of the performance function in a sample can be expected to satisfy the constraint H(u) = 0....

    [...]

  • ...AK-MCS differs in two ways from the other Kriging-based reliability methods in the literature [5,25], which use an optimisation method to search for the best next sample....

    [...]

Journal ArticleDOI
TL;DR: The rapidly developing field of simulation-based inference is reviewed, the forces giving the field additional momentum are identified, and the ways in which the frontier is expanding are described.
Abstract: Many domains of science have developed complex simulations to describe phenomena of interest. While these simulations provide high-fidelity models, they are poorly suited for inference and lead to challenging inverse problems. We review the rapidly developing field of simulation-based inference and identify the forces giving additional momentum to the field. Finally, we describe how the frontier is expanding so that a broad audience can appreciate the profound influence these developments may have on science.

427 citations

Journal ArticleDOI
TL;DR: The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate.
Abstract: The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that the most-probable-failure-point-based approaches do not permit to quantify the error on the estimation of the failure probability, an approach based on both metamodels and advanced simulation techniques is explored. The kriging metamodeling technique is chosen in order to surrogate the performance functions because it allows one to genuinely quantify the surrogate error. The surrogate error onto the limit-state surfaces is propagated to the failure probabilities estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations in the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally involved into a classical gradient-based optimization algorithm in order to solve the RBDO problem. The kriging surrogates are built in a so-called augmented reliability space thus making them reusable from one nested RBDO iteration to the other. The strategy is compared to other approaches available in the literature on three academic examples in the field of structural mechanics.
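One concrete reading of "propagating the surrogate error to the failure-probability estimate" is to classify a sample population three times: with the kriging mean, and with the mean shifted up and down by a confidence multiple of the kriging standard deviation, which brackets the estimate. The sketch below only illustrates that idea; the multiplier k, the plain Monte Carlo population (the paper uses subset simulation), and the surrogate object gp are assumptions.

```python
# Illustrative bracketing of a failure probability by the kriging epistemic error.
# Plain Monte Carlo is used here for brevity; the paper relies on subset simulation.
import numpy as np

def pf_with_error_bounds(gp, population, k=1.96):
    mu, sd = gp.predict(population, return_std=True)
    pf_hat = np.mean(mu < 0)               # best-guess classification (failure if g < 0)
    pf_low = np.mean(mu + k * sd < 0)      # optimistic: every sample shifted upward
    pf_high = np.mean(mu - k * sd < 0)     # conservative: every sample shifted downward
    return pf_low, pf_hat, pf_high         # refine the surrogate until the bracket is tight
```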

354 citations


Cites background or methods from "Sequential Experiment Design for Co..."

  • ...Various refinement techniques have been proposed in the kriging-related literature, e.g. for global optimization (Jones et al., 1998) or for probability/quantile estimation (Oakley, 2004; Bichon et al., 2008; Lee and Jung, 2008; Ranjan et al., 2008; Vazquez and Bect, 2009; Picheny et al., 2010)....

    [...]


References
Book
01 Jan 1975
TL;DR: The founding work in the area of adaptation and modification, which aims to mimic biological optimization, with notes on some (non-GA) branches of AI.
Abstract: The founding work in the area. Adaptation is key to survival and evolution; evolution implicitly optimizes organisms, and AI seeks to mimic biological optimization: survival of the fittest, exploration and exploitation, niche finding, robustness across changing environments (mammals vs. dinosaurs), and self-regulation, -repair and -reproduction. Some definitions of artificial intelligence: "making computers do what they do in the movies", "making computers do what humans (currently) do best", "giving computers common sense; letting them make simple decisions" (do as I want, not what I say), "anything too new to be pigeonholed". Adaptation and modification are at the root of intelligence. Some (non-GA) branches of AI: expert systems (rule-based deduction).
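The article cites this work because a genetic algorithm is used to pick starting points for its optimization routines (see the excerpt below). As a generic reminder of the mechanics, here is a minimal real-coded GA for maximizing a function on the unit hypercube; the tournament selection, uniform crossover, Gaussian mutation, and all constants are illustrative choices, not Holland's formulation or the article's settings.

```python
# Minimal generic genetic algorithm (real-coded) for maximizing f on [0, 1]^d.
# Operator choices are illustrative, not tied to any specific reference.
import numpy as np

def ga_maximize(f, d, pop_size=40, n_gen=60, mut_sd=0.05, rng=None):
    rng = rng or np.random.default_rng(3)
    pop = rng.uniform(0.0, 1.0, size=(pop_size, d))          # random initial population
    for _ in range(n_gen):
        fit = np.array([f(x) for x in pop])                  # evaluate fitness
        # tournament selection: keep the better of two randomly paired individuals
        a, b = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((fit[a] > fit[b])[:, None], pop[a], pop[b])
        # uniform crossover between consecutive parents
        mask = rng.random((pop_size, d)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation, clipped back into the search box
        pop = np.clip(children + rng.normal(0.0, mut_sd, children.shape), 0.0, 1.0)
    fit = np.array([f(x) for x in pop])
    return pop[np.argmax(fit)]                                # best individual found
```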

32,573 citations


"Sequential Experiment Design for Co..." refers methods in this paper

  • ...To identify the starting points for the optimization routines, a genetic algorithm is used (e.g., Holland 1975; Mandal, Wu, and Johnson 2006) with the following specifications: • Initial population....

    [...]

Book
01 Jan 1966
TL;DR: In this book, the Riesz representation theorem is used to establish the regularity properties of Borel measures, alongside the Radon-Nikodym theorem and the approximation of measurable functions by continuous functions.
Abstract: Preface
Prologue: The Exponential Function
Chapter 1: Abstract Integration Set-theoretic notations and terminology The concept of measurability Simple functions Elementary properties of measures Arithmetic in [0, ∞] Integration of positive functions Integration of complex functions The role played by sets of measure zero Exercises
Chapter 2: Positive Borel Measures Vector spaces Topological preliminaries The Riesz representation theorem Regularity properties of Borel measures Lebesgue measure Continuity properties of measurable functions Exercises
Chapter 3: Lp-Spaces Convex functions and inequalities The Lp-spaces Approximation by continuous functions Exercises
Chapter 4: Elementary Hilbert Space Theory Inner products and linear functionals Orthonormal sets Trigonometric series Exercises
Chapter 5: Examples of Banach Space Techniques Banach spaces Consequences of Baire's theorem Fourier series of continuous functions Fourier coefficients of L1-functions The Hahn-Banach theorem An abstract approach to the Poisson integral Exercises
Chapter 6: Complex Measures Total variation Absolute continuity Consequences of the Radon-Nikodym theorem Bounded linear functionals on Lp The Riesz representation theorem Exercises
Chapter 7: Differentiation Derivatives of measures The fundamental theorem of Calculus Differentiable transformations Exercises
Chapter 8: Integration on Product Spaces Measurability on cartesian products Product measures The Fubini theorem Completion of product measures Convolutions Distribution functions Exercises
Chapter 9: Fourier Transforms Formal properties The inversion theorem The Plancherel theorem The Banach algebra L1 Exercises
Chapter 10: Elementary Properties of Holomorphic Functions Complex differentiation Integration over paths The local Cauchy theorem The power series representation The open mapping theorem The global Cauchy theorem The calculus of residues Exercises
Chapter 11: Harmonic Functions The Cauchy-Riemann equations The Poisson integral The mean value property Boundary behavior of Poisson integrals Representation theorems Exercises
Chapter 12: The Maximum Modulus Principle Introduction The Schwarz lemma The Phragmen-Lindelof method An interpolation theorem A converse of the maximum modulus theorem Exercises
Chapter 13: Approximation by Rational Functions Preparation Runge's theorem The Mittag-Leffler theorem Simply connected regions Exercises
Chapter 14: Conformal Mapping Preservation of angles Linear fractional transformations Normal families The Riemann mapping theorem The class L Continuity at the boundary Conformal mapping of an annulus Exercises
Chapter 15: Zeros of Holomorphic Functions Infinite Products The Weierstrass factorization theorem An interpolation problem Jensen's formula Blaschke products The Muntz-Szas theorem Exercises
Chapter 16: Analytic Continuation Regular points and singular points Continuation along curves The monodromy theorem Construction of a modular function The Picard theorem Exercises
Chapter 17: Hp-Spaces Subharmonic functions The spaces Hp and N The theorem of F. and M. Riesz Factorization theorems The shift operator Conjugate functions Exercises
Chapter 18: Elementary Theory of Banach Algebras Introduction The invertible elements Ideals and homomorphisms Applications Exercises
Chapter 19: Holomorphic Fourier Transforms Introduction Two theorems of Paley and Wiener Quasi-analytic classes The Denjoy-Carleman theorem Exercises
Chapter 20: Uniform Approximation by Polynomials Introduction Some lemmas Mergelyan's theorem Exercises
Appendix: Hausdorff's Maximality Theorem
Notes and Comments Bibliography List of Special Symbols Index

9,642 citations

Journal ArticleDOI
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and they are shown to be improvements over simple sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
Abstract: Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
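Stratified plans of the kind examined here, such as Latin hypercube sampling, spread points so that each input dimension is sampled exactly once per equal-probability stratum, which is why they are a common way to seed the surrogate models discussed above. A minimal hand-rolled sketch on the unit hypercube follows (the helper name is ours):

```python
# Minimal Latin hypercube sample on [0, 1]^d: each dimension is split into n strata
# and each stratum is hit exactly once, at a random position inside the stratum.
import numpy as np

def latin_hypercube(n, d, rng=None):
    rng = rng or np.random.default_rng(4)
    # for each dimension, a random permutation of the strata 0..n-1
    strata = np.array([rng.permutation(n) for _ in range(d)]).T   # shape (n, d)
    return (strata + rng.random((n, d))) / n                      # jitter within each stratum

# e.g. a 10-point initial design in 2 dimensions: latin_hypercube(10, 2)
```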

8,328 citations

Book
01 Jan 2009
TL;DR: The aim of this book is to provide a discussion of unconstrained and constrained optimization, covering linear programming and other optimization problems.
Abstract: Preface Table of Notation
Part 1: Unconstrained Optimization Introduction Structure of Methods Newton-like Methods Conjugate Direction Methods Restricted Step Methods Sums of Squares and Nonlinear Equations
Part 2: Constrained Optimization Introduction Linear Programming The Theory of Constrained Optimization Quadratic Programming General Linearly Constrained Optimization Nonlinear Programming Other Optimization Problems Non-Smooth Optimization
References Subject Index.

7,278 citations


"Sequential Experiment Design for Co..." refers methods in this paper

  • ...The Matlab Optimization Toolbox function fmincon, on the other hand, attempts to find a constrained optimum using a sequential quadratic programming method (Schittkowski 1985; Fletcher 1987)....

    [...]
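
The fmincon usage quoted above corresponds, in open-source terms, to a sequential-quadratic-programming-style constrained optimizer. A hedged illustration of the same call pattern with SciPy's SLSQP method follows; the objective, constraint, bounds, and starting point are toys chosen only to show the interface, not the article's criterion optimization.

```python
# Illustrative constrained optimization call, analogous in spirit to fmincon's SQP mode.
# The objective and constraint are toys; only the call pattern is the point.
import numpy as np
from scipy.optimize import minimize

objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2
constraints = [{"type": "ineq", "fun": lambda x: 3.0 - x[0] - x[1]}]  # enforces x0 + x1 <= 3
bounds = [(0.0, 5.0), (0.0, 5.0)]

res = minimize(objective, x0=np.array([2.0, 0.0]), method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)   # constrained optimum and its objective value
```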