
Showing papers by "Barry L. Nelson published in 2010"


Journal ArticleDOI
TL;DR: The basic theory of kriging is extended, as applied to the design and analysis of deterministic computer experiments, to the stochastic simulation setting to provide flexible, interpolation-based metamodels of simulation output performance measures as functions of the controllable design or decision variables.
Abstract: We extend the basic theory of kriging, as applied to the design and analysis of deterministic computer experiments, to the stochastic simulation setting. Our goal is to provide flexible, interpolation-based metamodels of simulation output performance measures as functions of the controllable design or decision variables, or uncontrollable environmental variables. To accomplish this, we characterize both the intrinsic uncertainty inherent in a stochastic simulation and the extrinsic uncertainty about the unknown response surface. We use tractable examples to demonstrate why it is critical to characterize both types of uncertainty, derive general results for experiment design and analysis, and present a numerical example that illustrates the stochastic kriging method.
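The core idea, prediction with a spatial covariance plus a heteroscedastic "nugget" that carries the simulation noise, can be sketched in a few lines. This is an illustrative numpy sketch, not the paper's implementation: the Gaussian correlation, the known covariance parameters, and the crude constant-trend estimate are all simplifying assumptions.

```python
import numpy as np

def gaussian_corr(x1, x2, theta=1.0):
    """Spatial correlation between points (extrinsic uncertainty)."""
    return np.exp(-theta * (x1[:, None] - x2[None, :]) ** 2)

def stochastic_kriging_predict(x_design, y_bar, intrinsic_var, x_new,
                               tau2=1.0, theta=1.0):
    """Predict the response surface at x_new.

    y_bar         : sample means of simulation output at each design point
    intrinsic_var : variance of each sample mean (intrinsic uncertainty)
    tau2, theta   : extrinsic covariance parameters (assumed known here)
    """
    Sigma = tau2 * gaussian_corr(x_design, x_design, theta)
    Sigma += np.diag(intrinsic_var)      # nugget from simulation noise
    k = tau2 * gaussian_corr(x_new, x_design, theta)
    beta0 = y_bar.mean()                 # crude constant-trend estimate
    return beta0 + k @ np.linalg.solve(Sigma, y_bar - beta0)

# toy use: noisy simulation estimates of y = x^2 at five design points
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = x**2 + np.array([0.05, -0.02, 0.03, -0.04, 0.01])
v = np.full(5, 0.01)                     # intrinsic variances
pred = stochastic_kriging_predict(x, y, v, np.array([1.0]))
```

Because the nugget is small relative to the extrinsic variance, the predictor nearly interpolates the sample means; as the intrinsic variance grows, it smooths instead.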

576 citations


Journal ArticleDOI
TL;DR: Small-sample validity of the statistical test and ranking-and-selection procedure is proven for normally distributed data, and ISC is compared to the commercial optimization via simulation package OptQuest on five test problems that range from 2 to 20 decision variables and on the order of 10^4 to 10^20 feasible solutions.
Abstract: Industrial Strength COMPASS (ISC) is a particular implementation of a general framework for optimizing the expected value of a performance measure of a stochastic simulation with respect to integer-ordered decision variables in a finite (but typically large) feasible region defined by linear-integer constraints. The framework consists of a global-search phase, followed by a local-search phase, and ending with a “clean-up” (selection of the best) phase. Each phase provides a probability 1 convergence guarantee as the simulation effort increases without bound: Convergence to a globally optimal solution in the global-search phase; convergence to a locally optimal solution in the local-search phase; and convergence to the best of a small number of good solutions in the clean-up phase. In practice, ISC stops short of such convergence by applying an improvement-based transition rule from the global phase to the local phase; a statistical test of convergence from the local phase to the clean-up phase; and a ranking-and-selection procedure to terminate the clean-up phase. Small-sample validity of the statistical test and ranking-and-selection procedure is proven for normally distributed data. ISC is compared to the commercial optimization via simulation package OptQuest on five test problems that range from 2 to 20 decision variables and on the order of 10^4 to 10^20 feasible solutions. These test cases represent response-surface models with known properties and realistic system simulation problems.

155 citations


Journal ArticleDOI
TL;DR: A two-level simulation procedure is developed that produces a confidence interval for expected shortfall using the statistical theory of empirical likelihood and tools from the ranking-and-selection literature to make the simulation efficient.
Abstract: We develop and evaluate a two-level simulation procedure that produces a confidence interval for expected shortfall. The outer level of simulation generates financial scenarios, whereas the inner level estimates expected loss conditional on each scenario. Our procedure uses the statistical theory of empirical likelihood to construct a confidence interval. It also uses tools from the ranking-and-selection literature to make the simulation efficient.
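The two-level structure can be sketched as follows. The model here is a hypothetical toy (normal scenarios, additive inner noise), and the paper's empirical-likelihood interval and ranking-and-selection allocation are omitted; only the nested point estimate is shown.

```python
import numpy as np

rng = np.random.default_rng(42)

def two_level_es(n_outer=1000, n_inner=100, alpha=0.95):
    """Nested estimate of expected shortfall at level alpha."""
    # outer level: generate financial scenarios (toy risk factor)
    scenarios = rng.normal(size=n_outer)
    # inner level: estimate expected loss conditional on each scenario
    cond_loss = np.array([
        (s + rng.normal(scale=0.5, size=n_inner)).mean() for s in scenarios
    ])
    # expected shortfall: average of the largest (1 - alpha) fraction of losses
    n_tail = max(1, int(round((1.0 - alpha) * n_outer)))
    return np.sort(cond_loss)[-n_tail:].mean()

es = two_level_es()
```

For a standard normal loss, expected shortfall at the 95% level is about 2.06, so the nested estimate lands near that value up to inner- and outer-level noise.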

58 citations


01 Jan 2010
TL;DR: This tutorial describes some of the research directions and results available for discrete-decision-variable OvS, and provides some guidance for using the OvS heuristics that are built into simulation modeling software.
Abstract: Both the simulation research and software communities have been interested in optimization via simulation (OvS), by which we mean maximizing or minimizing the expected value of some output of a stochastic simulation. Continuous-decision-variable OvS, and gradient estimation to support it, has been an active research area with significant advances. However, the decision variables in many operations research and management science simulations are more naturally discrete, even categorical. In this tutorial we describe some of the research directions and results available for discrete-decision-variable OvS, and provide some guidance for using the OvS heuristics that are built into simulation modeling software.

52 citations


Journal ArticleDOI
TL;DR: A new fully sequential hypothesis-testing procedure is introduced that greatly improves the efficiency of CSB and a more general CSB procedure is proposed that has the same error control for screening main effects that CSB does, even when two-factor interactions are present.
Abstract: Controlled sequential bifurcation (CSB) is a factor-screening method for discrete-event simulations. It combines a multistage hypothesis testing procedure with the original sequential bifurcation procedure to control both the power for detecting important effects at each bifurcation step and the Type I error for each unimportant factor under heterogeneous variance conditions when a main-effects model applies. This paper improves the CSB procedure in two aspects. First, a new fully sequential hypothesis-testing procedure is introduced that greatly improves the efficiency of CSB. Moreover, this paper proposes CSB-X, a more general CSB procedure that has the same error control for screening main effects that CSB does, even when two-factor interactions are present. The performance of the new method is proven and compared with the original CSB procedure.

47 citations


Proceedings ArticleDOI
05 Dec 2010
TL;DR: To achieve this goal, metamodel-assisted bootstrapping is introduced, and its performance is illustrated on two queueing examples relative to other proposals for dealing with input uncertainty.
Abstract: We consider the problem of producing confidence intervals for the mean response of a system represented by a stochastic simulation that is driven by input models that have been estimated from "real-world" data. Therefore, we want the confidence interval to account for both uncertainty about the input models and stochastic noise in the simulation output; standard practice only accounts for the stochastic noise. To achieve this goal we introduce metamodel-assisted bootstrapping, and illustrate its performance relative to other proposals for dealing with input uncertainty on two queueing examples.
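Plain (non-metamodel-assisted) bootstrapping of the input data conveys the idea: resample the real-world data, refit the input model, and rerun the simulation, so the resulting interval reflects both input-model uncertainty and simulation noise. The exponential input model and all names below are illustrative; in the paper a metamodel stands in for the repeated simulation runs.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(rate, n_reps=200):
    """Stand-in stochastic simulation driven by an exponential input model."""
    return rng.exponential(1.0 / rate, size=n_reps).mean()

# "real-world" data used to fit the input model (exponential, true rate 2)
data = rng.exponential(1.0 / 2.0, size=500)

# bootstrap loop: resample data, refit input model, rerun the simulation;
# the spread of `boot` then reflects input *and* simulation uncertainty
boot = []
for _ in range(300):
    resample = rng.choice(data, size=data.size, replace=True)
    rate_hat = 1.0 / resample.mean()      # refitted input model
    boot.append(simulate(rate_hat))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

Standard practice would fix the input model at its point estimate and bootstrap only the simulation output, producing an interval that is too narrow when the input data are limited.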

35 citations


Proceedings ArticleDOI
05 Dec 2010
TL;DR: This paper will compare various correlation functions in both spatial and frequency domains, and analyze the influence of the choice of correlation function on prediction accuracy by experimenting with three tractable examples with differentiable and non-differentiable response surfaces.
Abstract: The correlation function plays a critical role in both kriging and stochastic kriging metamodels. This paper will compare various correlation functions in both spatial and frequency domains, and analyze the influence of the choice of correlation function on prediction accuracy by experimenting with three tractable examples with differentiable and non-differentiable response surfaces: the M/M/1 queue, a multi-product M/G/1 queue, and a three-station Jackson network. The twice or higher-order continuously differentiable correlation functions demonstrate a promising capability to fit both differentiable and non-differentiable multi-dimensional response surfaces.
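For reference, three commonly used correlation families differ precisely in their differentiability at distance zero, which governs the smoothness of the fitted surface. A small sketch with illustrative parameter values:

```python
import numpy as np

def exponential_corr(d, theta=1.0):
    """Not differentiable at d = 0 -> rough sample paths."""
    return np.exp(-theta * np.abs(d))

def gaussian_corr(d, theta=1.0):
    """Infinitely differentiable at d = 0 -> very smooth sample paths."""
    return np.exp(-theta * d**2)

def matern_2_5(d, theta=1.0):
    """Matern with nu = 5/2: twice continuously differentiable."""
    a = np.sqrt(5.0) * theta * np.abs(d)
    return (1.0 + a + a**2 / 3.0) * np.exp(-a)

# near zero the Gaussian kernel is much flatter than the exponential,
# which is why it produces smoother predictions
gap = gaussian_corr(0.3) - exponential_corr(0.3)
```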

26 citations


Journal ArticleDOI
TL;DR: This paper proposes a simple change to the solution-sampling scheme that significantly speeds up COMPASS for high-dimensional problems without affecting its convergence guarantee.

20 citations


Proceedings ArticleDOI
05 Dec 2010
TL;DR: A computationally efficient simulation procedure for point estimation of expected shortfall is presented that achieves a much lower mean squared error than a standard simulation procedure and is much more precise than an existing interval estimation procedure.
Abstract: We present a computationally efficient simulation procedure for point estimation of expected shortfall. The procedure applies tools for ranking and selection to allocate more computational resources to estimation of the largest losses, which are those that affect expected shortfall. Given a fixed computational budget, our procedure estimates expected shortfall with a much lower mean squared error than a standard simulation procedure and much more precisely than an existing interval estimation procedure.

12 citations


Proceedings ArticleDOI
05 Dec 2010
TL;DR: A collection of simple models is used to examine the interaction between the variance reduction technique of common random numbers and a new simulation metamodeling technique called stochastic kriging, considering the impact of common random numbers on prediction, parameter estimation and gradient estimation.
Abstract: We use a collection of simple models to examine the interaction between the variance reduction technique of common random numbers and a new simulation metamodeling technique called stochastic kriging. We consider the impact of common random numbers on prediction, parameter estimation and gradient estimation.
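The mechanism is easy to demonstrate: driving two design points with the same underlying random numbers induces positive correlation between their outputs, which shrinks the variance of their difference. A toy sketch, where the output function is hypothetical and chosen to be monotone in the driving uniforms so that common random numbers help:

```python
import numpy as np

rng = np.random.default_rng(1)

def output(theta, u):
    """Toy simulation output at design point theta, driven by uniforms u."""
    return (theta * np.sqrt(-np.log(u))).mean()   # monotone in u

n, reps = 100, 500
diff_crn, diff_ind = [], []
for _ in range(reps):
    u = rng.uniform(size=n)
    diff_crn.append(output(1.1, u) - output(1.0, u))        # shared streams
    diff_ind.append(output(1.1, rng.uniform(size=n))
                    - output(1.0, rng.uniform(size=n)))     # independent
var_crn, var_ind = np.var(diff_crn), np.var(diff_ind)
```

The variance of the difference under common random numbers is a small fraction of the independent-streams variance, which is exactly what makes gradient (finite-difference) estimates so much sharper under CRN.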

11 citations


Proceedings ArticleDOI
05 Dec 2010
TL;DR: A cross-validation method that adds design points and simulation effort at the design points to target all metamodels' relative prediction errors and a way to choose design points so that the scenario is likely to fall inside their convex hull is proposed.
Abstract: We develop a sequential experiment design procedure to construct multiple metamodels based on a single stochastic simulation model. We apply the procedure to approximate many securities' prices as functions of a financial scenario. We propose a cross-validation method that adds design points and simulation effort at the design points to target all metamodels' relative prediction errors. To improve the expected quality of the metamodels given randomness of the scenario that is an input to the simulation model, we also propose a way to choose design points so that the scenario is likely to fall inside their convex hull.

Proceedings ArticleDOI
05 Dec 2010
TL;DR: This paper presents initial work toward the development of a ranking and selection procedure allowing comparisons between systems to be made based on any distributional property of interest, and demonstrates the use of fixed-width confidence intervals for bootstrapped R&S.
Abstract: A ranking and selection (R&S) procedure allowing comparisons between systems to be made based on any distributional property of interest would be useful. This paper presents initial work toward the development of such a procedure. Previously published work gives a method for using bootstrapping to develop fixed-width confidence intervals with a specified coverage probability around a property of interest. Empirical evidence is provided in support of the use of this approach for building fixed-width confidence intervals around both means and quantiles. Additionally, the use of fixed-width confidence intervals for bootstrapped R&S is demonstrated. For two systems, R&S is performed by building a confidence interval around the difference between the two systems. Simultaneous fixed-width confidence intervals are used for R&S on more than two systems, and the approach is demonstrated for three systems. The technique is shown to be effective for R&S based on both quantiles and means.
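A percentile-bootstrap interval around the difference of a distributional property, here the 0.9 quantile of two hypothetical normal systems, illustrates the comparison step; the sequential sampling needed to reach a prescribed fixed width is omitted from this sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

def q90(s):
    """The distributional property compared here: the 0.9 quantile."""
    return np.quantile(s, 0.9)

def boot_ci_diff(x, y, stat, n_boot=2000, conf=0.95):
    """Percentile-bootstrap CI for stat(x) - stat(y)."""
    diffs = [stat(rng.choice(x, x.size, replace=True))
             - stat(rng.choice(y, y.size, replace=True))
             for _ in range(n_boot)]
    return tuple(np.percentile(diffs, [100 * (1 - conf) / 2,
                                       100 * (1 + conf) / 2]))

# two hypothetical systems differing in location (losses, smaller is better)
sys_a = rng.normal(0.0, 1.0, size=1000)
sys_b = rng.normal(1.0, 1.0, size=1000)
lo, hi = boot_ci_diff(sys_a, sys_b, q90)
prefer_a = hi < 0    # interval excludes 0 -> select the smaller-quantile system
```

Replacing `q90` with `np.mean` (or any other statistic of interest) gives the same comparison for a different distributional property, which is the flexibility the procedure aims for.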

Journal ArticleDOI
TL;DR: This paper describes how simulation was employed in the concept development phase to assess whether production targets required for financial viability were feasible and to identify the critical features of the line on which to focus design-improvement efforts.
Abstract: In this paper, we describe how Delphi Corporation used simulation in the concept-development phase of a new multimillion-dollar fuel injector production line. Delphi wanted to assess the financial viability of production targets and identify the critical features of the line on which it would focus its design-improvement efforts.

Proceedings ArticleDOI
05 Dec 2010
TL;DR: A new method for discrete-decision-variable optimization via simulation that combines the stochastic branch-and-bound method and the nested partitions method, in the sense that it takes advantage of the partitioning structure of stochastic branch and bound but estimates the bounds based on the performance of sampled solutions as the nested partitions method does.
Abstract: We introduce a new method for discrete-decision-variable optimization via simulation that combines the stochastic branch-and-bound method and the nested partitions method in the sense that we take advantage of the partitioning structure of stochastic branch and bound, but estimate the bounds based on the performance of sampled solutions as the nested partitions method does. Our Empirical Stochastic Branch-and-Bound algorithm also uses improvement bounds to guide solution sampling for better performance.