
Papers by Barry L. Nelson published in 2001


Journal ArticleDOI
TL;DR: The procedures presented are appropriate when it is possible to repeatedly obtain small, incremental samples from each simulated system and are based on the assumption of normally distributed data, so the impact of batching is analyzed.
Abstract: We present procedures for selecting the best or near-best of a finite number of simulated systems when best is defined by maximum or minimum expected performance. The procedures are appropriate when it is possible to repeatedly obtain small, incremental samples from each simulated system. The goal of such a sequential procedure is to eliminate, at an early stage of experimentation, those simulated systems that are apparently inferior, and thereby reduce the overall computational effort required to find the best. The procedures we present accommodate unequal variances across systems and the use of common random numbers. However, they are based on the assumption of normally distributed data, so we analyze the impact of batching (to achieve approximate normality or independence) on the performance of the procedures. Comparisons with some existing indifference-zone procedures are also provided.

422 citations
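The elimination logic behind such fully sequential procedures can be sketched in a few lines. The sketch below is not the paper's procedure; it only illustrates the mechanics, assuming normally distributed observations, treating the continuation-region constant h as a user-supplied input (the paper derives its exact value), and using toy normal simulators as stand-ins for real simulation models.

```python
import numpy as np

def sequential_selection(simulators, n0=10, delta=0.5, h=2.5, max_stages=10_000, seed=1):
    """Return the index of the surviving system; larger mean is assumed better."""
    rng = np.random.default_rng(seed)
    k = len(simulators)
    # Stage 0: n0 observations from every system.
    samples = [[sim(rng) for _ in range(n0)] for sim in simulators]
    # First-stage variances of pairwise differences (accommodates unequal variances / CRN).
    s2 = np.zeros((k, k))
    for i in range(k):
        for l in range(k):
            if i != l:
                s2[i, l] = np.var(np.array(samples[i]) - np.array(samples[l]), ddof=1)
    alive = set(range(k))
    r = n0
    while len(alive) > 1 and r < max_stages:
        means = {i: np.mean(samples[i]) for i in alive}
        for i in list(alive):
            for l in alive - {i}:
                # Continuation-region half-width; shrinks as the stage counter r grows.
                w = max(0.0, (delta / (2 * r)) * (h ** 2 * s2[i, l] / delta ** 2 - r))
                if means[i] < means[l] - w:
                    alive.discard(i)      # apparently inferior: eliminate early
                    break
        for i in alive:                    # one more observation per surviving system
            samples[i].append(simulators[i](rng))
        r += 1
    return max(alive, key=lambda i: np.mean(samples[i]))

# Toy usage: three normal "systems" whose true means are 0.0, 0.3 and 1.0.
systems = [lambda rng, m=m: rng.normal(m, 1.0) for m in (0.0, 0.3, 1.0)]
print(sequential_selection(systems))
```

Each surviving system receives one additional observation per stage, and a system is dropped as soon as it falls outside the shrinking continuation region, which is how the overall computational effort is reduced.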


Journal ArticleDOI
TL;DR: The approach is to use the data provided by the first stage of sampling in an R&S procedure to screen out alternatives that are not competitive, and thereby avoid the (typically much larger) second stage sample for these systems.
Abstract: In this paper, we address the problem of finding the simulated system with the best (maximum or minimum) expected performance when the number of alternatives is finite, but large enough that ranking-and-selection (R&S) procedures may require too much computation to be practical. Our approach is to use the data provided by the first stage of sampling in an R&S procedure to screen out alternatives that are not competitive, and thereby avoid the (typically much larger) second-stage sample for these systems. Our procedures represent a compromise between standard R&S procedures--which are easy to implement, but can be computationally inefficient--and fully sequential procedures--which can be statistically efficient, but are more difficult to implement and depend on more restrictive assumptions. We present a general theory for constructing combined screening and indifference-zone selection procedures, several specific procedures and a portion of an extensive empirical evaluation.

313 citations
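The shape of such a combined procedure can be outlined briefly. The sketch below is only an illustration of the screen-then-select idea, not the paper's procedure: the constants t_const (screening) and h_rinott (second-stage sizing) are placeholders for the critical values the paper specifies, larger performance is assumed to be better, and first-stage data are assumed independent and normal.

```python
import math
import numpy as np

def screen_and_select(first_stage, delta, t_const, h_rinott):
    """first_stage: dict {system: 1-D array of n0 first-stage observations}."""
    n0 = len(next(iter(first_stage.values())))
    means = {i: np.mean(y) for i, y in first_stage.items()}
    vars_ = {i: np.var(y, ddof=1) for i, y in first_stage.items()}

    # Screening: keep system i unless some j beats it by more than the screening threshold.
    survivors = []
    for i in first_stage:
        threshold = {j: t_const * math.sqrt(vars_[i] / n0 + vars_[j] / n0)
                     for j in first_stage if j != i}
        if all(means[i] >= means[j] - threshold[j] for j in threshold):
            survivors.append(i)

    # Rinott-style second-stage sample sizes for the survivors only.
    second_stage_n = {i: max(n0, math.ceil((h_rinott * math.sqrt(vars_[i]) / delta) ** 2))
                      for i in survivors}
    return survivors, second_stage_n

# Toy usage with synthetic first-stage data.
rng = np.random.default_rng(7)
stage1 = {name: rng.normal(mu, 1.0, size=20) for name, mu in [("A", 1.0), ("B", 1.2), ("C", 0.2)]}
print(screen_and_select(stage1, delta=0.3, t_const=2.1, h_rinott=3.0))
```

Systems screened out never receive the (typically much larger) second-stage sample, which is where the computational savings come from.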


Journal ArticleDOI
TL;DR: This paper provides two-stage experiment design and analysis procedures to solve the problem of comparing a finite number of stochastic systems with respect to a single system via simulation experiments and provides methods for estimating the critical constants required by the procedures.
Abstract: We consider the problem of comparing a finite number of stochastic systems with respect to a single system (designated as the "standard") via simulation experiments. The comparison is based on expected performance, and the goal is to determine if any system has larger expected performance than the standard, and if so to identify the best of the alternatives. In this paper we provide two-stage experiment design and analysis procedures to solve the problem for a variety of scenarios, including those in which we encounter unequal variances across systems, as well as those in which we use the variance reduction technique of common random numbers and it is appropriate to do so. We emphasize "and it is appropriate to do so" because in some cases common random numbers can be counterproductive when performing comparisons with a standard. We also provide methods for estimating the critical constants required by our procedures, present a portion of an extensive empirical study, and demonstrate one of the procedures via a numerical example.

91 citations
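A bare-bones outline of a two-stage comparison with a standard might look like the following. It is a sketch under strong simplifications, not the authors' procedure: the constant h sizing the second stage and the margin c used in the final decision are placeholders for the critical constants estimated in the paper, and higher expected performance is assumed to be better.

```python
import math
import numpy as np

def compare_with_standard(simulate, k, n0, delta, h, c, seed=0):
    """simulate(i, rng) returns one observation from system i; i = 0 is the standard."""
    rng = np.random.default_rng(seed)
    first = {i: np.array([simulate(i, rng) for _ in range(n0)]) for i in range(k + 1)}
    # Stage 2: total sample size driven by each system's first-stage variance.
    n_total = {i: max(n0, math.ceil((h * np.std(y, ddof=1) / delta) ** 2))
               for i, y in first.items()}
    means = {i: np.mean(np.concatenate([first[i],
                        [simulate(i, rng) for _ in range(n_total[i] - n0)]]))
             for i in range(k + 1)}
    # Retain the standard unless some alternative beats it by more than the margin c.
    best_alt = max(range(1, k + 1), key=lambda i: means[i])
    return 0 if means[best_alt] <= means[0] + c else best_alt

# Toy usage: standard has mean 1.0; the alternatives have means 0.8 and 1.6.
mus = [1.0, 0.8, 1.6]
print(compare_with_standard(lambda i, rng: rng.normal(mus[i], 1.0), k=2,
                            n0=20, delta=0.5, h=3.0, c=0.2))
```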


Proceedings ArticleDOI
09 Dec 2001
TL;DR: This tutorial discusses some statistical procedures for selecting the best of a number of competing systems, some of which assume that the underlying observations arise from competing normal distributions, and some of which are essentially nonparametric in nature.
Abstract: This tutorial discusses some statistical procedures for selecting the best of a number of competing systems. The term "best" may refer to that simulated system having, say, the largest expected value or the greatest likelihood of yielding a large observation. We describe various procedures for finding the best, some of which assume that the underlying observations arise from competing normal distributions, and some of which are essentially nonparametric in nature. In each case, we comment on how to apply the above procedures for use in simulations.

38 citations
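The nonparametric flavour mentioned above can be made concrete with a small example: take vector replications across all competing systems and select the system that most often yields the largest observation. The sketch below only shows the counting mechanics; the number of replications needed to achieve a stated probability of correct selection comes from multinomial-selection tables and is simply an input here.

```python
import numpy as np

def multinomial_selection(simulators, n, seed=0):
    """Count, over n vector replications, how often each system yields the largest value."""
    rng = np.random.default_rng(seed)
    wins = np.zeros(len(simulators), dtype=int)
    for _ in range(n):
        obs = [sim(rng) for sim in simulators]   # one vector replication across all systems
        wins[int(np.argmax(obs))] += 1           # ties are effectively ignored here
    return int(np.argmax(wins)), wins

# Toy usage: two systems with exponential output; the second tends to yield larger values.
best, counts = multinomial_selection(
    [lambda rng: rng.exponential(1.0), lambda rng: rng.exponential(1.5)], n=200)
print(best, counts)
```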


Proceedings ArticleDOI
09 Dec 2001
TL;DR: Sequential Selection with Memory (SSM) as discussed by the authors selects the best or near-best alternative with a user-specified probability when some solutions have already been sampled and their previous samples are retained.
Abstract: We propose fully sequential indifference-zone selection procedures that are specifically for use within an optimization-via-simulation algorithm when simulation is costly and partial or complete information on solutions previously visited is maintained. Sequential Selection with Memory guarantees to select the best or near-best alternative with a user-specified probability when some solutions have already been sampled and their previous samples are retained. For the case when only summary information is retained, we derive a modified procedure. We illustrate how our procedure can be applied to optimization-via-simulation problems and compare its performance with other methods by numerical examples.

31 citations
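The "memory" ingredient can be illustrated apart from the selection guarantee itself. In the sketch below, observations for previously visited solutions are cached and reused, and new replications are taken only to bring each candidate up to the first-stage sample size; the final comparison is a placeholder (largest cached mean), whereas the paper's Sequential Selection with Memory replaces it with a fully sequential indifference-zone comparison. The class and method names are illustrative, not the paper's.

```python
import numpy as np

class SelectionWithMemory:
    """Cache of retained observations for solutions visited during optimization via simulation."""

    def __init__(self, simulate, n0=10, seed=0):
        self.simulate = simulate          # simulate(solution, rng) -> one observation
        self.n0 = n0
        self.rng = np.random.default_rng(seed)
        self.memory = {}                  # solution -> list of retained observations

    def _top_up(self, solution):
        obs = self.memory.setdefault(solution, [])
        while len(obs) < self.n0:         # reuse what we have; simulate only the shortfall
            obs.append(self.simulate(solution, self.rng))

    def select(self, candidates):
        for s in candidates:
            self._top_up(s)
        # Placeholder comparison; SSM would apply its sequential guarantee here.
        return max(candidates, key=lambda s: np.mean(self.memory[s]))
```

Because simulation is costly in this setting, avoiding re-simulation of solutions revisited by the search is the main source of savings.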


Journal ArticleDOI
TL;DR: Two-stage experiment designs are presented for simulation experiments that compare systems in terms of their expected (long-run average) performance, find the best or a near-best system, and identify the subset of systems that are more than a practically insignificant difference from the best.
Abstract: We present two-stage experiment designs for use in simulation experiments that compare systems in terms of their expected (long-run average) performance. These procedures simultaneously achieve the following with a prespecified probability of being correct: (i) find the best system or a near-best system; (ii) identify a subset of systems that are more than a practically insignificant difference from the best; and (iii) provide a lower confidence bound on the probability that the best or near-best system will be selected. All of the procedures assume normally distributed data, but versions allow unequal variances and common random numbers.

26 citations
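The kind of simultaneous output these designs deliver can be illustrated with a short reporting step applied to the overall sample means obtained after the second stage. The sketch below covers only goals (i) and (ii): it returns the sample-best system, the systems retained as within a practically insignificant difference delta of it, and the systems flagged as more than delta below it. The probability guarantee, and the confidence bound in goal (iii), come from the second-stage sample sizes and critical constants in the paper, not from this step.

```python
def report(overall_means, delta):
    """overall_means: dict {system: overall sample mean after the second stage}."""
    best = max(overall_means, key=overall_means.get)
    clearly_inferior = sorted(i for i, m in overall_means.items()
                              if m < overall_means[best] - delta)
    near_best = sorted(i for i in overall_means if i not in clearly_inferior)
    return best, near_best, clearly_inferior

# Toy usage: "C" is flagged as more than delta below the sample best.
print(report({"A": 10.2, "B": 10.1, "C": 8.7}, delta=0.5))
```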


Proceedings ArticleDOI
01 Dec 2001
TL;DR: The authors present a general-purpose input-modeling tool for representing, fitting, and generating random variates from multivariate input processes to drive computer simulations.
Abstract: Providing accurate and automated input modeling support is one of the challenging problems in the application of computer simulation. The authors present a general-purpose input-modeling tool for representing, fitting, and generating random variates from multivariate input processes to drive computer simulations. We explain the theory underlying the suggested data fitting and data generation techniques, and demonstrate that our framework fits models accurately to both univariate and multivariate input processes.

26 citations
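One standard construction for this kind of multivariate input generation, and a reasonable stand-in for the transform-based framework described above, is a NORTA-type sketch: draw a correlated standard normal vector and push each component through the inverse CDF of a fitted marginal. The marginals and base correlation below are illustrative; in practice (and in the fitting step the abstract refers to) the marginal families and the base correlation matrix are chosen so that the transformed vector matches the dependence observed in the data.

```python
import numpy as np
from scipy import stats

def norta_sample(marginals, base_corr, n, seed=0):
    """marginals: list of frozen scipy.stats distributions; base_corr: k x k correlation matrix."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(len(marginals)), base_corr, size=n)
    u = stats.norm.cdf(z)                 # uniforms carrying the normal copula's dependence
    return np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

# Toy usage: correlated gamma and lognormal inputs for a simulation model.
x = norta_sample([stats.gamma(a=2.0, scale=1.5), stats.lognorm(s=0.6)],
                 base_corr=np.array([[1.0, 0.7], [0.7, 1.0]]), n=1000)
print(np.corrcoef(x, rowvar=False))
```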


Proceedings ArticleDOI
01 Jan 2001
TL;DR: This work proposes fully sequential indifference-zone selection procedures that are specifically for use within an optimization-via-simulation algorithm when simulation is costly and partial or complete information on solutions previously visited is maintained.
Abstract: We propose fully sequential indifference-zone selection procedures that are specifically for use within an optimization-via-simulation algorithm when simulation is costly and partial or complete information on solutions previously visited is maintained. Sequential Selection with Memory guarantees to select the best or near-best alternative with a user-specified probability when some solutions have already been sampled and their previous samples are retained. For the case when only summary information is retained, we derive a modified procedure. We illustrate how our procedure can be applied to optimization-via-simulation problems and compare its performance with other methods by numerical examples.

21 citations


Proceedings ArticleDOI
01 Jan 2001
TL;DR: This panel discusses goals and educational strategies for teaching simulation in academia and how to motivate and empower students to analyze complex problems correctly and to prevent the pitfall of misusing the concept.
Abstract: This panel discusses goals and educational strategies for teaching simulation in academia. Clearly, there is considerable material to cover in a single course or a sequence thereof in, say, an undergraduate program. The issue is how to motivate and empower students to analyze complex problems correctly and to prevent the pitfall of misusing the concept.

15 citations


Proceedings ArticleDOI
09 Dec 2001
TL;DR: In this article, the authors discuss goals and educational strategies for teaching simulation in academia and how to motivate and empower students to analyze complex problems correctly and to prevent the pitfall of misusing the concept.
Abstract: This panel discusses goals and educational strategies for teaching simulation in academia. Clearly, there is considerable material to cover in a single course or a sequence thereof in, say, an undergraduate program. The issue is how to motivate and empower students to analyze complex problems correctly and to prevent the pitfall of misusing the concept.

14 citations


Proceedings ArticleDOI
09 Dec 2001
TL;DR: In this article, a general-purpose input-modeling tool for representing, fitting, and generating random variates from multivariate input processes to drive computer simulations is presented, along with the theory underlying the suggested data fitting and data generation techniques.
Abstract: Providing accurate and automated input modeling support is one of the challenging problems in the application of computer simulation. In this paper, we present a general-purpose input-modeling tool for representing, fitting, and generating random variates from multivariate input processes to drive computer simulations. We explain the theory underlying the suggested data fitting and data generation techniques, and demonstrate that our framework fits models accurately to both univariate and multivariate input processes.