scispace - formally typeset
Author

Bahar Biller

Bio: Bahar Biller is an academic researcher from General Electric. The author has contributed to research in topics: Stochastic simulation & Independent and identically distributed random variables. The author has an h-index of 13 and has co-authored 41 publications receiving 628 citations. Previous affiliations of Bahar Biller include SAS Institute & Carnegie Mellon University.

Papers
Journal ArticleDOI
TL;DR: The central idea is to transform a Gaussian vector autoregressive process into the desired multivariate time-series input process, which is presumed to have a VARTA (Vector-Autoregressive-To-Anything) distribution.
Abstract: We present a model for representing stationary multivariate time-series input processes with marginal distributions from the Johnson translation system and an autocorrelation structure specified through some finite lag. We then describe how to generate data accurately to drive computer simulations. The central idea is to transform a Gaussian vector autoregressive process into the desired multivariate time-series input process, which we presume to have a VARTA (Vector-Autoregressive-To-Anything) distribution. We manipulate the autocorrelation structure of the Gaussian vector autoregressive process so that we achieve the desired autocorrelation structure for the simulation input process. We call this the correlation-matching problem and solve it by an algorithm that incorporates a numerical-search procedure and a numerical-integration technique. An illustrative example is included.

131 citations
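The transform the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: an exponential target marginal stands in for the Johnson translation system, the VAR(1) coefficient matrix is chosen arbitrarily, and the correlation-matching step is omitted.

```python
# Minimal VARTA-style sketch: drive a Gaussian VAR(1) base process, then
# push each component through Phi and an inverse target CDF.
# Assumptions: exponential(1) marginals and an ad hoc coefficient matrix A.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 2-component Gaussian VAR(1) base: Z_t = A Z_{t-1} + e_t (stationary here,
# since the eigenvalues of A are 0.6 and 0.3)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])
n, burn = 10_000, 500
Z = np.zeros((n + burn, 2))
for t in range(1, n + burn):
    Z[t] = A @ Z[t - 1] + rng.standard_normal(2)
Z = Z[burn:]

# Standardize each component so Phi(Z) is a valid probability transform
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)

# Transform to the desired marginals: X_t = F^{-1}(Phi(Z_t))
U = stats.norm.cdf(Z)
X = stats.expon.ppf(U)   # autocorrelated series with exponential(1) marginals

print(X.mean(axis=0))    # each component mean is close to 1
```

The resulting series inherits its serial dependence from the Gaussian base while each marginal is exactly exponential; matching a *target* autocorrelation structure is the correlation-matching problem the paper solves.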

Journal ArticleDOI
TL;DR: An automated and statistically valid algorithm is presented to fit autoregressive-to-anything (ARTA) processes with marginal distributions from the Johnson translation system to stationary univariate time-series data.
Abstract: Providing accurate and automated input-modeling support is one of the challenging problems in the application of computer simulation of stochastic systems. The models incorporated in current input-modeling software packages often fall short because they assume independent and identically distributed processes, even though dependent time-series input processes occur naturally in the simulation of many real-life systems. Therefore, this paper introduces a statistical methodology for fitting stochastic models to dependent time-series input processes. Specifically, an automated and statistically valid algorithm is presented to fit autoregressive-to-anything (ARTA) processes with marginal distributions from the Johnson translation system to stationary univariate time-series data. ARTA processes are particularly well suited to driving stochastic simulations. The use of this algorithm is illustrated with examples.

68 citations

Journal ArticleDOI
TL;DR: A Bayesian model is incorporated into the simulation replication algorithm for the joint representation of stochastic uncertainty and parameter uncertainty in the mean performance estimate and the confidence interval; the model improves both the consistency of the mean line-item fill-rate estimates and the coverage of the confidence intervals in multiproduct inventory simulations with correlated demands.
Abstract: This paper considers large-scale stochastic simulations with correlated inputs having normal-to-anything (NORTA) distributions with arbitrary continuous marginal distributions. Examples of correlated inputs include processing times of workpieces across several workcenters in manufacturing facilities and product demands and exchange rates in global supply chains. Our goal is to obtain mean performance measures and confidence intervals for simulations with such correlated inputs by accounting for the uncertainty around the NORTA distribution parameters estimated from finite historical input data. This type of uncertainty is known as the parameter uncertainty in the discrete-event stochastic simulation literature. We demonstrate how to capture parameter uncertainty with a Bayesian model that uses Sklar's marginal-copula representation and Cooke's copula-vine specification for sampling the parameters of the NORTA distribution. The development of such a Bayesian model well suited for handling many correlated inputs is the primary contribution of this paper. We incorporate the Bayesian model into the simulation replication algorithm for the joint representation of stochastic uncertainty and parameter uncertainty in the mean performance estimate and the confidence interval. We show that our model improves both the consistency of the mean line-item fill-rate estimates and the coverage of the confidence intervals in multiproduct inventory simulations with correlated demands.

55 citations
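The NORTA construction underlying this paper can be sketched for a two-component demand vector. This illustration takes the base correlation matrix as given rather than solving for it, and uses gamma and lognormal marginals as stand-ins for the paper's arbitrary continuous marginals; the Bayesian treatment of parameter uncertainty is not shown.

```python
# NORTA sampling sketch: correlated standard normals -> correlated uniforms
# via Phi -> arbitrary marginals via inverse CDFs.
# Assumptions: base correlation 0.7 chosen for illustration; gamma and
# lognormal marginals stand in for fitted demand distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Correlation matrix of the latent Gaussian vector
Sigma_z = np.array([[1.0, 0.7],
                    [0.7, 1.0]])
L = np.linalg.cholesky(Sigma_z)

n = 50_000
Z = rng.standard_normal((n, 2)) @ L.T        # correlated N(0,1) pair
U = stats.norm.cdf(Z)                        # correlated uniforms
X = np.column_stack([
    stats.gamma.ppf(U[:, 0], a=2.0),         # demand 1: gamma marginal
    stats.lognorm.ppf(U[:, 1], s=0.5),       # demand 2: lognormal marginal
])

print(np.corrcoef(X.T)[0, 1])   # positive, somewhat below the base 0.7
```

In the paper the parameters of this construction are themselves uncertain and are sampled from a posterior via the marginal-copula and copula-vine representations, rather than fixed as here.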

Journal ArticleDOI
TL;DR: This paper considers a repeated newsvendor setting in which the demand distribution and its parameters are not known with certainty, and studies the problem of setting inventory targets from a limited amount of historical demand data, quantifying the inaccuracy in the inventory-target estimation as a function of the length of the historical demand data, the critical fractile, and the shape parameters of the demand distribution.
Abstract: Most of the literature on inventory management assumes that the demand distribution and the values of its parameters are known with certainty. In this paper, we consider a repeated newsvendor setting where this is not the case and study the problem of setting inventory targets when there is a limited amount of historical demand data. Consequently, we achieve the following objectives: (1) to quantify the inaccuracy in the inventory-target estimation as a function of the length of the historical demand data, the critical fractile, and the shape parameters of the demand distribution; and (2) to determine the inventory target that minimizes the expected cost and accounts for the uncertainty around the demand parameters estimated from limited historical data. We achieve these objectives by using the concept of expected total operating cost and representing the demand distribution with the highly flexible Johnson translation system. Our procedures require no restrictive assumptions about the first four moments of the demand random variables, and they can be easily implemented in practical settings with reduced expected total operating costs.

51 citations
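The estimation error the paper quantifies is easy to reproduce for a naive benchmark. This sketch is not the paper's proposed procedure: it uses a lognormal demand in place of the Johnson translation system and a plug-in maximum-likelihood target, and simply shows how the error shrinks as the demand history grows.

```python
# Repeated-newsvendor sketch: how far is a plug-in inventory target from the
# true critical-fractile quantile when demand history is short?
# Assumptions: lognormal demand; costs cu=4, co=1 (critical fractile 0.8);
# naive MLE plug-in estimator, not the paper's expected-total-cost approach.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

cu, co = 4.0, 1.0                   # underage and overage unit costs
q = cu / (cu + co)                  # critical fractile = 0.8

# True demand distribution; the optimal target is its q-quantile
true_dist = stats.lognorm(s=0.6, scale=100.0)
optimal_target = true_dist.ppf(q)

lengths = [10, 50, 500]             # lengths of historical demand data
mean_errs = []
for m in lengths:
    errs = []
    for _ in range(200):
        sample = true_dist.rvs(m, random_state=rng)
        # plug-in: fit a lognormal by MLE, read off the q-quantile
        s_hat, loc_hat, scale_hat = stats.lognorm.fit(sample, floc=0.0)
        est = stats.lognorm.ppf(q, s_hat, loc_hat, scale_hat)
        errs.append(abs(est - optimal_target) / optimal_target)
    mean_errs.append(np.mean(errs))

print(dict(zip(lengths, mean_errs)))  # relative error shrinks with history
```

The paper's contribution is precisely to replace this plug-in target with one that accounts for the parameter uncertainty visible in the short-history errors above.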

Journal ArticleDOI
TL;DR: A copula-based multivariate time-series input model, which includes VARTA as a special case, allows the development of statistically valid fitting and fast sampling algorithms well suited for driving large-scale stochastic simulations.
Abstract: As large-scale discrete-event stochastic simulation becomes a tool that is used routinely for the design and analysis of stochastic systems, the need for input-modeling support with the ability to represent complex interactions and interdependencies among the components of multivariate time-series input processes is more critical than ever. Motivated by the failure of independent and identically distributed random variables to represent such input processes, a comprehensive framework called Vector-Autoregressive-To-Anything (VARTA) has been introduced for multivariate time-series input modeling. Despite its flexibility in capturing a wide variety of distributional shapes, we show that VARTA falls short in representing dependence structures that arise in situations where extreme component realizations occur together. We demonstrate that it is possible to extend VARTA to work for such dependence structures via the use of the copula theory, which has been used primarily for random vectors in the simulation input-modeling literature, for multivariate time-series input modeling. We show that our copula-based multivariate time-series input model, which includes VARTA as a special case, allows the development of statistically valid fitting and fast sampling algorithms well suited for driving large-scale stochastic simulations.

48 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: The Analysis of Time Series: An Introduction, 4th edn., by C. Chatfield. Chapman and Hall, London, 1989. ISBN 0 412 31820 2.
Abstract: The Analysis of Time Series: An Introduction, 4th edn. By C. Chatfield. ISBN 0 412 31820 2. Chapman and Hall, London, 1989. 242 pp. £13.50.

1,583 citations

Book
22 Sep 2014
TL;DR: This nontechnical textbook is focused towards the needs of business, engineering and computer science students and aims to improve efficiency and effectiveness in simulation modelling.
Abstract: Simulation modelling involves the development of models that imitate real-world operations, and statistical analysis of their performance with a view to improving efficiency and effectiveness. This nontechnical textbook is focused on the needs of business, engineering and computer science students.

1,019 citations

Book
01 Sep 2014
TL;DR: It is quite impossible to include, in a single volume of reasonable size, an adequate and exhaustive discussion of the calculus in its more advanced stages, so it becomes necessary, in planning a thoroughly sound course in the subject, to consider several important aspects of the vast field confronting a modern writer.
Abstract: WITH the ever-widening scope of modern mathematical analysis and its many ramifications, it is quite impossible to include, in a single volume of reasonable size, an adequate and exhaustive discussion of the calculus in its more advanced stages. It therefore becomes necessary, in planning a thoroughly sound course in the subject, to consider several important aspects of the vast field confronting a modern writer. The limitation of space renders the selection of subject-matter fundamentally dependent upon the aim of the course, which may or may not be related to the content of specific examination syllabuses. Logical development, too, may lead to the inclusion of many topics which, at present, may only be of academic interest, while others, of greater practical value, may have to be omitted. The experience and training of the writer may also have, more or less, a bearing on both these considerations. Advanced Calculus. By Dr. C. A. Stewart. Pp. xviii + 523. (London: Methuen and Co., Ltd., 1940.) 25s.

881 citations