Author

Gerald S. Shedler

Other affiliations: Stanford University
Bio: Gerald S. Shedler is an academic researcher from IBM. The author has contributed to research on stochastic Petri nets and Petri nets. The author has an h-index of 21 and has co-authored 51 publications receiving 2,044 citations. Previous affiliations of Gerald S. Shedler include Stanford University.


Papers
Journal ArticleDOI
TL;DR: In this article, a simple and relatively efficient method for simulating one-dimensional and two-dimensional nonhomogeneous Poisson processes is presented, which is applicable for any rate function and is based on controlled deletion of points in a Poisson process whose rate function dominates the given rate function.
Abstract: A simple and relatively efficient method for simulating one-dimensional and two-dimensional nonhomogeneous Poisson processes is presented. The method is applicable for any rate function and is based on controlled deletion of points in a Poisson process whose rate function dominates the given rate function. In its simplest implementation, the method obviates the need for numerical integration of the rate function, for ordering of points, and for generation of Poisson variates.
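The thinning method the abstract describes fits in a few lines: generate a homogeneous Poisson process at a dominating rate, then keep each point with probability equal to the ratio of the true rate to the dominating rate. A minimal sketch (the sinusoidal rate function, horizon, and dominating rate below are illustrative choices, not from the paper):

```python
import math
import random

def thin_nhpp(rate, rate_max, t_end, rng=random):
    """Simulate a nonhomogeneous Poisson process on [0, t_end] by thinning.

    Candidate points come from a homogeneous Poisson process with rate
    rate_max, which must dominate rate(t) on the whole interval; each
    candidate t is retained with probability rate(t) / rate_max.
    """
    times, t = [], 0.0
    while True:
        # Exponential inter-arrival time of the dominating process.
        t += rng.expovariate(rate_max)
        if t > t_end:
            return times
        # Controlled deletion: keep the point with probability rate(t)/rate_max.
        if rng.random() < rate(t) / rate_max:
            times.append(t)

# Illustrative sinusoidal rate, dominated by its maximum value 2.0.
events = thin_nhpp(lambda t: 1.0 + math.sin(t), 2.0, 100.0, rng=random.Random(1))
```

Note that no integration or ordering step is needed: the candidate points are generated in increasing order, and deletion preserves that order.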

890 citations

Book
15 Oct 1992
TL;DR: This book develops regenerative methods for discrete-event simulation, covering regenerative stochastic processes, networks of queues, passage times, and simulations with simultaneous events.
Abstract: Preface. Discrete-Event Simulations. Regenerative Stochastic Processes. Regenerative Simulation. Networks of Queues. Passage Times. Simulations With Simultaneous Events. Appendix A. Limit Theorems for Stochastic Processes. Appendix B. Random Number Generation.
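The regenerative method at the center of this book estimates a steady-state quantity as a ratio of expectations over independent, identically distributed regeneration cycles. A minimal sketch for an M/M/1 queue (an illustrative example, not taken from the book), with a cycle beginning each time an arrival finds the system empty:

```python
import random

def mm1_regenerative(lam, mu, n_cycles, rng):
    """Estimate the steady-state mean number in an M/M/1 queue by the
    regenerative method: the point estimate is the ratio
    (sum of per-cycle areas under the queue-length path) /
    (sum of cycle lengths)."""
    total_area = total_time = 0.0
    for _ in range(n_cycles):
        n, cycle_area, cycle_time = 1, 0.0, 0.0  # cycle opens with an arrival
        while n > 0:
            # Next event: competing exponential arrival and service clocks.
            rate = lam + mu
            dt = rng.expovariate(rate)
            cycle_area += n * dt
            cycle_time += dt
            n += 1 if rng.random() < lam / rate else -1
        # Idle period until the next regenerating arrival (queue empty).
        cycle_time += rng.expovariate(lam)
        total_area += cycle_area
        total_time += cycle_time
    return total_area / total_time

est = mm1_regenerative(0.5, 1.0, 20000, random.Random(42))
# For lam = 0.5, mu = 1.0 the true mean is rho / (1 - rho) = 1.0.
```

Because the cycles are i.i.d., the same cycle sums support classical confidence intervals for the ratio estimator, which is the main statistical payoff of the regenerative approach.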

139 citations

Journal ArticleDOI
TL;DR: This paper describes methods, both old and new, for the statistical analysis of non-stationary univariate stochastic point processes and sequences of positive random variables, as encountered in computer systems.
Abstract: Central problems in the performance evaluation of computer systems are the description of the behavior of the system and characterization of the workload. One approach to these problems comprises the interactive combination of data-analytic procedures with probability modeling. This paper describes methods, both old and new, for the statistical analysis of non-stationary univariate stochastic point processes and sequences of positive random variables. Such processes are frequently encountered in computer systems. As an illustration of the methodology an analysis is given of the stochastic point process of transactions initiated in a running data base system. On the basis of the statistical analysis, a non-homogeneous Poisson process model for the transaction initiation process is postulated for periods of high system activity and found to be an adequate characterization of the data. For periods of lower system activity, the transaction initiation process has a complex structure, with more clustering evident. Overall models of this type have application to the validation of proposed data base subsystem models.

93 citations

Patent
29 Jun 1979
TL;DR: This patent describes an apparatus and method for enabling asynchronous, collision-free communication between ports on a local shared bus network which is efficient in the use of bus bandwidth and which, in one embodiment, provides a bounded, guaranteed time to transmission for each port.
Abstract: Apparatus and method for enabling asynchronous, collision-free communication between ports on a local shared bus network which is efficient in the use of bus bandwidth and which, in one embodiment, provides a bounded, guaranteed time to transmission for each port.
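The patent text quoted here does not spell out the arbitration mechanism, but a "bounded, guaranteed time to transmission" is the defining property of cyclic (round-robin style) arbitration, as opposed to collision-based schemes. A hypothetical sketch of that general idea, not the patented apparatus itself:

```python
from collections import deque

def round_robin_schedule(ports, frames_per_turn=1):
    """Collision-free arbitration sketch: ports transmit in a fixed cyclic
    order, so with n ports each waits at most (n - 1) * frames_per_turn
    slots for its next turn -- a bounded, guaranteed time to transmission.
    `ports` maps a port name to a queue of pending frames; returns the
    transmission order for one full cycle of the bus."""
    order = []
    for name, queue in ports.items():
        for _ in range(frames_per_turn):
            if queue:
                order.append((name, queue.popleft()))
    return order

pending = {"A": deque(["a1", "a2"]), "B": deque(["b1"]), "C": deque()}
print(round_robin_schedule(pending))  # prints [('A', 'a1'), ('B', 'b1')]
```

Idle ports (like "C" above) simply yield their slot, which is where the bandwidth efficiency claim of such schemes comes from.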

68 citations

Book
28 Feb 1980
TL;DR: This monograph develops regenerative simulation of closed networks of queues, presenting the marked job method, marked job simulation via hitting times, and the decomposition method, with extensions to finite capacity open networks and multiple job types.
Abstract: 1.0 Introduction.- 2.0 Simulation of regenerative processes.- 3.0 Closed networks of queues.- 4.0 The marked job method.- 5.0 Examples and simulation results.- 6.0 Finite capacity open networks of queues.- 7.0 Marked job simulation via hitting times.- 8.0 The decomposition method.- 9.0 Efficiency of simulation.- 10.0 Networks with multiple job types.- 11.0 Implementation considerations.

64 citations


Cited by
Journal ArticleDOI
01 Apr 1989
TL;DR: The author proceeds with introductory modeling examples, behavioral and structural properties, three methods of analysis, subclasses of Petri nets and their analysis, and one section is devoted to marked graphs, the concurrent system model most amenable to analysis.
Abstract: Starts with a brief review of the history and the application areas considered in the literature. The author then proceeds with introductory modeling examples, behavioral and structural properties, three methods of analysis, subclasses of Petri nets and their analysis. In particular, one section is devoted to marked graphs, the concurrent system model most amenable to analysis. Introductory discussions on stochastic nets with their application to performance modeling, and on high-level nets with their application to logic programming, are provided. Also included are recent results on reachability criteria. Suggestions are provided for further reading on many subject areas of Petri nets.
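All of the analysis methods this survey covers rest on the same token game: a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs. A minimal sketch (the place and transition names are illustrative):

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from the input places (pre)
    and deposit tokens in the output places (post)."""
    assert enabled(marking, pre)
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Toy net: a single transition t moving the one token from p1 to p2.
m0 = {"p1": 1, "p2": 0}
m1 = fire(m0, pre={"p1": 1}, post={"p2": 1})
# m1 == {"p1": 0, "p2": 1}; t is no longer enabled in m1.
```

Reachability analysis, the survey's recurring theme, is then a search over the markings generated by repeated firing.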

10,755 citations

Journal ArticleDOI
TL;DR: A review of Billingsley's Convergence of Probability Measures, the standard reference on weak convergence of probability measures on metric spaces.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4“. 117s.

5,689 citations

Journal ArticleDOI
TL;DR: This chapter reviews the main methods for generating random variables, vectors and processes in non-uniform random variate generation, and provides information on the expected time complexity of various algorithms before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods.

3,304 citations

Book
16 Apr 1986
TL;DR: A survey of the main methods in non-uniform random variate generation can be found in this article, where the authors provide information on the expected time complexity of various algorithms, before addressing modern topics such as indirectly specified distributions, random processes and Markov chain methods.
Abstract: This is a survey of the main methods in non-uniform random variate generation, and highlights recent research on the subject. Classical paradigms such as inversion, rejection, guide tables, and transformations are reviewed. We provide information on the expected time complexity of various algorithms, before addressing modern topics such as indirectly specified distributions, random processes, and Markov chain methods. Authors' address: School of Computer Science, McGill University, 3480 University Street, Montreal, Canada H3A 2K6. The authors' research was sponsored by NSERC Grant A3456 and FCAR Grant 90-ER-0291. 1. The main paradigms. The purpose of this chapter is to review the main methods for generating random variables, vectors and processes. Classical workhorses such as the inversion method, the rejection method and table methods are reviewed in section 1. In section 2, we discuss the expected time complexity of various algorithms, and give a few examples of the design of generators that are uniformly fast over entire families of distributions. In section 3, we develop a few universal generators, such as generators for all log concave distributions on the real line. Section 4 deals with random variate generation when distributions are indirectly specified, e.g., via Fourier coefficients, characteristic functions, the moments, the moment generating function, distributional identities, infinite series or Kolmogorov measures. Random processes are briefly touched upon in section 5. Finally, the latest developments in Markov chain methods are discussed in section 6. Some of this work grew from Devroye (1986a), and we are carefully documenting work that was done since 1986. More recent references can be found in the book by Hörmann, Leydold and Derflinger (2004). Non-uniform random variate generation is concerned with the generation of random variables with certain distributions.
Such random variables are often discrete, taking values in a countable set, or absolutely continuous, and thus described by a density. The methods used for generating them depend upon the computational model one is working with, and upon the demands on the part of the output. For example, in a RAM (random access memory) model, one accepts that real numbers can be stored and operated upon (compared, added, multiplied, and so forth) in one time unit. Furthermore, this model assumes that a source capable of producing an i.i.d. (independent identically distributed) sequence of uniform [0, 1] random variables is available. This model is of course unrealistic, but designing random variate generators based on it has several advantages: first of all, it allows one to disconnect the theory of non-uniform random variate generation from that of uniform random variate generation, and secondly, it permits one to plan for the future, as more powerful computers will be developed that permit ever better approximations of the model. Algorithms designed under finite approximation limitations will have to be redesigned when the next generation of computers arrives. For the generation of discrete or integer-valued random variables, which includes the vast area of the generation of random combinatorial structures, one can adhere to a clean model, the pure bit model, in which each bit operation takes one time unit, and storage can be reported in terms of bits. Typically, one now assumes that an i.i.d. sequence of independent perfect bits is available. In this model, an elegant information-theoretic theory can be derived. For example, Knuth and Yao (1976) showed that to generate a random integer X described by the probability distribution P{X = n} = p_n, n ≥ 1, any method must use an expected number of bits greater than the binary entropy of the distribution, ∑_n p_n log₂(1/p_n).
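The two classical paradigms the survey opens with are compact enough to sketch directly. Below, inversion generates an exponential variate by applying the inverse CDF to a uniform, and rejection samples the half-normal density from an exponential proposal; the specific target, proposal, and constant are illustrative textbook choices, not code from the survey:

```python
import math
import random

def exponential_by_inversion(lam, u):
    """Inversion: if U ~ Uniform(0, 1), then F^{-1}(U) = -ln(1 - U)/lam
    has the Exponential(lam) distribution."""
    return -math.log(1.0 - u) / lam

def halfnormal_by_rejection(rng):
    """Rejection: sample the half-normal density f(x) ∝ exp(-x²/2), x ≥ 0,
    using an Exponential(1) proposal g(x) = exp(-x).  Here f ≤ c·g with
    c = sqrt(2e/π), so a candidate x from g is accepted with probability
    f(x) / (c·g(x)) = exp(-(x - 1)²/2)."""
    while True:
        x = rng.expovariate(1.0)  # candidate from the proposal g
        if rng.random() < math.exp(-0.5 * (x - 1.0) ** 2):
            return x

rng = random.Random(7)
sample = [halfnormal_by_rejection(rng) for _ in range(10000)]
# Half-normal mean is sqrt(2/π) ≈ 0.798, which the sample mean approaches.
```

Inversion uses exactly one uniform per variate but needs an invertible CDF; rejection trades that requirement for a random (here geometric, mean c ≈ 1.32) number of proposals per variate, which is the time-complexity trade-off the survey's section 2 quantifies.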

3,217 citations

Journal ArticleDOI
TL;DR: Specific aspects of cache memories investigated include: the cache fetch algorithm (demand versus prefetch), the placement and replacement algorithms, line size, store-through versus copy-back updating of main memory, cold-start versus warm-start miss ratios, multicache consistency, the effect of input/output through the cache, the behavior of split data/instruction caches, and cache size.
Abstract: design issues. Specific aspects of cache memories that are investigated include: the cache fetch algorithm (demand versus prefetch), the placement and replacement algorithms, line size, store-through versus copy-back updating of main memory, cold-start versus warm-start miss ratios, multicache consistency, the effect of input/output through the cache, the behavior of split data/instruction caches, and cache size. Our discussion includes other aspects of memory system architecture, including translation lookaside buffers. Throughout the paper, we use as examples the implementation of the cache in the Amdahl 470V/6 and 470V/7, the IBM 3081, 3033, and 370/168, and the DEC VAX 11/780. An extensive bibliography is provided.
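Several of the notions the survey investigates (demand fetch, the replacement algorithm, cold-start miss ratios) can be illustrated with a toy fully associative LRU simulator; the trace and cache size below are illustrative, not from the paper:

```python
from collections import OrderedDict

def lru_miss_ratio(trace, cache_lines):
    """Simulate a demand-fetch, fully associative LRU cache and return
    its miss ratio on a reference trace (cold-start: the cache is empty)."""
    cache = OrderedDict()  # keys ordered least- to most-recently used
    misses = 0
    for addr in trace:
        if addr in cache:
            cache.move_to_end(addr)        # hit: mark most recently used
        else:
            misses += 1                    # miss: demand-fetch the line
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict the least recently used
    return misses / len(trace)

# A trace that cycles through 4 lines defeats a 3-line LRU cache entirely:
ratio = lru_miss_ratio([1, 2, 3, 4] * 25, cache_lines=3)
# ratio == 1.0 (every reference misses)
```

The cyclic-trace example shows why replacement policy and cache size interact: the same trace on a 4-line cache would miss only on the 4 cold-start references.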

1,614 citations