scispace - formally typeset

Phase-type distribution

About: Phase-type distribution is a research topic. Over the lifetime, 777 publications have been published within this topic receiving 22970 citations.


Papers
Monograph · DOI
TL;DR: This monograph treats quasi-birth-and-death (QBD) processes, developing the method of phases (PH distributions and Markovian point processes), the matrix-geometric distribution, and algorithms for computing the rate matrix.
Abstract: Preface Part I. Quasi-Birth-and-Death Processes. 1. Examples Part II. The Method of Phases. 2. PH Distributions 3. Markovian Point Processes Part III. The Matrix-Geometric Distribution. 4. Birth-and-Death Processes 5. Processes Under a Taboo 6. Homogeneous QBDs 7. Stability Condition Part IV. Algorithms. 8. Algorithms for the Rate Matrix 9. Spectral Analysis 10. Finite QBDs 11. First Passage Times Part V. Beyond Simple QBDs. 12. Nonhomogeneous QBDs 13. Processes, Skip-Free in One Direction 14. Tree Processes 15. Product Form Networks 16. Nondenumerable States Bibliography Index.

1,940 citations
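A phase-type (PH) distribution, as covered in Part II above, is the distribution of the time to absorption of a finite continuous-time Markov chain with one absorbing state, specified by an initial probability vector α over the transient states and a subgenerator matrix T. A minimal sampling sketch (the Erlang-2 example and all parameter values are illustrative assumptions, not taken from the monograph):

```python
import numpy as np

def sample_phase_type(alpha, T, rng):
    """Sample one absorption time of the CTMC with initial
    distribution alpha over transient states and subgenerator T
    (exit rates to the absorbing state are t0 = -T @ 1)."""
    n = len(alpha)
    t0 = -T.sum(axis=1)            # absorption rates from each state
    state = rng.choice(n, p=alpha)
    time = 0.0
    while True:
        rate = -T[state, state]    # total rate of leaving `state`
        time += rng.exponential(1.0 / rate)
        # jump probabilities: other transient states, then absorption
        probs = np.append(T[state], t0[state])
        probs[state] = 0.0
        probs /= rate
        nxt = rng.choice(n + 1, p=probs)
        if nxt == n:               # absorbed
            return time
        state = nxt

# Erlang-2 with rate 3 as a PH distribution: mean should be 2/3
rng = np.random.default_rng(0)
alpha = np.array([1.0, 0.0])
T = np.array([[-3.0, 3.0],
              [0.0, -3.0]])
samples = [sample_phase_type(alpha, T, rng) for _ in range(20000)]
print(np.mean(samples))          # close to 2/3
```

The sample mean converges to -α T⁻¹ 1, the standard PH mean formula.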

Journal Article · DOI
TL;DR: In this article, the authors define Markov chains via their transition probability matrices and describe the long-run behavior of chains with different types of states.
Abstract: Stochastic Modeling. Probability Review. The Major Discrete Distributions. Important Continuous Distributions. Some Elementary Exercises. Useful Functions, Integrals, and Sums. Conditional Probability and Conditional Expectation: The Discrete Case. The Dice Game Craps. Random Sums. Conditioning on a Continuous Random Variable. Markov Chains: Introduction: Definitions. Transition Probability Matrices of a Markov Chain. Some Markov Chain Models. First Step Analysis. Some Special Markov Chains. Functionals of Random Walks and Success Runs. Another Look at First Step Analysis. The Long Run Behavior of Markov Chains: Regular Transition Probability Matrices. Examples. The Classification of States. The Basic Limit Theorem of Markov Chains. Reducible Markov Chains. Sequential Decisions and Markov Chains. Poisson Processes: The Poisson Distribution and the Poisson Process. The Law of Rare Events. Distributions Associated with the Poisson Process. The Uniform Distribution and Poisson Processes. Spatial Poisson Processes. Compound and Marked Poisson Processes. Continuous Time Markov Chains: Pure Birth Processes. Pure Death Processes. Birth and Death Processes. The Limiting Behavior of Birth and Death Processes. Birth and Death Processes with Absorbing States. Finite State Continuous Time Markov Chains. Set Valued Processes. Renewal Phenomena: Definition of a Renewal Process and Related Concepts. Some Examples of Renewal Processes. The Poisson Process Viewed as a Renewal Process. The Asymptotic Behavior of a Renewal Process.

1,257 citations
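The basic limit theorem mentioned in the table of contents says that for a regular (irreducible, aperiodic) transition matrix P, the powers Pⁿ converge to a rank-one matrix whose identical rows are the stationary distribution π, satisfying π = πP. A toy sketch (the 3-state matrix is an invented example):

```python
import numpy as np

# A regular 3-state transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# P^50 has numerically converged: every row equals pi.
Pn = np.linalg.matrix_power(P, 50)
pi = Pn[0]                        # any row of the limit works

print(np.allclose(pi, pi @ P))    # pi is stationary -> True
```

The convergence rate is governed by the second-largest eigenvalue modulus of P.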

Journal Article · DOI
TL;DR: In this paper, a continuous-time method of approximating a given distribution π using the Langevin diffusion dL_t = dW_t + (1/2)∇log π(L_t) dt was considered, and conditions were found under which this diffusion does or does not converge exponentially quickly to π.
Abstract: In this paper we consider a continuous-time method of approximating a given distribution π using the Langevin diffusion dL_t = dW_t + (1/2)∇log π(L_t) dt. We find conditions under which this diffusion converges exponentially quickly to π or does not: in one dimension, these are essentially that for distributions with exponential tails of the form π(x) ∝ exp(−γ|x|^β), 0 < β < ∞, exponential convergence occurs if and only if β ≥ 1. We then consider conditions under which the discrete approximations to the diffusion converge. We first show that even when the diffusion itself converges, naive discretizations need not do so. We then consider a 'Metropolis-adjusted' version of the algorithm, and find conditions under which this also converges at an exponential rate: perhaps surprisingly, the Metropolized version need not converge exponentially fast even if the diffusion does. We briefly discuss a truncated form of the algorithm which, in practice, should avoid the difficulties of the other forms.

1,126 citations
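The 'Metropolis-adjusted' version described above uses an Euler step of the Langevin diffusion as a proposal and corrects it with a Metropolis accept/reject so the chain targets π exactly. A minimal sketch of this algorithm (MALA), with a standard normal target and step size chosen purely for illustration:

```python
import numpy as np

def mala(log_pi, grad_log_pi, x0, step, n_steps, rng):
    """Metropolis-adjusted Langevin algorithm: propose an Euler step
    of dL_t = dW_t + (1/2) grad log pi(L_t) dt, then accept/reject."""
    def log_q(x_to, x_from):
        # log density of the Gaussian Euler proposal (up to a constant)
        mu = x_from + 0.5 * step * grad_log_pi(x_from)
        return -np.sum((x_to - mu) ** 2) / (2.0 * step)

    x = np.asarray(x0, dtype=float)
    chain = []
    for _ in range(n_steps):
        prop = (x + 0.5 * step * grad_log_pi(x)
                + np.sqrt(step) * rng.standard_normal(x.shape))
        log_a = (log_pi(prop) + log_q(x, prop)
                 - log_pi(x) - log_q(prop, x))
        if np.log(rng.uniform()) < log_a:
            x = prop
        chain.append(x.copy())
    return np.array(chain)

# target: standard normal, so grad log pi(x) = -x
rng = np.random.default_rng(1)
chain = mala(lambda x: -0.5 * np.sum(x ** 2),
             lambda x: -x,
             x0=np.zeros(1), step=0.5, n_steps=20000, rng=rng)
print(chain[5000:].mean(), chain[5000:].var())  # near 0 and 1
```

Dropping the accept/reject step gives the naive (unadjusted) discretization that the paper shows can fail to converge.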

Journal Article · DOI
TL;DR: In this paper, it was shown that the equilibrium distribution of the queue length Q is a geometric series mixed with a concentration at Q = 0, and that the equilibrium waiting-time distribution is a negative-exponential distribution mixed with a concentration at w = 0, with detailed results for s = 1, 2, and 3 servers.
Abstract: The stochastic processes which occur in the theory of queues are in general not Markovian and special methods are required for their analysis. In many cases the problem can be greatly simplified by restricting attention to an imbedded Markov chain. In this paper some recent work on single-server queues is first reviewed from this standpoint, and the method is then applied to the analysis of the following many-server queuing-system: Input: the inter-arrival times are independently and identically distributed in an arbitrary manner. Queue-discipline: "first come, first served." Service-mechanism: a general number, $s$, of servers; negative-exponential service-times. If $Q$ is the number of people waiting at an instant just preceding the arrival of a new customer, and if $w$ is the waiting time of an arbitrary customer, then it will be shown that the equilibrium distribution of $Q$ is a geometric series mixed with a concentration at $Q = 0$ and that the equilibrium distribution of $w$ is a negative-exponential distribution mixed with a concentration at $w = 0$. (In the particular case of a single server this property of the waiting-time distribution was first discovered by W. L. Smith.) The paper concludes with detailed formulae and numerical results for the following particular cases: Numbers of servers: s = 1, 2 and 3. Types of input: (i) Poissonian and (ii) regular.

988 citations
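In the single-server case (GI/M/1), the geometric/exponential form of the equilibrium comes from the root σ ∈ (0, 1) of σ = A*(μ(1 − σ)), where A* is the Laplace-Stieltjes transform of the interarrival-time distribution and μ the service rate; the stationary waiting time is then an atom of mass 1 − σ at zero mixed with the exponential tail P(W > t) = σ e^{−μ(1−σ)t}. A fixed-point sketch for the paper's 'regular' (deterministic) input case, with illustrative parameter values:

```python
import math

def gi_m_1_sigma(lst, mu, tol=1e-12):
    """Solve sigma = A*(mu * (1 - sigma)) by fixed-point iteration,
    where lst(theta) is the Laplace-Stieltjes transform of the
    interarrival-time distribution of a stable GI/M/1 queue."""
    sigma = 0.5
    for _ in range(10000):
        nxt = lst(mu * (1.0 - sigma))
        if abs(nxt - sigma) < tol:
            return nxt
        sigma = nxt
    return sigma

# regular (deterministic) input: interarrival time d = 1, so
# A*(theta) = exp(-theta * d); service rate mu = 2 (utilization 0.5)
mu = 2.0
sigma = gi_m_1_sigma(lambda theta: math.exp(-theta * 1.0), mu)
# waiting time: atom of size 1 - sigma at 0, exponential tail
# sigma * exp(-mu * (1 - sigma) * t)
print(round(sigma, 3))  # prints 0.203
```

The iteration is a contraction here because the map has derivative 2σ < 1 at the root; for Poissonian input the root reduces to σ = ρ.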

Journal Article · DOI
TL;DR: The purpose of this paper is to collect a number of useful results about Markov-modulated Poisson processes and queues with Markov-modulated input, and to summarize recent developments.

882 citations
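A Markov-modulated Poisson process is a Poisson process whose rate is λ_i while an underlying continuous-time Markov chain sits in state i. A simulation sketch (the two-state generator and rates are invented for illustration):

```python
import numpy as np

def simulate_mmpp(Q, lam, t_end, rng):
    """Simulate an MMPP on [0, t_end): a CTMC with generator Q
    modulates the arrival rate, which is lam[i] in state i.
    Returns the sorted arrival times."""
    n = len(lam)
    state, t, arrivals = 0, 0.0, []
    while t < t_end:
        hold = rng.exponential(1.0 / -Q[state, state])   # sojourn time
        t_next = min(t + hold, t_end)
        # arrivals on [t, t_next) are Poisson with rate lam[state]
        k = rng.poisson(lam[state] * (t_next - t))
        arrivals.extend(np.sort(t + rng.uniform(0.0, t_next - t, size=k)))
        t = t_next
        if t < t_end:
            probs = np.maximum(Q[state], 0.0) / -Q[state, state]
            state = rng.choice(n, p=probs)               # next phase
    return np.array(arrivals)

rng = np.random.default_rng(2)
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])          # modulating chain
lam = np.array([0.5, 5.0])           # state-dependent arrival rates
arr = simulate_mmpp(Q, lam, 1000.0, rng)
# long-run rate is pi @ lam with pi = (2/3, 1/3), i.e. 2.0
print(len(arr) / 1000.0)
```

Because the counts are conditionally Poisson given the modulating path, the empirical rate converges to the stationary mixture π·λ.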


Network Information
Related Topics (5)
Markov chain
51.9K papers, 1.3M citations
80% related
Estimator
97.3K papers, 2.6M citations
74% related
Stochastic process
31.2K papers, 898.7K citations
72% related
Linear programming
32.1K papers, 920.3K citations
72% related
Nonparametric statistics
19.9K papers, 844.1K citations
70% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  5
2022  20
2021  13
2020  20
2019  11
2018  14