
Showing papers on "Markov chain published in 1982"


Journal ArticleDOI
TL;DR: In this paper, a mathematical text of applicable mathematics is presented, suitable for students of engineering and science at the third-year undergraduate level or beyond, which avoids the approach of listing only the techniques, followed by a few examples, without explaining why the techniques work.
Abstract: This is a mathematical text suitable for students of engineering and science who are at the third year undergraduate level or beyond. It is a book of applicable mathematics. It avoids the approach of listing only the techniques, followed by a few examples, without explaining why the techniques work. Thus, it provides not only the know-how but also the know-why. Equally, the text has not been written as a book of pure mathematics with a list of theorems followed by their proofs. The authors' aim is to help students develop an understanding of mathematics and its applications. They have refrained from using clichés like “it is obvious” and “it can be shown”, which may be true only to a mature mathematician. On the whole, the authors have been generous in writing down all the steps in solving the example problems. The book comprises ten chapters. Each chapter contains several solved problems clarifying the introduced concepts. Some of the examples are taken from the recent literature and serve to illustrate the applications in various fields of engineering and science. At the end of each chapter, there are assignment problems with two levels of difficulty. A list of references is provided at the end of the book. This book is the product of a close collaboration between two mathematicians and an engineer. The engineer has been helpful in pinpointing the problems which engineering students encounter in books written by mathematicians.

2,846 citations


Journal ArticleDOI
TL;DR: In this article, a new Markov chain is introduced which can be used to describe the family relationships among n individuals drawn from a particular generation of a large haploid population, and the properties of this process can be studied, simultaneously for all n, by coupling techniques.
Abstract: A new Markov chain is introduced which can be used to describe the family relationships among n individuals drawn from a particular generation of a large haploid population. The properties of this process can be studied, simultaneously for all n, by coupling techniques. Recent results in neutral mutation theory are seen as consequences of the genealogy described by the chain.

1,495 citations
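
A minimal simulation sketch (mine, not the paper's) of the discrete ancestral process this chain describes: each of n sampled lineages picks a parent uniformly at random in a haploid population of fixed size N, and the number of distinct ancestral lines shrinks until a common ancestor is reached. The function name and parameter values are illustrative assumptions.

    import random

    def ancestral_lineage_counts(n, N, seed=0):
        """Generation-by-generation count of the distinct ancestors of an
        n-sample in a haploid population of constant size N."""
        rng = random.Random(seed)
        counts = [n]
        lineages = n
        while lineages > 1:
            # Each surviving lineage independently picks a parent among N.
            lineages = len({rng.randrange(N) for _ in range(lineages)})
            counts.append(lineages)
        return counts  # ends when the most recent common ancestor is reached

    # Example: 10 individuals sampled from a population of 1000.
    print(ancestral_lineage_counts(10, 1000))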


Journal ArticleDOI
TL;DR: Exponential-type bounds are derived for the first-hitting time and occupation times of a real-valued random sequence which has a uniform negative drift whenever the sequence is above a fixed level, and geometric ergodicity is established for a certain Markov chain arising in the decentralized control of a multi-access, packet-switched broadcast channel.
Abstract: Bounds of exponential type are derived for the first-hitting time and occupation times of a real-valued random sequence which has a uniform negative drift whenever the sequence is above a fixed level. The only other assumption on the random sequence is that the increments satisfy a uniform exponential decay condition. The bounds provide a flexible technique for proving stability of processes frequently encountered in the control of queues. Two applications are given. First, exponential-type bounds are derived for a GI/G/1 queue when the service distribution is exponential type. Secondly, geometric ergodicity is established for a certain Markov chain which arises in the decentralized control of a multi-access, packet-switched broadcast channel.

455 citations


Journal ArticleDOI
L. Liporace1
TL;DR: Parameter estimation for multivariate functions of Markov chains, a class of versatile statistical models for vector random processes, is discussed, and a powerful representation theorem by Fan is employed to generalize the analysis of Baum et al. to a larger class of distributions.
Abstract: Parameter estimation for multivariate functions of Markov chains, a class of versatile statistical models for vector random processes, is discussed. The model regards an ordered sequence of vectors as noisy multivariate observations of a Markov chain. Mixture distributions are a special case. The foundations of the theory presented here were established by Baum, Petrie, Soules, and Weiss. A powerful representation theorem by Fan is employed to generalize the analysis of Baum et al. to a larger class of distributions.

414 citations
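
A compact numerical sketch of the Baum-Welch-style re-estimation that this analysis generalizes, written for a K-state chain with univariate Gaussian emissions; the code, the toy data, and all names are my assumptions, not the paper's.

    import numpy as np

    def baum_welch_gaussian(x, K=2, iters=50, seed=0):
        """EM re-estimation for a K-state Markov chain observed through
        Gaussian noise (a probabilistic function of a Markov chain)."""
        x = np.asarray(x, dtype=float)
        rng = np.random.default_rng(seed)
        T = len(x)
        pi = np.full(K, 1.0 / K)              # initial state distribution
        A = np.full((K, K), 1.0 / K)          # transition matrix
        mu = rng.choice(x, K)                 # emission means
        var = np.full(K, x.var())             # emission variances
        for _ in range(iters):
            # Emission likelihoods b[t, k] = N(x_t; mu_k, var_k).
            b = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
            # Scaled forward pass (Rabiner-style scaling avoids underflow).
            alpha = np.zeros((T, K)); c = np.zeros(T)
            alpha[0] = pi * b[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
            for t in range(1, T):
                alpha[t] = (alpha[t - 1] @ A) * b[t]
                c[t] = alpha[t].sum(); alpha[t] /= c[t]
            # Backward pass with the same scaling constants.
            beta = np.zeros((T, K)); beta[-1] = 1.0
            for t in range(T - 2, -1, -1):
                beta[t] = (A @ (b[t + 1] * beta[t + 1])) / c[t + 1]
            gamma = alpha * beta               # P(state_t = k | all data)
            xi = alpha[:-1, :, None] * A * (b[1:] * beta[1:])[:, None, :] / c[1:, None, None]
            # M-step re-estimates.
            pi = gamma[0] / gamma[0].sum()
            A = xi.sum(0) / gamma[:-1].sum(0)[:, None]
            mu = (gamma * x[:, None]).sum(0) / gamma.sum(0)
            var = (gamma * (x[:, None] - mu) ** 2).sum(0) / gamma.sum(0)
        return pi, A, mu, var

    # Toy data: a sticky two-state chain observed in Gaussian noise.
    rng = np.random.default_rng(1)
    s, xs = 0, []
    for _ in range(500):
        s = s if rng.random() < 0.95 else 1 - s
        xs.append(rng.normal((0.0, 3.0)[s], 1.0))
    print(baum_welch_gaussian(xs)[2])          # estimated means, near 0 and 3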


Journal ArticleDOI
TL;DR: In this paper, the economic theory of decision-making under uncertainty is used to produce three econometric models of dynamic discrete choice: (1) for a single spell of unemployment; (2) for an equilibrium two-state model of employment and non-employment; (3) for general three-state models with a non-market sector.

409 citations


Journal ArticleDOI
TL;DR: In this article, the Langevin equation was used to derive the Markov equation for the vertical velocity of a fluid particle moving in turbulent flow, and it was shown that if the Eulerian velocity variance σ_wE² is not constant with height, there is an associated vertical pressure gradient which appears as a force-like term.
Abstract: The Langevin equation is used to derive the Markov equation for the vertical velocity of a fluid particle moving in turbulent flow. It is shown that if the Eulerian velocity variance σ_wE² is not constant with height, there is an associated vertical pressure gradient which appears as a force-like term in the Markov equation. The correct form of the Markov equation is: w(t + δt) = a w(t) + b σ_wE ζ + (1 − a) T_L ∂(σ_wE²)/∂z, where w(t) is the vertical velocity at time t, ζ a random number from a Gaussian distribution with zero mean and unit variance, T_L the Lagrangian integral time scale for vertical velocity, a = exp(−δt/T_L), and b = (1 − a²)^{1/2}. This equation can be used for inhomogeneous turbulence in which the mean wind speed, σ_wE, and T_L vary with height. A two-dimensional numerical simulation shows that when this equation is used, an initially uniform distribution of tracer remains uniform.

307 citations
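
The quoted update rule translates directly into a random-flight simulation. The sketch below is mine, not the paper's: the profile σ_wE(z), the constants, and the boundary handling are all illustrative assumptions; the final histogram checks the paper's claim that an initially uniform tracer distribution stays approximately uniform.

    import numpy as np

    rng = np.random.default_rng(1)
    T_L, dt = 10.0, 0.5                        # Lagrangian time scale, step (s)
    a = np.exp(-dt / T_L)
    b = np.sqrt(1.0 - a * a)

    def sigma(z):                              # assumed sigma_wE(z) profile
        return 0.5 + 0.3 * np.exp(-((z - 50.0) / 30.0) ** 2)

    def dsigma2_dz(z, h=1e-3):                 # derivative of the variance profile
        return (sigma(z + h) ** 2 - sigma(z - h) ** 2) / (2.0 * h)

    n, steps = 10000, 2000
    z = rng.uniform(0.0, 100.0, n)             # initially uniform tracer
    w = sigma(z) * rng.standard_normal(n)      # locally equilibrated velocities
    for _ in range(steps):
        zeta = rng.standard_normal(n)
        # w(t+dt) = a w(t) + b sigma_wE zeta + (1 - a) T_L d(sigma_wE^2)/dz
        w = a * w + b * sigma(z) * zeta + (1.0 - a) * T_L * dsigma2_dz(z)
        z = z + w * dt
        below = z < 0.0                        # crude reflecting boundaries
        z[below] *= -1.0; w[below] *= -1.0
        above = z > 100.0
        z[above] = 200.0 - z[above]; w[above] *= -1.0

    hist, _ = np.histogram(z, bins=10, range=(0.0, 100.0))
    print(hist / n)                            # should stay close to 0.1 per bin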


Journal ArticleDOI
TL;DR: The problem of state estimation and system structure detection for discrete stochastic dynamical systems with parameters which may switch among a finite set of values is considered, and a unified treatment of the existing suboptimal algorithms is provided.

284 citations


Book
01 Nov 1982
TL;DR: Markov random field theory provides a convenient and consistent way of modeling context-dependent entities such as image pixels and correlated features, and its use in image fusion is discussed.
Abstract: Let 𝒜1, ℬ, 𝒜2 be σ-algebras of events having the following relationship: if the outcomes of all events in ℬ are known, events A2 ∈ 𝒜2 are independent of events A1 ∈ 𝒜1. More precisely, the σ-algebras 𝒜1 and 𝒜2 are conditionally independent with respect to ℬ; this gives the equation for conditional probabilities: $$ P(A_1 \cdot A_2 \mid \mathcal{B}) = P(A_1 \mid \mathcal{B}) \cdot P(A_2 \mid \mathcal{B}) $$ (1.1) for any A1 ∈ 𝒜1, A2 ∈ 𝒜2. We say that the σ-algebra ℬ splits 𝒜1 and 𝒜2 (or is splitting) if (1.1) holds for 𝒜1, ℬ, 𝒜2.

264 citations
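
A toy numerical check of the splitting equation (1.1), not taken from the book: for a Markov chain, the σ-algebra generated by the present splits past and future, so conditioning on {X1 = j} factorizes joint probabilities of {X0 = i} and {X2 = k}. The chain below is an arbitrary assumption.

    import numpy as np

    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.4, 0.4, 0.2]])            # transition matrix
    p0 = np.array([0.2, 0.5, 0.3])             # law of X0

    # Joint law of (X0, X1, X2): p0[i] * P[i, j] * P[j, k].
    joint = p0[:, None, None] * P[:, :, None] * P[None, :, :]

    for j in range(3):                         # B = {X1 = j}: the present
        pB = joint[:, j, :].sum()
        for i in range(3):                     # A1 = {X0 = i}: the past
            for k in range(3):                 # A2 = {X2 = k}: the future
                lhs = joint[i, j, k] / pB
                rhs = (joint[i, j, :].sum() / pB) * (joint[:, j, k].sum() / pB)
                assert np.isclose(lhs, rhs)    # equation (1.1) holds
    print("the present splits past and future")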


Journal ArticleDOI
TL;DR: In this article, a regression-type approach is used to fit and test alternative models for rainfall occurrence in Jordan, Niger, Botswana, and Sri Lanka, where the transition probabilities vary with time of year.
Abstract: A range of Markov chain models have been used in the past to describe rainfall occurrence. Gamma distributions are commonly used for modeling rainfall amounts. These are all examples of generalized linear models. This unified view allows a regression-type approach to be used to fit and test alternative models. The approach is illustrated by fitting first- and second-order Markov chains in which the transition probabilities vary with time of year to data from sites in Jordan, Niger, Botswana and Sri Lanka. Gamma distributions with parameters varying with time are also fitted.

206 citations
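
A sketch of the regression-type approach described here, fitting a first-order Markov chain for rainfall occurrence as a logistic GLM whose transition probabilities vary seasonally. The synthetic data, the single harmonic, and the use of statsmodels are my assumptions, not details from the paper.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    days = np.arange(3650)
    t = 2.0 * np.pi * (days % 365) / 365.0

    # Simulate a seasonally varying first-order chain of wet (1) / dry (0) days.
    wet = np.zeros(len(days))
    for d in range(1, len(days)):
        logit = -1.0 + 1.5 * wet[d - 1] + 0.8 * np.sin(t[d])
        wet[d] = float(rng.random() < 1.0 / (1.0 + np.exp(-logit)))

    # Logistic GLM: today's occurrence regressed on yesterday's occurrence and
    # seasonal harmonics (interactions let the transitions vary with season).
    y = wet[1:]
    X = np.column_stack([
        np.ones(len(y)),
        wet[:-1],
        np.sin(t[1:]), np.cos(t[1:]),
        wet[:-1] * np.sin(t[1:]), wet[:-1] * np.cos(t[1:]),
    ])
    fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
    print(fit.params)    # recovers roughly (-1.0, 1.5, 0.8, 0, 0, 0)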


Journal ArticleDOI
TL;DR: In this article, a first-order Markov chain and an alternating renewal process were compared as models describing the occurrence of sequences of wet and dry days, and the Markov chain model was superior to the ARP under the minimum Akaike information criterion.
Abstract: A first-order Markov chain and an alternating renewal process (ARP) with a truncated geometric distribution of wet day intervals and a truncated negative binomial distribution of dry day intervals are compared as models describing the occurrence of sequences of wet and dry days. Numerical optimization techniques are used to obtain approximate maximum likelihood estimates of the Fourier coefficients which describe the seasonal variation of the two Markov chain parameters and the three parameters in the alternating renewal process. For the four U.S. stations studied, the Markov chain model was superior to the ARP using the minimum Akaike information criterion.
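
A simplified sketch of the comparison (mine; it drops the paper's seasonal Fourier terms and truncation details): a first-order Markov chain implies geometric wet and dry spell lengths (2 parameters), while the ARP keeps geometric wet spells and allows shifted negative binomial dry spells (3 parameters); the models are then ranked by AIC. The spell data are synthetic assumptions.

    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(0)
    wet_spells = rng.geometric(0.4, 500)                   # toy wet spell lengths
    dry_spells = 1 + rng.negative_binomial(2.0, 0.5, 500)  # overdispersed dry spells

    def geom_loglik(spells):
        p = 1.0 / spells.mean()                            # geometric MLE
        return stats.geom.logpmf(spells, p).sum()

    def nbinom_loglik(spells):
        # Shifted negative binomial on spells - 1 (spell lengths are >= 1).
        def nll(theta):
            n, p = np.exp(theta[0]), 1.0 / (1.0 + np.exp(-theta[1]))
            return -stats.nbinom.logpmf(spells - 1, n, p).sum()
        res = optimize.minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
        return -res.fun

    # AIC = 2k - 2 log L, with k = 2 (Markov chain) and k = 3 (ARP).
    aic_mc = 2 * 2 - 2 * (geom_loglik(wet_spells) + geom_loglik(dry_spells))
    aic_arp = 2 * 3 - 2 * (geom_loglik(wet_spells) + nbinom_loglik(dry_spells))
    print(f"AIC Markov chain: {aic_mc:.1f}   AIC ARP: {aic_arp:.1f}")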

Journal ArticleDOI
TL;DR: In this paper, a revealing alternate proof is provided for the Iglehart–Borovkov heavy-traffic limit theorem for GI/G/s queues.
Abstract: A revealing alternate proof is provided for the Iglehart (1965), (1973)–Borovkov (1967) heavy-traffic limit theorem for GI/G/s queues. This kind of heavy traffic is obtained by considering a sequence of GI/G/s systems with the numbers of servers and the arrival rates going to ∞ while the service-time distributions are held fixed. The theorem establishes convergence to a Gaussian process, which in general is not Markov, for an appropriate normalization of the sequence of stochastic processes representing the number of customers in service at arbitrary times. The key idea in the new proof is to consider service-time distributions that are randomly stopped sums of exponential phases, and then work with the discrete-time vector-valued Markov chain representing the number of customers in each phase of service at arrival epochs. It is then easy to show that this sequence of Markov chains converges to a multivariate O–U (Ornstein–Uhlenbeck) diffusion process by applying simple criteria in Stroock and Varadhan (1979). The Iglehart–Borovkov limit for these special service-time distributions is the sum of the components of this multivariate O–U process. Heavy-traffic convergence is also established for the steady-state distributions of GI/M/s queues under the same conditions by exploiting stochastic-order properties.

Book
31 Aug 1982
TL;DR: In this book, the information cocycle is used to study finitary isomorphisms and block-codes between topological Markov chains, and classifications of topological Markov chains are presented.
Abstract: 1. Introduction 2. The Information Cocycle 3. Finitary Isomorphisms 4. Block-codes 5. Classifications of Topological Markov Chains.

Journal ArticleDOI
TL;DR: In this paper, the authors show that an expansive homeomorphism of the Cantor discontinuum whose topological entropy and periodic-point counts are dominated by those of an irreducible and aperiodic topological Markov chain S_A is topologically conjugate to a subsystem of S_A.
Abstract: Let S_A be an irreducible and aperiodic topological Markov chain. If S_Ā is an irreducible and aperiodic topological Markov chain whose topological entropy is less than that of S_A, then there exists an irreducible and aperiodic topological Markov chain whose topological entropy equals that of S_Ā and which is a subsystem of S_A. If S is an expansive homeomorphism of the Cantor discontinuum whose topological entropy is less than that of S_A, and such that for every j ∈ ℕ the number of periodic points of least period j of S is less than or equal to the number of periodic points of least period j of S_A, then S is topologically conjugate to a subsystem of S_A.
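
The topological entropy of a topological Markov chain S_A is the log of the spectral radius of its 0-1 transition matrix A, so the theorem's entropy hypothesis is easy to check numerically; a small sketch (my example matrices, not the paper's):

    import numpy as np

    def topological_entropy(A):
        # h(S_A) = log(spectral radius of A) for a topological Markov chain.
        return np.log(max(abs(np.linalg.eigvals(A))))

    A = np.array([[1, 1], [1, 1]])       # full 2-shift: entropy log 2
    Abar = np.array([[1, 1], [1, 0]])    # golden-mean shift: log((1 + sqrt 5)/2)
    # Both matrices are irreducible and aperiodic, and h(S_Abar) < h(S_A), so
    # the theorem yields a subsystem of S_A with entropy exactly h(S_Abar).
    print(topological_entropy(Abar), "<", topological_entropy(A))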

Journal ArticleDOI
TL;DR: In this article, the authors studied the weak ergodicity of non-homogeneous Markov systems and found that the limiting structure and the relative limiting structure exist under certain conditions.
Abstract: In this paper we study the asymptotic behavior of Markov systems and especially non-homogeneous Markov systems. It is found that the limiting structure and the relative limiting structure exist under certain conditions. The problem of weak ergodicity in the above non-homogeneous systems is studied. Necessary and sufficient conditions are provided for weak ergodicity. Finally, we discuss the application of the present results in manpower systems.

Journal ArticleDOI
TL;DR: In this article, the authors consider a class of Markov chains on a bivariate state space whose transition probabilities have a particular 'block-partitioned' structure, and show that the stationary distribution Π for these chains has an operator-geometric nature, in which the operator S is the minimal solution of a non-linear operator equation.
Abstract: This paper considers a class of Markov chains on a bivariate state space, whose transition probabilities have a particular ‘block-partitioned' structure. Examples of such chains include those studied by Neuts [8], who took E to be finite; they also include chains studied in queueing theory, such as (N_n, S_n), where N_n is the number of customers in a GI/G/1 queue immediately before, and S_n the remaining service time immediately after, the nth arrival. We show that the stationary distribution Π for these chains has an ‘operator-geometric' nature, where the operator S is the minimal solution of a non-linear operator equation. Necessary and sufficient conditions for Π to exist are also found. In the case of the GI/G/1 queueing chain above, these are exactly the usual stability conditions. GI/G/1 QUEUE; PHASE-TYPE; INVARIANT MEASURE; FOSTER'S CONDITIONS
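
When E is finite this reduces to Neuts' matrix-geometric case, which gives a concrete way to see the result. A finite sketch (the blocks are my assumptions): for a block-tridiagonal chain with up/same/down blocks A0, A1, A2, the minimal nonnegative solution R of R = A0 + R A1 + R² A2 plays the role of the operator S, and π_{n+1} = π_n R.

    import numpy as np

    A0 = np.array([[0.2, 0.1], [0.0, 0.2]])   # one level up
    A1 = np.array([[0.1, 0.2], [0.2, 0.1]])   # same level
    A2 = np.array([[0.3, 0.1], [0.2, 0.3]])   # one level down
    assert np.allclose((A0 + A1 + A2).sum(axis=1), 1.0)   # stochastic rows

    # Successive substitution from R = 0 converges to the minimal solution.
    R = np.zeros_like(A0)
    for _ in range(1000):
        R_new = A0 + R @ A1 + R @ R @ A2
        if np.allclose(R_new, R, atol=1e-13):
            break
        R = R_new
    print(R)
    # Spectral radius < 1 corresponds to the stability condition for pi to exist.
    print("spectral radius:", max(abs(np.linalg.eigvals(R))))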

Journal ArticleDOI
TL;DR: In this article, the authors apply Goodman's (1968) model of quasi-independence and compare it to previously used methods (which they now believe are invalid) in the geological literature, and present tests for homogeneity, a spatial analogue to stationarity, of multiple embedded chains and for symmetry and Markov chain order.
Abstract: Embedded Markov chains may be used to describe rock sequences in which a lithology is not or cannot be observed following itself. Such chains lead to a transition matrix with zeros on the main diagonal. To test the hypothesis of randomness in an embedded Markov chain, we apply Goodman's (1968) model of quasi-independence and compare it to previously used methods (which we now believe are invalid) in the geological literature. Data in the literature show quite different results (depending on the original method) when reanalyzed in this way. We present tests for homogeneity, a spatial analogue to stationarity, of multiple embedded chains and for symmetry and Markov chain order. Matrices from vertical or laterally spaced sequences can be tested for homogeneity and conclusions drawn regarding variations of processes over space. A normalized difference is proposed as an aid in interpreting the difference between observed transition frequencies and transition probabilities estimated with a model of independence or quasi-independence.
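
A sketch of the quasi-independence fit for an embedded chain (the count matrix is an invented example): expected counts E[i, j] = a[i] b[j] off the diagonal, with structural zeros on the diagonal, fitted by iterative proportional fitting and tested with a G² statistic.

    import numpy as np

    F = np.array([[ 0, 25, 10],
                  [20,  0, 15],
                  [12, 18,  0]], dtype=float)   # embedded transition counts
    r = F.shape[0]
    mask = ~np.eye(r, dtype=bool)               # off-diagonal cells only

    # Iterative proportional fitting of E[i, j] = a[i] * b[j] on i != j.
    a, b = np.ones(r), np.ones(r)
    for _ in range(200):
        a = F.sum(axis=1) / (mask * b).sum(axis=1)
        b = F.sum(axis=0) / (mask.T * a).sum(axis=1)
    E = np.outer(a, b) * mask

    # G^2 against quasi-independence; df = (r - 1)^2 - r with a zero diagonal.
    G2 = 2.0 * np.sum(F[mask] * np.log(F[mask] / E[mask]))
    print(f"G^2 = {G2:.2f} on {(r - 1) ** 2 - r} degrees of freedom")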

Journal ArticleDOI
TL;DR: In this paper, it is shown that the random measure is supported on a bounded generalized Cantor set and that this set performs a "wandering" but coherent motion that, if appropriately rescaled, approaches a Brownian motion.
Abstract: Fleming and Viot have established the existence of a continuous-state-space version of the Ohta-Kimura ladder or stepwise-mutation model of population genetics for describing allelic frequencies within a selectively neutral population undergoing mutation and random genetic drift. Their model is given by a probability-measure-valued Markov diffusion process. In this paper, we investigate the qualitative behavior of such measure-valued processes. It is demonstrated that the random measure is supported on a bounded generalized Cantor set and that this set performs a "wandering" but "coherent" motion that, if appropriately rescaled, approaches a Brownian motion. The method used involves the construction of an interacting infinite particle system determined by the moment measures of the process and an analysis of the function-valued process that is "dual" to the measure-valued process of Fleming and Viot.

Journal ArticleDOI
TL;DR: In this article, the applicability of generalized inverses to a wide variety of problems in applied probability where a Markov chain is present either directly or indirectly through some form of imbedding is examined.

Journal ArticleDOI
TL;DR: In this article, saddle point strategies for zero-sum Markov games with stopping and impulsive strategies are considered and it is shown that the value of the game depends continuously on the initial state.
Abstract: Three kinds of zero-sum Markov games with stopping and impulsive strategies are considered. For these games we find the saddle point strategies and prove that the value of the game depends continuously on the initial state.

Journal ArticleDOI
TL;DR: In this article, it was shown that if a Markov chain converges rapidly to stationarity, then the time until the first hit on a rarely-visited set of states is approximately exponentially distributed; moreover an explicit bound for the error in this approximation can be given.
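
A quick simulation sketch of the phenomenon (the chain is an invented example): state 2 is visited rarely, the two frequent states mix within a step or two, and the first hitting time of state 2 is nearly exponential/geometric, so its standard deviation is close to its mean.

    import numpy as np

    rng = np.random.default_rng(0)
    P = np.array([[0.50, 0.49, 0.01],
                  [0.49, 0.50, 0.01],
                  [0.50, 0.50, 0.00]])     # state 2 is the rarely visited set

    def hit_time(start=0, target=2):
        t, s = 0, start
        while s != target:
            s = rng.choice(3, p=P[s])
            t += 1
        return t

    times = np.array([hit_time() for _ in range(2000)])
    # For an exponential-like law, the std dev is roughly equal to the mean.
    print(f"mean {times.mean():.1f}, std {times.std():.1f}")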

Journal ArticleDOI
TL;DR: A new family of adaptive controllers is exhibited which achieves a performance precisely equalling the optimal performance achievable if the transition probabilities, i.e., the model or dynamics of the system, were known.
Abstract: We consider the problem of adaptively controlling a Markov chain with unknown transition probabilities. A new family of adaptive controllers is exhibited which achieves a performance precisely equalling the optimal performance achievable if the transition probabilities (i.e., the model or dynamics of the system) were known instead. Hence, the adaptive controllers presented here are truly optimal. The performance of the system to be controlled is measured by the average of the costs incurred over an infinite operating time period. These adaptive controllers can, potentially, be implemented on digital computers and used in the on-line control of unknown systems.

Journal ArticleDOI
TL;DR: Using the martingale formulation for Markov processes introduced by Stroock and Varadhan, a criterion is developed for checking whether a given measure is invariant for a Markov process.
Abstract: Using the martingale formulation for Markov processes introduced by Stroock and Varadhan, we develop a criterion for checking if a measure happens to be invariant.
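
A finite-state analogue of the criterion (my sketch; the paper works in the general martingale-problem setting): μ is invariant iff Σ_x μ(x)(Qf)(x) = 0 for every test function f, where Q is the generator, which for a finite chain is just μQ = 0.

    import numpy as np

    Q = np.array([[-1.0,  1.0,  0.0],
                  [ 0.5, -1.5,  1.0],
                  [ 1.0,  0.0, -1.0]])     # generator: rows sum to zero

    # Candidate invariant measure: normalized left null vector of Q.
    vals, vecs = np.linalg.eig(Q.T)
    mu = np.real(vecs[:, np.argmin(abs(vals))])
    mu = mu / mu.sum()

    # The martingale-style criterion: the integral of Qf against mu vanishes
    # for arbitrary test functions f.
    rng = np.random.default_rng(0)
    for _ in range(5):
        f = rng.standard_normal(3)
        assert abs(mu @ (Q @ f)) < 1e-10
    print("mu =", mu, "is invariant:", np.allclose(mu @ Q, 0.0))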


Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of adaptively controlling an unknown Markov chain, without any prior information regarding the values of the transition probabilities, except for a list of forbidden, zero-probability transitions, which is usually obtained as a byproduct of the modeling process itself.
Abstract: We consider the problem of adaptively controlling an unknown Markov chain. No prior information regarding the values of the transition probabilities is provided to us (except for a list of forbidden, zero-probability transitions, which is usually obtained as a byproduct of the modeling process itself). The goal is to design an adaptive controller to adequately control the unknown system when its performance is measured by the average cost incurred over a long operating time period. Our main result is the exhibition of a family of adaptive controllers which, when applied to the unknown system, will result in a performance precisely equal to the optimal performance attainable if the system, i.e., the transition probabilities, were known. Hence, the adaptive controllers proposed here are truly optimal, even when operating on an unknown system. The results presented here extend similar results in [1], where we assumed we were initially provided with a finite set of possible models, one of which is guaranteed to be the true one. This paper directly addresses those practical situations where a finite set of possible models with such a guarantee is hard to come by.
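
For contrast, here is the naive certainty-equivalence scheme that this line of work improves upon (this sketch is NOT the authors' controller, which modifies its estimates to guarantee optimality; plain certainty equivalence can lock onto a suboptimal policy): estimate transition probabilities from counts, re-solve the estimated average-cost problem, and act greedily. The 2-state, 2-action system and its costs are invented.

    import numpy as np

    rng = np.random.default_rng(0)
    S, A = 2, 2
    P_true = np.array([[[0.9, 0.1], [0.2, 0.8]],
                       [[0.7, 0.3], [0.05, 0.95]]])   # P_true[s, a, s']
    cost = np.array([[1.0, 2.0], [0.0, 0.5]])          # cost[s, a]

    def greedy_policy(P, cost, iters=200):
        """Relative value iteration for the average-cost criterion."""
        h = np.zeros(S)
        for _ in range(iters):
            q = cost + P @ h        # q[s, a] = c(s, a) + sum_s' P[s,a,s'] h(s')
            h = q.min(axis=1)
            h = h - h[0]            # relative values stay bounded
        return (cost + P @ h).argmin(axis=1)

    counts = np.ones((S, A, S))      # smoothed transition counts
    s, total = 0, 0.0
    for t in range(1, 5001):
        P_hat = counts / counts.sum(axis=2, keepdims=True)
        a = greedy_policy(P_hat, cost)[s]
        s_next = rng.choice(S, p=P_true[s, a])
        total += cost[s, a]
        counts[s, a, s_next] += 1
        s = s_next
    print("long-run average cost:", total / 5000)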

Journal ArticleDOI
TL;DR: Global convergence of the algorithm is proven under very weak assumptions and the proof relates this technique to other iterative methods that have been suggested for general linear programs.
Abstract: An iterative aggregation procedure is described for solving large scale, finite state, finite action Markov decision processes (MDPs). At each iteration, an aggregate master problem and a sequence of smaller subproblems are solved. The weights used to form the aggregate master problem are based on the estimates from the previous iteration. Each subproblem is a finite state, finite action MDP with a reduced state space and unequal row sums. Global convergence of the algorithm is proven under very weak assumptions. The proof relates this technique to other iterative methods that have been suggested for general linear programs.

Journal ArticleDOI
TL;DR: In this article, it is shown that under appropriate regularity conditions the associated stochastic process describing the state at time t, t ≥ 0, and the stationary distribution are continuous functions of the life-times of the active components.

Journal ArticleDOI
TL;DR: In this article, a class of two-dimensional birth-and-death processes with applications in many modelling problems, whose instantaneous transition rates are state-dependent in a restricted way, is defined and analysed in the steady state.
Abstract: A class of two-dimensional Birth-and-Death processes, with applications in many modelling problems, is defined and analysed in the steady-state. These are processes whose instantaneous transition rates are state-dependent in a restricted way. Generating functions for the steady-state distribution are obtained by solving a functional equation in two variables. That solution method lends itself readily to numerical implementation. Some aspects of the numerical solution are discussed, using a particular model as an example.
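
A brute-force numerical cross-check of the kind of steady state the paper computes analytically (this is not the generating-function method): truncate the two-dimensional lattice, assemble the generator from state-dependent rates, and solve πQ = 0. All rates are illustrative assumptions.

    import numpy as np

    N = 30                                       # truncation level per axis
    lam1, lam2, mu1, mu2 = 1.0, 0.8, 1.5, 1.2
    idx = lambda i, j: i * (N + 1) + j
    n = (N + 1) ** 2

    Q = np.zeros((n, n))
    for i in range(N + 1):
        for j in range(N + 1):
            k = idx(i, j)
            if i < N: Q[k, idx(i + 1, j)] = lam1               # birth, dim 1
            if j < N: Q[k, idx(i, j + 1)] = lam2               # birth, dim 2
            if i > 0: Q[k, idx(i - 1, j)] = mu1 * i / (1 + j)  # state-dependent death
            if j > 0: Q[k, idx(i, j - 1)] = mu2 * j
            Q[k, k] = -Q[k].sum()

    # Solve pi Q = 0 together with the normalization sum(pi) = 1.
    Aeq = np.vstack([Q.T, np.ones(n)])
    beq = np.zeros(n + 1); beq[-1] = 1.0
    pi, *_ = np.linalg.lstsq(Aeq, beq, rcond=None)
    print("P(empty system):", pi[idx(0, 0)])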

Journal ArticleDOI
TL;DR: The population entropy introduced by Demetrius is shown to have a precise dynamical meaning as a measure of the rate of convergence to the stable age distribution, providing the first clear biological reason why entropy is a broadly useful population statistic.
Abstract: The population entropy introduced by Demetrius is shown to have a precise dynamical meaning as a measure of convergence rate to the stable age distribution. First the Leslie population model is transformed exactly into a Markov chain on a state space of age-classes. Next the dynamics of convergence from a nonequilibrium state to the stable state are analyzed. The results provide the first clear biological reason why entropy is a broadly useful population statistic.
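
A sketch of the transformation and the statistic (my construction of the standard Doob-type rescaling, with an invented 3-age-class Leslie matrix): a Leslie matrix L with dominant eigenvalue λ and positive right eigenvector u defines the stochastic matrix P[i, j] = L[i, j] u[j] / (λ u[i]); the entropy rate of this chain is the population entropy, and the spectral gap of P controls the rate of convergence to the stable age distribution.

    import numpy as np

    L = np.array([[0.0, 1.5, 1.0],    # age-specific fertilities
                  [0.8, 0.0, 0.0],    # survival from class 1 to 2
                  [0.0, 0.6, 0.0]])   # survival from class 2 to 3

    vals, vecs = np.linalg.eig(L)
    k = np.argmax(vals.real)
    lam = vals[k].real                # dominant (growth-rate) eigenvalue
    u = np.abs(vecs[:, k].real)       # stable age structure (up to scale)

    P = L * u[None, :] / (lam * u[:, None])
    assert np.allclose(P.sum(axis=1), 1.0)          # rows are now probabilities

    # Stationary distribution of P, entropy rate, and spectral gap.
    w_vals, w_vecs = np.linalg.eig(P.T)
    pi = np.abs(w_vecs[:, np.argmin(abs(w_vals - 1.0))].real)
    pi = pi / pi.sum()
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    entropy = -np.sum(pi[:, None] * P * logs)
    gap = 1.0 - sorted(abs(w_vals))[-2]             # governs convergence speed
    print(f"population entropy {entropy:.3f}, spectral gap {gap:.3f}")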