
Markov model

About: Markov model is a research topic. Over its lifetime, 19,227 publications have been published within this topic, receiving 618,193 citations.


Journal ArticleDOI: 10.1109/5.18626
Lawrence R. Rabiner
01 Feb 1989
Abstract: This tutorial provides an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and gives practical details on methods of implementation of the theory, along with a description of selected applications of the theory to distinct problems in speech recognition. Results from a number of original sources are combined to provide a single source for acquiring the background required to pursue this area of research further. The author first reviews the theory of discrete Markov chains and shows how the concept of hidden states, where the observation is a probabilistic function of the state, can be used effectively. The theory is illustrated with two simple examples, namely coin tossing and the classic balls-in-urns system. Three fundamental problems of HMMs are noted and several practical techniques for solving these problems are given. The various types of HMMs that have been studied, including ergodic as well as left-right models, are described.

20,894 Citations
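The core idea the abstract describes, an observation that is a probabilistic function of a hidden state, can be sketched with the forward algorithm on a toy two-coin model in the spirit of the tutorial's coin-tossing example. The transition, emission, and initial probabilities below are illustrative assumptions, not numbers from the paper.

```python
import numpy as np

# Hypothetical two-coin HMM (illustrative numbers, not from the paper):
# hidden states = which biased coin is being tossed; observations = heads/tails.
A = np.array([[0.9, 0.1],    # state-transition probabilities A[i, j] = P(j | i)
              [0.2, 0.8]])
B = np.array([[0.5, 0.5],    # emission probabilities B[s, o] = P(obs o | state s)
              [0.8, 0.2]])   # coin 2 is biased toward heads (obs index 0)
pi = np.array([0.5, 0.5])    # initial state distribution

def forward(obs):
    """Forward algorithm: total probability of the observation sequence."""
    alpha = pi * B[:, obs[0]]          # alpha[s] = P(obs so far, state = s)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate, then weight by emission
    return alpha.sum()

# probability of observing heads, heads, tails (0 = heads, 1 = tails)
p = forward([0, 0, 1])
```

Summing `forward` over all possible observation sequences of a fixed length yields 1, which is a quick sanity check that the recursion is a valid probability computation.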

Open access · Proceedings Article
28 Jun 2001
Abstract: We present conditional random fields, a framework for building probabilistic models to segment and label sequence data. Conditional random fields offer several advantages over hidden Markov models and stochastic grammars for such tasks, including the ability to relax strong independence assumptions made in those models. Conditional random fields also avoid a fundamental limitation of maximum entropy Markov models (MEMMs) and other discriminative Markov models based on directed graphical models, which can be biased towards states with few successor states. We present iterative parameter estimation algorithms for conditional random fields and compare the performance of the resulting models to HMMs and MEMMs on synthetic and natural-language data.

Topics: Variable-order Markov model (72%), Maximum-entropy Markov model (67%), Graphical model (65%)

12,343 Citations
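The defining property of a linear-chain conditional random field is that label sequences are normalized globally over the whole sequence (one partition function Z(x)), rather than per state as in MEMMs — this is what avoids the label bias toward states with few successors. A minimal sketch of that global normalization, with toy transition and emission weights of my own choosing (not the paper's estimation algorithms):

```python
import numpy as np

# Minimal linear-chain CRF scorer (illustrative weights, assumed for this sketch).
# Labels are 0..K-1; scores are unnormalized log-potentials.
K = 2
trans = np.array([[1.0, -0.5],   # transition weight between adjacent labels
                  [-0.5, 1.0]])

def emit(x_t):
    """Toy emission weights for a scalar observation x_t (an assumption)."""
    return np.array([x_t, -x_t])

def log_partition(xs):
    """log Z(x): log-sum over ALL label sequences, via a forward recursion."""
    a = emit(xs[0])
    for x in xs[1:]:
        a = np.logaddexp.reduce(a[:, None] + trans, axis=0) + emit(x)
    return np.logaddexp.reduce(a)

def seq_logprob(xs, ys):
    """Conditional log-probability log p(ys | xs), normalized globally."""
    s = emit(xs[0])[ys[0]]
    for t in range(1, len(xs)):
        s += trans[ys[t - 1], ys[t]] + emit(xs[t])[ys[t]]
    return s - log_partition(xs)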

Open access · Book
15 Apr 1994
Abstract: From the Publisher: The past decade has seen considerable theoretical and applied research on Markov decision processes, as well as the growing use of these models in ecology, economics, communications engineering, and other fields where outcomes are uncertain and sequential decision-making processes are needed. A timely response to this increased activity, Martin L. Puterman's new work provides a uniquely up-to-date, unified, and rigorous treatment of the theoretical, computational, and applied research on Markov decision process models. It discusses all major research directions in the field, highlights many significant applications of Markov decision process models, and explores numerous important topics that have previously been neglected or given cursory coverage in the literature. Markov Decision Processes focuses primarily on infinite-horizon discrete-time models and models with discrete state spaces, while also examining models with arbitrary state spaces, finite-horizon models, and continuous-time discrete-state models. The book is organized around optimality criteria, using a common framework centered on the optimality (Bellman) equation for presenting results. The results are presented in a "theorem-proof" format and elaborated on through both discussion and examples, including results that are not available in any other book. A two-state Markov decision process model, presented in Chapter 3, is analyzed repeatedly throughout the book and demonstrates many results and algorithms. Markov Decision Processes covers recent research advances in such areas as countable state space models with average reward criterion, constrained models, and models with risk-sensitive optimality criteria. It also explores several topics that have received little or no attention in other books, including modified policy iteration, multichain models with average reward criterion, and sensitive optimality. In addition, a Bibliographic Remarks section in each chapter comments on relevant historic …

11,593 Citations
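The Bellman optimality equation the book is organized around, V(s) = max_a [r(s, a) + γ Σ_{s'} P(s' | s, a) V(s')], can be solved by value iteration. A sketch on a toy two-state, two-action MDP — the transition and reward numbers here are assumptions for illustration, not the specific Chapter 3 example from the book:

```python
import numpy as np

# Toy two-state, two-action MDP (assumed values, for illustration only).
# P[a, s, s'] = P(s' | s, a); R[s, a] = immediate reward.
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.1, 0.9]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.9  # discount factor

# Value iteration: repeatedly apply the Bellman optimality operator.
V = np.zeros(2)
for _ in range(1000):
    Q = R + gamma * np.einsum('ast,t->sa', P, V)  # Q[s, a] action values
    V_new = Q.max(axis=1)                         # greedy backup
    if np.max(np.abs(V_new - V)) < 1e-10:         # stop at the fixed point
        break
    V = V_new
policy = Q.argmax(axis=1)  # greedy policy at the (approximate) fixed point
```

At convergence, V satisfies the Bellman equation to numerical tolerance, and the greedy policy read off from Q is optimal for the discounted infinite-horizon criterion.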

Journal ArticleDOI: 10.1093/BIOMET/82.4.711
01 Dec 1995 · Biometrika
Abstract: Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model determination, where the dimensionality of the parameter vector is typically not fixed. This paper proposes a new framework for the construction of reversible Markov chain samplers that jump between parameter subspaces of differing dimensionality, which is flexible and entirely constructive. It should therefore have wide applicability in model determination problems. The methodology is illustrated with applications to multiple change-point analysis in one and two dimensions, and to a Bayesian comparison of binomial experiments.

5,817 Citations
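For context, the fixed-dimension setting that the paper generalizes beyond is the ordinary Metropolis sampler, where the acceptance ratio compares target densities at two points in the *same* space. A minimal random-walk Metropolis sketch targeting a standard normal (the target, step size, and seed are all illustrative assumptions):

```python
import math
import random

# Plain fixed-dimension Metropolis sampler -- the setting the paper extends.
# Targets a standard normal; all tuning choices here are assumptions.
def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x, out = 0.0, []
    log_target = lambda z: -0.5 * z * z  # log density up to a constant
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)  # symmetric random-walk proposal
        # Accept with probability min(1, pi(prop)/pi(x)). This simple density
        # ratio is what breaks down when the proposal lives in a parameter
        # subspace of different dimension -- the problem reversible jump solves.
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        out.append(x)
    return out

samples = metropolis(20000)
```

With enough samples, the empirical mean and variance approach 0 and 1, the moments of the standard normal target.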

Journal ArticleDOI: 10.1109/PROC.1973.9030
G. D. Forney, Jr.
01 Mar 1973
Abstract: The Viterbi algorithm (VA) is a recursive optimal solution to the problem of estimating the state sequence of a discrete-time finite-state Markov process observed in memoryless noise. Many problems in areas such as digital communications can be cast in this form. This paper gives a tutorial exposition of the algorithm and of how it is implemented and analyzed. Applications to date are reviewed. Increasing use of the algorithm in a widening variety of areas is foreseen.

Topics: Soft output Viterbi algorithm (71%), Iterative Viterbi decoding (69%), Viterbi algorithm (69%)

5,685 Citations
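The recursion the abstract describes — tracking, for each state, the best-scoring path ending there, then backtracking from the best final state — can be sketched on a toy two-state model observed in noise. The model parameters below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Viterbi decoding for a toy two-state Markov process in memoryless noise
# (illustrative parameters, assumed for this sketch).
A = np.log([[0.9, 0.1], [0.2, 0.8]])   # log transition probabilities
B = np.log([[0.7, 0.3], [0.4, 0.6]])   # log observation probabilities
pi = np.log([0.5, 0.5])                # log initial distribution

def viterbi(obs):
    """Most likely hidden state sequence for the observation sequence."""
    delta = pi + B[:, obs[0]]           # best log-score of paths ending in each state
    back = []                           # backpointers: best predecessor per state
    for o in obs[1:]:
        scores = delta[:, None] + A     # scores[i, j]: extend best path at i to j
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + B[:, o]
    path = [int(delta.argmax())]        # backtrack from the best final state
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

best = viterbi([0, 0, 1, 1, 1])
```

Because the sequence is short, the result can be checked by brute force: the decoded path attains the maximum joint log-probability over all 2^5 candidate state sequences.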

[Chart: number of papers in this topic in previous years]

Top Attributes

Topic's top 5 most impactful authors

Vikram Krishnamurthy · 65 papers, 1.7K citations
Robert J. Elliott · 49 papers, 2.1K citations
Kishor S. Trivedi · 38 papers, 1.7K citations
Holger Hermanns · 24 papers, 1.6K citations
Wojciech Pieczynski · 24 papers, 878 citations

Network Information
Related Topics (5)
Markov chain · 51.9K papers, 1.3M citations · 93% related
Markov process · 29.7K papers, 738.2K citations · 90% related
Estimation theory · 35.3K papers, 1M citations · 88% related
Stochastic process · 31.2K papers, 898.7K citations · 88% related
Gaussian process · 18.9K papers, 486.6K citations · 88% related