Topic
Variable-order Bayesian network
About: Variable-order Bayesian network is a research topic. Over its lifetime, 5,450 publications have been published within this topic, receiving 265,828 citations.
Papers published on a yearly basis
Papers
01 Dec 2010
TL;DR: This book is a comprehensive treatment of inference for hidden Markov models, including both algorithms and statistical theory, and builds on recent developments to present a self-contained view.
Abstract: This book is a comprehensive treatment of inference for hidden Markov models, including both algorithms and statistical theory. Topics range from filtering and smoothing of the hidden Markov chain to parameter estimation, Bayesian methods and estimation of the number of states. In a unified way the book covers both models with finite state spaces and models with continuous state spaces (also called state-space models) requiring approximate simulation-based algorithms that are also described in detail. Many examples illustrate the algorithms and theory. This book builds on recent developments to present a self-contained view.
1,537 citations
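The filtering problem described in the abstract above has a simple closed form for finite state spaces: the forward algorithm. The sketch below (a minimal illustration, not code from the book; the model, matrices, and observation sequence are made up for the example) computes the filtered distributions P(z_t | x_1:t) for a two-state HMM by alternating a prediction step through the transition matrix and a correction step through the emission probabilities.

```python
import numpy as np

def hmm_filter(A, B, pi, obs):
    """Forward-algorithm filtering for a finite-state HMM.

    A:   (K, K) transition matrix, A[i, j] = P(z_t = j | z_{t-1} = i)
    B:   (K, M) emission matrix,   B[j, y] = P(x_t = y | z_t = j)
    pi:  (K,)   initial state distribution
    obs: observed symbols in {0, ..., M-1}

    Returns the filtered distributions P(z_t | x_{1:t}), one row per t.
    """
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()                # normalize to avoid underflow
    out = [alpha]
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]   # predict through A, correct by emission
        alpha /= alpha.sum()
        out.append(alpha)
    return np.array(out)

# Toy two-state chain: sticky states, near-deterministic emissions.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.95, 0.05],
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])

filt = hmm_filter(A, B, pi, [0, 0, 1, 1, 1])
```

After the run of 1s at the end of the sequence, the filtered distribution concentrates on state 1, as expected for near-deterministic emissions.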
TL;DR: Reviews and illustrates the use of the Gibbs sampler for Bayesian computation in the context of some canonical examples, and comments on the advantages of sample-based approaches for inference summaries.
Abstract: The use of the Gibbs sampler for Bayesian computation is reviewed and illustrated in the context of some canonical examples. Other Markov chain Monte Carlo simulation methods are also briefly described, and comments are made on the advantages of sample-based approaches for Bayesian inference summaries.
1,422 citations
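A canonical example of the kind the abstract above refers to is Gibbs sampling from a bivariate normal: each full conditional is a univariate normal, so the sampler just alternates two easy draws. The sketch below is a minimal illustration (the correlation value, chain length, and burn-in are arbitrary choices for the example, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

# Gibbs sampling from a standard bivariate normal with correlation rho:
# the full conditionals are x | y ~ N(rho * y, 1 - rho**2) and
# y | x ~ N(rho * x, 1 - rho**2), so we alternate draws from them.
rho = 0.8
sd = np.sqrt(1 - rho**2)
x, y = 0.0, 0.0
samples = []
for _ in range(20000):
    x = rng.normal(rho * y, sd)
    y = rng.normal(rho * x, sd)
    samples.append((x, y))

xs, ys = np.array(samples[1000:]).T   # discard burn-in draws
```

The retained draws can then be used for the sample-based inference summaries the paper discusses: their empirical mean and correlation approximate the target's mean (0) and correlation (rho).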
31 Aug 1989
TL;DR: This new fourth edition of Peter Lee's book looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo, providing a concise account of the way in which the Bayesian approach to statistics develops as well as how it contrasts with the conventional approach.
Abstract: Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee's book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well as how it contrasts with the conventional approach. The theory is built up step by step, and important notions such as sufficiency are brought out of a discussion of the salient features of specific examples. This edition:
- Includes expanded coverage of Gibbs sampling, including more numerical examples and treatments of OpenBUGS, R2WinBUGS and R2OpenBUGS.
- Presents significant new material on recent techniques such as Bayesian importance sampling, variational Bayes, Approximate Bayesian Computation (ABC) and Reversible Jump Markov Chain Monte Carlo (RJMCMC).
- Provides extensive examples throughout the book to complement the theory presented.
- Is accompanied by a supporting website featuring new material and solutions.
More and more students are realizing that they need to learn Bayesian statistics to meet their academic and professional goals. This book is best suited for use as a main text in courses on Bayesian statistics for third and fourth year undergraduates and postgraduate students.
1,407 citations
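The prior-times-likelihood-gives-posterior recipe described in the abstract above is easiest to see in a conjugate case. The sketch below (an illustrative example, not taken from the book; the counts are made up) updates a Beta prior on a success probability with binomial data, where the posterior is available in closed form.

```python
from fractions import Fraction

# Conjugate Beta-Binomial update: with a Beta(a, b) prior on a success
# probability and k successes observed in n trials, the posterior is
# Beta(a + k, b + n - k); its mean blends prior pseudo-counts and data.
a, b = 2, 2            # prior pseudo-counts (a weak symmetric prior)
k, n = 7, 10           # observed successes out of n trials

post_a, post_b = a + k, b + (n - k)
post_mean = Fraction(post_a, post_a + post_b)   # (a + k) / (a + b + n)
```

The posterior mean lies between the prior mean (1/2) and the sample proportion (7/10), with the data pulling harder as n grows; this is the basic mechanism on which the MCMC machinery covered later in the book builds when no closed form exists.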
05 Jul 2008
TL;DR: This paper presents a fully Bayesian treatment of the Probabilistic Matrix Factorization (PMF) model in which model capacity is controlled automatically by integrating over all model parameters and hyperparameters and shows that Bayesian PMF models can be efficiently trained using Markov chain Monte Carlo methods by applying them to the Netflix dataset.
Abstract: Low-rank matrix approximation methods provide one of the simplest and most effective approaches to collaborative filtering. Such models are usually fitted to data by finding a MAP estimate of the model parameters, a procedure that can be performed efficiently even on very large datasets. However, unless the regularization parameters are tuned carefully, this approach is prone to overfitting because it finds a single point estimate of the parameters. In this paper we present a fully Bayesian treatment of the Probabilistic Matrix Factorization (PMF) model in which model capacity is controlled automatically by integrating over all model parameters and hyperparameters. We show that Bayesian PMF models can be efficiently trained using Markov chain Monte Carlo methods by applying them to the Netflix dataset, which consists of over 100 million movie ratings. The resulting models achieve significantly higher prediction accuracy than PMF models trained using MAP estimation.
1,394 citations
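The core of the MCMC training described in the abstract above is that, with a Gaussian likelihood and Gaussian priors on the factors, each user and item factor vector has a Gaussian full conditional. The sketch below illustrates just that Gibbs step on a tiny fully observed synthetic matrix with fixed hyperparameters (the paper additionally samples the hyperparameters and handles missing ratings; all dimensions, precisions, and data here are invented for the example).

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny synthetic rank-2 rating matrix plus Gaussian noise.
n_users, n_items, d = 8, 6, 2
noise_sd = 0.3
beta = 1.0 / noise_sd**2     # observation precision (assumed known here)
alpha = 1.0                  # prior precision on the factors

true_U = rng.normal(size=(n_users, d))
true_V = rng.normal(size=(n_items, d))
R = true_U @ true_V.T + rng.normal(scale=noise_sd, size=(n_users, n_items))

def sample_factors(F_other, R_block):
    """Draw each factor row from its Gaussian full conditional.

    With R fully observed, every row shares the same conditional
    precision; with missing ratings it would differ per row.
    """
    prec = alpha * np.eye(d) + beta * F_other.T @ F_other
    cov = np.linalg.inv(prec)
    means = beta * R_block @ F_other @ cov          # cov is symmetric
    return np.array([rng.multivariate_normal(m, cov) for m in means])

U = rng.normal(size=(n_users, d))
V = rng.normal(size=(n_items, d))
preds = []
for sweep in range(200):
    U = sample_factors(V, R)       # users given items
    V = sample_factors(U, R.T)     # items given users
    if sweep >= 50:                # average predictions over posterior draws
        preds.append(U @ V.T)

R_hat = np.mean(preds, axis=0)
rmse = np.sqrt(np.mean((R_hat - R) ** 2))
```

Averaging predictions over posterior draws, rather than using a single MAP point estimate, is what gives the automatic capacity control the abstract refers to: no per-dataset tuning of regularization strength is needed.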
01 May 1994
TL;DR: Modelling and Analysis of Cross-sectional Data: A Review of Univariate Generalized Linear Models and Models for Multicategorical Responses and Semi- and Nonparametric Approaches to Regression Analysis.
Abstract: Introduction * Modelling and Analysis of Cross-sectional Data: A Review of Univariate Generalized Linear Models * Models for Multicategorical Responses: Multivariate Extensions of Generalized Linear Models * Selecting and Checking Models * Semi- and Nonparametric Approaches to Regression Analysis * Fixed Parameter Models for Time Series and Longitudinal Data * Random Effects Models * State Space and Hidden Markov Models * Survival Models
1,355 citations