scispace - formally typeset

Bayes' theorem

About: Bayes' theorem is a research topic. Over its lifetime, 13,158 publications have been published within this topic, receiving 563,695 citations. The topic is also known as: Bayes theorem & Bayes' rule.
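As a refresher on the theorem itself, the rule P(H | E) = P(E | H)·P(H) / P(E) can be sketched in a few lines. The diagnostic-test setting and all numbers below (sensitivity, specificity, prevalence) are purely illustrative:

```python
# Bayes' theorem applied to a diagnostic test: the probability of
# disease given a positive result. All numbers are hypothetical.

def posterior(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes' rule."""
    # P(positive) = P(pos | disease)P(disease) + P(pos | healthy)P(healthy)
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

p = posterior(prior=0.01, sensitivity=0.99, specificity=0.95)
print(round(p, 4))  # ≈ 0.1667: most positives are still false positives
```

Even with a 99%-sensitive test, a rare condition yields a posterior of only about 1/6, which is the classic base-rate result the theorem formalizes.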


Papers
Journal ArticleDOI
TL;DR: A range of Bayesian hierarchical models using the Markov chain Monte Carlo software WinBUGS is presented; these multivariate random effects models allow for variation in true treatment effects across trials, and both homogeneous and heterogeneous between-trials variance structures are considered.
Abstract: Mixed treatment comparison (MTC) meta-analysis is a generalization of standard pairwise meta-analysis for A vs B trials, to data structures that include, for example, A vs B, B vs C, and A vs C trials. There are two roles for MTC: one is to strengthen inference concerning the relative efficacy of two treatments, by including both 'direct' and 'indirect' comparisons. The other is to facilitate simultaneous inference regarding all treatments, in order for example to select the best treatment. In this paper, we present a range of Bayesian hierarchical models using the Markov chain Monte Carlo software WinBUGS. These are multivariate random effects models that allow for variation in true treatment effects across trials. We consider models where the between-trials variance is homogeneous across treatment comparisons as well as heterogeneous variance models. We also compare models with fixed (unconstrained) baseline study effects with models with random baselines drawn from a common distribution. These models are applied to an illustrative data set and posterior parameter distributions are compared. We discuss model critique and model selection, illustrating the role of Bayesian deviance analysis, and node-based model criticism. The assumptions underlying the MTC models and their parameterization are also discussed.
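The "direct plus indirect" logic of MTC described in this abstract can be illustrated with a minimal sketch. Under the consistency assumption, an indirect B-vs-C estimate is d_AC − d_AB, and it can be pooled with the direct estimate by inverse-variance weighting. All effect sizes and variances below are made-up numbers, not data from the paper:

```python
# Indirect comparison in a treatment network: under consistency,
# d_BC = d_AC - d_AB, and pooling direct with indirect evidence
# strengthens the B-vs-C inference. Numbers are hypothetical.

def inverse_variance_pool(estimates, variances):
    """Fixed-effect pooling: precision-weighted mean and its variance."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    return mean, 1.0 / sum(weights)

d_AB, var_AB = -0.5, 0.04          # A vs B trials (e.g. log odds ratio)
d_AC, var_AC = -0.8, 0.05          # A vs C trials
d_BC_direct, var_BC = -0.25, 0.06  # B vs C trials

d_BC_indirect = d_AC - d_AB          # -0.3
var_BC_indirect = var_AC + var_AB    # 0.09: variances add for a difference

pooled, pooled_var = inverse_variance_pool(
    [d_BC_direct, d_BC_indirect], [var_BC, var_BC_indirect])
print(round(pooled, 3), round(pooled_var, 3))  # -0.27 0.036
```

The full hierarchical MTC models in the paper additionally place random effects on the trial-level parameters; this sketch shows only the consistency relation and the gain in precision from combining both evidence sources.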

1,861 citations

Journal ArticleDOI
15 Jan 2004-Nature
TL;DR: This work shows that subjects internally represent both the statistical distribution of the task and their sensory uncertainty, combining them in a manner consistent with a performance-optimizing bayesian process.
Abstract: When we learn a new motor skill, such as playing an approaching tennis ball, both our sensors and the task possess variability. Our sensors provide imperfect information about the ball's velocity, so we can only estimate it. Combining information from multiple modalities can reduce the error in this estimate. On a longer time scale, not all velocities are a priori equally probable, and over the course of a match there will be a probability distribution of velocities. According to bayesian theory, an optimal estimate results from combining information about the distribution of velocities-the prior-with evidence from sensory feedback. As uncertainty increases, when playing in fog or at dusk, the system should increasingly rely on prior knowledge. To use a bayesian strategy, the brain would need to represent the prior distribution and the level of uncertainty in the sensory feedback. Here we control the statistical variations of a new sensorimotor task and manipulate the uncertainty of the sensory feedback. We show that subjects internally represent both the statistical distribution of the task and their sensory uncertainty, combining them in a manner consistent with a performance-optimizing bayesian process. The central nervous system therefore employs probabilistic models during sensorimotor learning.
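The prior-plus-likelihood combination the subjects appear to perform has a closed form when both distributions are Gaussian: the posterior mean is a precision-weighted average, so noisier sensory feedback pulls the estimate toward the prior. A minimal sketch with illustrative numbers (not the paper's task parameters):

```python
# Gaussian prior combined with a Gaussian sensory likelihood:
# the posterior mean weights the observation by its relative precision.
# As sensory noise grows, the estimate shrinks toward the prior mean.

def posterior_mean(prior_mu, prior_var, obs, obs_var):
    w = prior_var / (prior_var + obs_var)  # weight on the observation
    return (1 - w) * prior_mu + w * obs

# Clear viewing: low sensory noise, estimate stays near the observation.
print(posterior_mean(prior_mu=0.0, prior_var=1.0, obs=2.0, obs_var=0.1))
# "Fog": high sensory noise, estimate is pulled toward the prior.
print(posterior_mean(prior_mu=0.0, prior_var=1.0, obs=2.0, obs_var=4.0))  # 0.4
```

This is exactly the qualitative prediction tested in the experiment: increasing feedback uncertainty should increase reliance on the learned prior.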

1,811 citations

Journal ArticleDOI
TL;DR: A Bayesian probabilistic framework for microarray data analysis is developed that derives point estimates for both parameters and hyperparameters, and regularized expressions for the variance of each gene by combining the empirical variance with a local background variance associated with neighboring genes.
Abstract: Motivation: DNA microarrays are now capable of providing genome-wide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory due to the lack of a systematic framework that can accommodate noise, variability, and low replication often typical of microarray data. Results: We develop a Bayesian probabilistic framework for microarray data analysis. At the simplest level, we model log-expression values by independent normal distributions, parameterized by corresponding means and variances with hierarchical prior distributions. We derive point estimates for both parameters and hyperparameters, and regularized expressions for the variance of each gene by combining the empirical variance with a local background variance associated with neighboring genes. An additional hyperparameter, inversely related to the number of empirical observations, determines the strength of the background variance. Simulations show that these point estimates, combined with a t-test, provide a systematic inference approach that compares favorably with simple t-test or fold methods, and partly compensate for the lack of replication.
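The variance regularization described in the Results can be sketched as a weighted average of a gene's empirical variance and a local background variance, with a pseudo-count hyperparameter setting the shrinkage strength. The function name and all numbers below are illustrative, not the paper's notation:

```python
# Regularized variance in the spirit of the hierarchical framework above:
# shrink the empirical variance toward a background variance, with the
# pseudo-count v0 (inversely related to replication) setting the strength.

def regularized_variance(emp_var, n, background_var, v0):
    """Weighted combination of empirical and background variance."""
    return (v0 * background_var + (n - 1) * emp_var) / (v0 + n - 1)

# With 3 replicates the background dominates; with 30 it barely matters.
print(regularized_variance(emp_var=0.5, n=3,  background_var=2.0, v0=10))  # 1.75
print(regularized_variance(emp_var=0.5, n=30, background_var=2.0, v0=10))
```

Plugging the regularized variance into an ordinary t-statistic gives the moderated test the abstract reports as compensating for low replication.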

1,763 citations

Journal ArticleDOI
TL;DR: A Bayesian MCMC approach to the analysis of combined data sets was developed and its utility in inferring relationships among gall wasps based on data from morphology and four genes was explored, supporting the utility of morphological data in multigene analyses.
Abstract: The recent development of Bayesian phylogenetic inference using Markov chain Monte Carlo (MCMC) techniques has facilitated the exploration of parameter-rich evolutionary models. At the same time, stochastic models have become more realistic (and complex) and have been extended to new types of data, such as morphology. Based on this foundation, we developed a Bayesian MCMC approach to the analysis of combined data sets and explored its utility in inferring relationships among gall wasps based on data from morphology and four genes (nuclear and mitochondrial, ribosomal and protein coding). Examined models range in complexity from those recognizing only a morphological and a molecular partition to those having complex substitution models with independent parameters for each gene. Bayesian MCMC analysis deals efficiently with complex models: convergence occurs faster and more predictably for complex models, mixing is adequate for all parameters even under very complex models, and the parameter update cycle is virtually unaffected by model partitioning across sites. Morphology contributed only 5% of the characters in the data set but nevertheless influenced the combined-data tree, supporting the utility of morphological data in multigene analyses. We used Bayesian criteria (Bayes factors) to show that process heterogeneity across data partitions is a significant model component, although not as important as among-site rate variation. More complex evolutionary models are associated with more topological uncertainty and less conflict between morphology and molecules. Bayes factors sometimes favor simpler models over considerably more parameter-rich models, but the best model overall is also the most complex and Bayes factors do not support exclusion of apparently weak parameters from this model. 
Thus, Bayes factors appear to be useful for selecting among complex models, but it is still unclear whether their use strikes a reasonable balance between model complexity and error in parameter estimates.
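A toy example of Bayes-factor model comparison, far simpler than comparing partitioned phylogenetic models but showing the same quantity: a point-null binomial model versus a uniform-prior alternative, where both marginal likelihoods have closed forms. The data values are invented for illustration:

```python
# Bayes factor for k successes in n Bernoulli trials:
# M0 fixes p = 0.5; M1 places a uniform Beta(1, 1) prior on p,
# whose marginal likelihood integrates to 1 / (n + 1).
from math import comb

def bayes_factor(k, n):
    m0 = comb(n, k) * 0.5 ** n  # marginal likelihood under p = 0.5
    m1 = 1.0 / (n + 1)          # marginal likelihood under the uniform prior
    return m1 / m0              # BF_10: evidence for the alternative

print(bayes_factor(k=9, n=10))  # > 1: data favor the richer model
print(bayes_factor(k=5, n=10))  # < 1: data favor the simpler point null
```

The second call illustrates the automatic complexity penalty the abstract alludes to: when the data sit exactly where the simple model predicts, the Bayes factor favors the simpler model even though the richer one nests it.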

1,758 citations

Journal ArticleDOI
TL;DR: In this paper, a cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model.
Abstract: New techniques for the analysis of stochastic volatility models in which the logarithm of conditional variance follows an autoregressive model are developed. A cyclic Metropolis algorithm is used to construct a Markov-chain simulation tool. Simulations from this Markov chain converge in distribution to draws from the posterior distribution, enabling exact finite-sample inference. The exact solution to the filtering/smoothing problem of inferring about the unobserved variance states is a by-product of our Markov-chain method. In addition, multistep-ahead predictive densities can be constructed that reflect both inherent model variability and parameter uncertainty. We illustrate our method by analyzing both daily and weekly data on stock returns and exchange rates. Sampling experiments are conducted to compare the performance of Bayes estimators to method of moments and quasi-maximum likelihood estimators proposed in the literature. In both parameter estimation and filtering, the Bayes estimators outperform ...
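The cyclic Metropolis scheme builds on the basic random-walk Metropolis step. Below is a minimal, self-contained sketch targeting a standard normal log posterior so that correctness is easy to check; the stochastic-volatility application would substitute the model's actual log posterior, and the step size and seed are arbitrary choices:

```python
# Random-walk Metropolis: propose a local move, accept with probability
# min(1, posterior ratio). Target here is a standard normal, for which
# the long-run sample mean should be near 0 and the variance near 1.
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0, seed=0):
    rng = random.Random(seed)
    x, logp = x0, log_target(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        logp_prop = log_target(prop)
        # Accept with probability min(1, exp(logp_prop - logp)).
        if math.log(rng.random()) < logp_prop - logp:
            x, logp = prop, logp_prop
        samples.append(x)
    return samples

draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
print(round(sum(draws) / len(draws), 2))  # near 0 for the N(0, 1) target
```

A "cyclic" variant applies such a step to each block of parameters (or each latent variance state) in turn within every sweep, rather than updating everything at once.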

1,711 citations


Network Information
Related Topics (5)
- Inference: 36.8K papers, 1.3M citations (90% related)
- Estimator: 97.3K papers, 2.6M citations (85% related)
- Markov chain: 51.9K papers, 1.3M citations (85% related)
- Probability distribution: 40.9K papers, 1.1M citations (82% related)
- Probabilistic logic: 56K papers, 1.3M citations (81% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    483
2022    1,082
2021    508
2020    533
2019    610
2018    607