A simple introduction to Markov Chain Monte-Carlo sampling.
TL;DR
This article provides a very basic introduction to MCMC sampling, describing what MCMC is and what it can be used for, with simple illustrative examples.

Abstract
Markov Chain Monte Carlo (MCMC) is an increasingly popular method for obtaining information about distributions, especially for estimating posterior distributions in Bayesian inference. This article provides a very basic introduction to MCMC sampling. It describes what MCMC is, and what it can be used for, with simple illustrative examples. Highlighted are some of the benefits and limitations of MCMC sampling, as well as different approaches to circumventing the limitations most likely to trouble cognitive scientists.
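As a concrete illustration of the sampling idea summarized above, here is a minimal sketch of the Metropolis algorithm (not code from the article itself; the target and parameter choices are illustrative). It draws from a standard normal target, specified only up to a constant via its log-density, using a symmetric random-walk proposal:

```python
import math
import random

def metropolis_sample(log_target, n_samples, x0=0.0, proposal_sd=1.0, seed=0):
    """Draw correlated samples from an unnormalized target density
    using the Metropolis algorithm with a Gaussian random-walk proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a new state from a symmetric Gaussian centred at x.
        x_new = x + rng.gauss(0.0, proposal_sd)
        # Accept with probability min(1, target(x_new) / target(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(x_new) - log_target(x):
            x = x_new
        samples.append(x)
    return samples

# Illustrative target: standard normal, log-density up to a constant.
draws = metropolis_sample(lambda x: -0.5 * x * x, n_samples=20000)
burned = draws[5000:]  # discard burn-in before summarizing
mean = sum(burned) / len(burned)
```

Because the proposal is symmetric, the Hastings correction cancels and only the target-density ratio appears in the acceptance step; the retained draws approximate the target once the chain has converged.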
Citations
Journal Article
Bayesian inference for psychology. Part I: Theoretical advantages and practical ramifications.
Eric-Jan Wagenmakers, Maarten Marsman, Tahira Jamil, Alexander Ly, Josine Verhagen, Jonathon Love, Ravi Selker, Quentin Frederik Gronau, Martin Šmíra, Sacha Epskamp, Dora Matzke, Jeffrey N. Rouder, Richard D. Morey +12 more
TL;DR: Ten prominent advantages of the Bayesian approach are outlined, and several objections to Bayesian hypothesis testing are countered.
Journal Article
Ordinal Regression Models in Psychology: A Tutorial
TL;DR: Ordinal variables, although extremely common in psychology, are almost exclusively analyzed with statistical models that falsely assume them to be metric, which can lead to distorted effects.
Journal Article
Uncertainty models for stochastic optimization in renewable energy applications
TL;DR: It is concluded, based on the surveyed literature, that stochastic optimization methods almost always outperform deterministic optimization methods in terms of the social, technical, and economic aspects of renewable energy systems.
Journal Article
Introduction to Bayesian Inference for Psychology.
TL;DR: The fundamental tenets of Bayesian inference are introduced, which derive from two basic laws of probability theory, and the interpretation of probabilities, discrete and continuous versions of Bayes’ rule, parameter estimation, and model comparison are covered.
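The discrete version of Bayes' rule mentioned in that summary can be sketched in a few lines. The coin-bias scenario below is a hypothetical example, not taken from the cited article: two hypotheses (fair coin vs. a coin with P(heads) = 0.8) are compared after observing 8 heads in 10 flips.

```python
from math import comb

def posterior(priors, likelihoods):
    """Discrete Bayes' rule: posterior is proportional to prior times
    likelihood, normalized so the posterior probabilities sum to one."""
    unnorm = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

heads, flips = 8, 10
# Binomial likelihood of the data under each hypothesis.
lik_fair = comb(flips, heads) * 0.5**heads * 0.5**(flips - heads)
lik_biased = comb(flips, heads) * 0.8**heads * 0.2**(flips - heads)
# Equal prior probability on each hypothesis.
post = posterior([0.5, 0.5], [lik_fair, lik_biased])
```

With equal priors, the posterior odds reduce to the likelihood ratio, so the data shift most of the probability toward the biased-coin hypothesis.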
Posted Content
Error mitigation with Clifford quantum-circuit data
TL;DR: A novel, scalable error-mitigation method that applies to gate-based quantum computers and obtains an order-of-magnitude error reduction for a ground-state energy problem on 16 qubits in an IBMQ quantum computer and on a 64-qubit noisy simulator.
References
Journal Article
Inference from Iterative Simulation Using Multiple Sequences
Andrew Gelman, Donald B. Rubin +1 more
TL;DR: The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization; the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
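The multiple-sequence convergence diagnostic associated with Gelman and Rubin can be sketched as follows. This is a simplified version of the potential scale reduction factor (R-hat) computed from whole chains; the function and variable names are illustrative:

```python
import math
import random

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for m chains of length n:
    compares between-chain variance B with within-chain variance W."""
    m = len(chains)
    n = len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    # Between-chain variance of the per-chain means.
    B = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)
    # Average within-chain sample variance.
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m
    # Pooled variance estimate; R-hat near 1 suggests convergence.
    var_hat = (n - 1) / n * W + B / n
    return math.sqrt(var_hat / W)

# Two chains drawn from the same distribution should give R-hat near 1.
rng = random.Random(1)
chains = [[rng.gauss(0.0, 1.0) for _ in range(2000)] for _ in range(2)]
rhat = gelman_rubin(chains)
```

When the chains have not mixed, their means disagree, B dominates, and R-hat rises well above 1, signalling that more iterations (or better proposals) are needed.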
Book
Signal detection theory and psychophysics
David M. Green, John A. Swets +1 more
TL;DR: This book discusses statistical decision theory and sensory processes in signal detection theory and psychophysics, and describes how these processes affect decision-making.
Book
Markov Chain Monte Carlo in Practice
TL;DR: The book's chapters cover Markov chain Monte Carlo implementation (results, summary, and discussion); medical monitoring, including modelling, computing posterior distributions, forecasting, model criticism, and an illustrative application; and MCMC for nonlinear hierarchical models.
Journal Article
A Theory of Memory Retrieval.
TL;DR: A theory of memory retrieval is developed and is shown to apply over a range of experimental paradigms, and it is noted that neural network models can be interfaced to the retrieval theory with little difficulty and that semantic memory models may benefit from such a retrieval scheme.