Book

Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference

01 Oct 1997
TL;DR: A book-length treatment of stochastic simulation for Bayesian inference, proceeding from random variate generation, Bayesian modeling, and Markov chain theory to Gibbs sampling, Metropolis-Hastings algorithms, and further topics in MCMC such as model adequacy, model choice via MCMC over joint model and parameter spaces, and convergence acceleration.
Abstract: Table of contents:
Introduction
Stochastic simulation: Introduction; Generation of Discrete Random Quantities; Generation of Continuous Random Quantities; Generation of Random Vectors and Matrices; Resampling Methods; Exercises
Bayesian Inference: Introduction; Bayes' Theorem; Conjugate Distributions; Hierarchical Models; Dynamic Models; Spatial Models; Model Comparison; Exercises
Approximate methods of inference: Introduction; Asymptotic Approximations; Approximations by Gaussian Quadrature; Monte Carlo Integration; Methods Based on Stochastic Simulation; Exercises
Markov chains: Introduction; Definition and Transition Probabilities; Decomposition of the State Space; Stationary Distributions; Limiting Theorems; Reversible Chains; Continuous State Spaces; Simulation of a Markov Chain; Data Augmentation or Substitution Sampling; Exercises
Gibbs Sampling: Introduction; Definition and Properties; Implementation and Optimization; Convergence Diagnostics; Applications; MCMC-Based Software for Bayesian Modeling; Appendix 5.A: BUGS Code for Example 5.7; Appendix 5.B: BUGS Code for Example 5.8; Exercises
Metropolis-Hastings algorithms: Introduction; Definition and Properties; Special Cases; Hybrid Algorithms; Applications; Exercises
Further topics in MCMC: Introduction; Model Adequacy; Model Choice: MCMC Over Model and Parameter Spaces; Convergence Acceleration; Exercises
References; Author Index; Subject Index
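To make the book's central algorithm concrete, here is a minimal random-walk Metropolis sampler, the subject of the Metropolis-Hastings chapter. This is an illustrative Python sketch, not code from the book (whose appendices use BUGS); the target density, step size, and iteration count are arbitrary choices.

```python
import numpy as np

def metropolis(log_target, x0, n_iter=10_000, step=1.0, seed=0):
    """Random-walk Metropolis: propose x' = x + step*N(0,1) and accept
    with probability min(1, pi(x')/pi(x))."""
    rng = np.random.default_rng(seed)
    x, logp = x0, log_target(x0)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        proposal = x + step * rng.standard_normal()
        logp_prop = log_target(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:  # accept/reject step
            x, logp = proposal, logp_prop
        samples[i] = x
    return samples

# Example: sample a standard normal target (log-density up to a constant).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(draws.mean(), draws.std())  # should be near 0 and 1
```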
Citations
Journal ArticleDOI
TL;DR: In this paper, a fast Markov chain Monte Carlo exploration of cosmological parameter space is presented, which combines data from the CMB, HST Key Project, 2dF galaxy redshift survey, supernovae type Ia and big-bang nucleosynthesis.
Abstract: We present a fast Markov chain Monte Carlo exploration of cosmological parameter space. We perform a joint analysis of results from recent cosmic microwave background (CMB) experiments and provide parameter constraints, including σ8, from the CMB independent of other data. We next combine data from the CMB, HST Key Project, 2dF galaxy redshift survey, supernovae type Ia and big-bang nucleosynthesis. The Monte Carlo method allows the rapid investigation of a large number of parameters, and we present results from 6 and 9 parameter analyses of flat models, and an 11 parameter analysis of non-flat models. Our results include constraints on the neutrino mass (mν ≲ 0.3 eV), equation of state of the dark energy, and the tensor amplitude, as well as demonstrating the effect of additional parameters on the base parameter constraints. In a series of appendixes we describe the many uses of importance sampling, including computing results from new data and accuracy correction of results generated from an approximate method. We also discuss the different ways of converting parameter samples to parameter constraints, the effect of the prior, assess the goodness of fit and consistency, and describe the use of analytic marginalization over normalization parameters.
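The importance-sampling trick described in the appendixes — reusing an existing chain to fold in new data — amounts to reweighting each sample by the new likelihood. A hedged Python sketch follows; the chain, the new measurement, and its Gaussian form are invented for illustration.

```python
import numpy as np

def importance_reweight(samples, new_loglike):
    """Weight posterior draws by an extra log-likelihood term, so that
    weighted averages approximate expectations under the updated posterior."""
    logw = np.array([new_loglike(s) for s in samples])
    logw -= logw.max()            # stabilize the exponentiation
    w = np.exp(logw)
    return w / w.sum()

# Hypothetical example: draws for one parameter from an earlier chain,
# plus a new Gaussian measurement of that parameter (0.8 +/- 0.1).
rng = np.random.default_rng(1)
chain = rng.normal(0.7, 0.2, size=5_000)       # stand-in for MCMC output
w = importance_reweight(chain, lambda t: -0.5 * ((t - 0.8) / 0.1) ** 2)
print("updated posterior mean:", np.sum(w * chain))
```

Reweighting works well only when the updated posterior lies within the region the original chain explored; a strongly shifted likelihood leaves few samples with appreciable weight.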

3,550 citations

Journal ArticleDOI
TL;DR: In this paper, the authors used a gamma autoregressive semiparametric model with an arbitrary number of subdivisions along the sediment to estimate the ages of sediment cores.
Abstract: Radiocarbon dating is routinely used in paleoecology to build chronologies of lake and peat sediments, aiming at inferring a model that would relate the sediment depth with its age. We present a new approach for chronology building (called "Bacon") that has received enthusiastic attention by paleoecologists. Our methodology is based on controlling core accumulation rates using a gamma autoregressive semiparametric model with an arbitrary number of subdivisions along the sediment. Using prior knowledge about accumulation rates is crucial and informative priors are routinely used. Since many sediment cores are currently analyzed, using different data sets and prior distributions, a robust (adaptive) MCMC is very useful. We use the t-walk (Christen and Fox, 2010), a self-adjusting, robust MCMC sampling algorithm, that works acceptably well in many situations. Outliers are also addressed using a recent approach that considers a Student-t model for radiocarbon data. Two examples are presented here, that of a peat core and a core from a lake, and our results are compared with other approaches. Past climates and environments can be reconstructed from deposits such as ocean or lake sediments, ice sheets and peat bogs. Within a vertical sediment profile (core), measurements of microfossils, macrofossils, isotopes and other variables at a range of depths serve as proxy estimates or "proxies" of climate and environmental conditions when the sediment of those depths was deposited. It is crucial to establish reliable relationships between these depths and their ages. Age-depth relationships are used to study the evolution of climate/environmental proxies along sediment depth and therefore through time (e.g., Lowe and Walker 1997). Age-depth models are constructed in various ways. For sediment depths containing organic matter, and for ages younger than c. 50,000 years, radiocarbon dating is often used to create an age-depth model. Cores are divided into slices and some of these are radiocarbon dated. A curve is fitted to the radiocarbon data and interpolated to obtain an age estimate for every depth of the core. The first restriction to be considered is that age should be increasing monotonically with depth, because sediment can never have accumulated backwards in time (extraordinary events leading to mixed or reversed sediments are, most of the time, noticeable in the stratigraphy and therefore such cores are ruled out from further analyses). Moreover, cores may have missing sections, leading to flat parts in the age-depth models.
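The monotonicity restriction discussed above is enforced in Bacon by construction: ages are cumulative sums of positive, gamma-distributed accumulation increments. The Python sketch below simulates one draw from such a prior; it omits Bacon's autoregressive memory between neighbouring rates and the radiocarbon likelihood, and the mean rate and shape values are assumptions chosen for illustration.

```python
import numpy as np

def simulate_age_depth(depths, mean_acc=20.0, shape=1.5, seed=0):
    """Draw one age-depth curve from a gamma accumulation-rate prior.
    depths: section boundaries (cm, increasing); mean_acc: prior mean
    accumulation rate (yr/cm). Increments are positive by construction,
    so age increases monotonically with depth."""
    rng = np.random.default_rng(seed)
    thickness = np.diff(depths)                                  # cm/section
    rates = rng.gamma(shape, mean_acc / shape, thickness.size)   # yr/cm
    return np.concatenate(([0.0], np.cumsum(rates * thickness)))

depths = np.arange(0, 101, 5)        # a 100 cm core in 5 cm sections
print(simulate_age_depth(depths))    # ages (yr) at each section boundary
```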

2,591 citations

Journal ArticleDOI
TL;DR: An excess of B-mode power over the base lensed-ΛCDM expectation is found in the range 30 < ℓ < 150, inconsistent with the null hypothesis at a significance of >5σ, and it is shown that systematic contamination is much smaller than the observed excess.
Abstract: We report results from the BICEP2 experiment, a cosmic microwave background (CMB) polarimeter specifically designed to search for the signal of inflationary gravitational waves in the B-mode power spectrum around ℓ ∼ 80. The telescope comprised a 26 cm aperture all-cold refracting optical system equipped with a focal plane of 512 antenna-coupled transition edge sensor 150 GHz bolometers, each with temperature sensitivity of ≈300 μK_CMB·√s. BICEP2 observed from the South Pole for three seasons from 2010 to 2012. A low-foreground region of sky with an effective area of 380 square degrees was observed to a depth of 87 nK·deg in Stokes Q and U. In this paper we describe the observations, data reduction, maps, simulations, and results. We find an excess of B-mode power over the base lensed-ΛCDM expectation in the range 30 < ℓ < 150, inconsistent with the null hypothesis at a significance of >5σ. Through jackknife tests and simulations based on detailed calibration measurements we show that systematic contamination is much smaller than the observed excess. Cross-correlating against WMAP 23 GHz maps we find that Galactic synchrotron makes a negligible contribution to the observed signal. We also examine a number of available models of polarized dust emission and find that at their default parameter values they predict power ∼(5–10)× smaller than the observed excess signal (with no significant cross-correlation with our maps). However, these models are not sufficiently constrained by external public data to exclude the possibility of dust emission bright enough to explain the entire excess signal. Cross-correlating BICEP2 against 100 GHz maps from the BICEP1 experiment, the excess signal is confirmed with 3σ significance and its spectral index is found to be consistent with that of the CMB, disfavoring dust at 1.7σ. The observed B-mode power spectrum is well fit by a lensed-ΛCDM+tensor theoretical model with tensor-to-scalar ratio r = 0.20 (+0.07, −0.05), with r = 0 disfavored at 7.0σ. Accounting for the contribution of foreground dust will shift this value downward by an amount which will be better constrained with upcoming data sets.

1,954 citations

Journal ArticleDOI
TL;DR: A Bayesian MCMC approach to the analysis of combined data sets was developed and its utility in inferring relationships among gall wasps based on data from morphology and four genes was explored, supporting the utility of morphological data in multigene analyses.
Abstract: The recent development of Bayesian phylogenetic inference using Markov chain Monte Carlo (MCMC) techniques has facilitated the exploration of parameter-rich evolutionary models. At the same time, stochastic models have become more realistic (and complex) and have been extended to new types of data, such as morphology. Based on this foundation, we developed a Bayesian MCMC approach to the analysis of combined data sets and explored its utility in inferring relationships among gall wasps based on data from morphology and four genes (nuclear and mitochondrial, ribosomal and protein coding). Examined models range in complexity from those recognizing only a morphological and a molecular partition to those having complex substitution models with independent parameters for each gene. Bayesian MCMC analysis deals efficiently with complex models: convergence occurs faster and more predictably for complex models, mixing is adequate for all parameters even under very complex models, and the parameter update cycle is virtually unaffected by model partitioning across sites. Morphology contributed only 5% of the characters in the data set but nevertheless influenced the combined-data tree, supporting the utility of morphological data in multigene analyses. We used Bayesian criteria (Bayes factors) to show that process heterogeneity across data partitions is a significant model component, although not as important as among-site rate variation. More complex evolutionary models are associated with more topological uncertainty and less conflict between morphology and molecules. Bayes factors sometimes favor simpler models over considerably more parameter-rich models, but the best model overall is also the most complex and Bayes factors do not support exclusion of apparently weak parameters from this model. Thus, Bayes factors appear to be useful for selecting among complex models, but it is still unclear whether their use strikes a reasonable balance between model complexity and error in parameter estimates.
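The Bayes factor comparisons described above require a marginal likelihood for each candidate model. A common, if crude, estimator in Bayesian phylogenetics of this era was the harmonic mean of the likelihoods sampled by the chain; the Python sketch below shows the idea with simulated stand-in log-likelihood traces. The estimator is notoriously unstable, and stepping-stone or path sampling are preferred today.

```python
import numpy as np
from scipy.special import logsumexp

def log_marginal_harmonic(loglikes):
    """Harmonic-mean estimate of the log marginal likelihood from the
    log-likelihoods of posterior MCMC samples:
    m_hat = n / sum_i exp(-loglike_i)."""
    ll = np.asarray(loglikes)
    return np.log(ll.size) - logsumexp(-ll)

# 2*ln(Bayes factor) on the conventional Kass-Raftery scale, comparing
# two models from their chains' log-likelihood traces (simulated here
# as stand-ins for real MCMC output).
rng = np.random.default_rng(2)
ll_m1 = rng.normal(-1000.0, 2.0, size=5_000)
ll_m2 = rng.normal(-1005.0, 2.0, size=5_000)
bf = 2 * (log_marginal_harmonic(ll_m1) - log_marginal_harmonic(ll_m2))
print("2 ln BF (model 1 vs model 2):", bf)
```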

1,758 citations