Convergence in Variance of Chebyshev Accelerated Gibbs Samplers
Summary
1. Introduction.
- Iterations of the form (1.1) occur in many contexts; for example, they occur in the stationary linear iterative methods used to solve systems of linear equations.
- What makes this correspondence important is that the convergence properties of the solver are inherited by the sampler (and vice versa), so acceleration techniques developed for the solver may be applied to the sampler (see the sketch after this list).
- The main purpose of this paper is to establish the equivalence of convergence in mean and covariance under Chebyshev polynomial acceleration, without assuming that the target distribution is Gaussian.
- [Figure: along the horizontal axis are values of the ratio of the extreme eigenvalues of M.]
- By recursion, the authors prove that the error polynomial governing convergence of the sampler's covariance is the square of the error polynomial for the mean, so the variance converges at twice the asymptotic rate.
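To make the solver-sampler correspondence concrete, here is a minimal NumPy sketch, not taken from the paper, of the Gauss-Seidel splitting A = M - N with M = D + L: the same triangular sweep that solves Ax = b, when driven by noise with covariance M^T + N = D, is exactly one component-wise Gibbs sweep targeting N(0, A^{-1}). The test matrix and all names are illustrative.

```python
import numpy as np

def gibbs_sweep(A, y, rng):
    """One Gauss-Seidel sampler sweep targeting N(0, A^{-1}).

    Splitting A = M - N with M = D + L (lower triangle of A), so the
    injected noise c ~ N(0, D) has covariance M^T + N = D, and the update
    y_new = M^{-1} (N y + c) is one component-wise Gibbs sweep.
    """
    M = np.tril(A)                                   # M = D + L
    N = M - A                                        # N = -L^T
    c = np.sqrt(np.diag(A)) * rng.standard_normal(len(A))
    # Generic solve for simplicity; M is triangular, so a dedicated
    # triangular solve would be cheaper.
    return np.linalg.solve(M, N @ y + c)

# Toy SPD precision matrix (illustrative only).
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
rng = np.random.default_rng(0)

y, samples = np.zeros(3), []
for k in range(21000):
    y = gibbs_sweep(A, y, rng)
    if k >= 1000:                                    # discard burn-in
        samples.append(y)

print("empirical covariance:\n", np.cov(np.asarray(samples).T))
print("target covariance A^{-1}:\n", np.linalg.inv(A))
```

For the Gauss-Seidel splitting the noise covariance M^T + N collapses to the diagonal D, which is what makes each solver sweep cheap to randomize into a sampler sweep.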
- Axelsson [1, Rem. 5.11] points out two deficiencies of the first-order Chebyshev iterative method as a solver.
- First, the number of steps p must be selected in advance, and the method is not optimal for any other number of steps.
- The remedy for iterative solvers, and hence for iterative samplers, is to use second-order methods, which have neither of these deficiencies; a sketch of the second-order iteration follows this list.
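The sketch below follows the classical second-order (three-term recurrence) Chebyshev formulation found in standard references such as Saad's treatment of Chebyshev acceleration; it is not code from the paper. The coefficients are updated at every step, so no total step count p has to be fixed in advance. The eigenvalue bounds lmin and lmax are assumed known.

```python
import numpy as np

def chebyshev_solve(A, b, lmin, lmax, x0=None, iters=50):
    """Second-order Chebyshev iteration for A x = b, A SPD with
    eigenvalues in [lmin, lmax]. Coefficients are updated on the fly,
    so the iterate is optimal at every step."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    theta = 0.5 * (lmax + lmin)          # center of the spectrum
    delta = 0.5 * (lmax - lmin)          # half-width of the spectrum
    sigma = theta / delta
    rho = 1.0 / sigma
    r = b - A @ x
    d = r / theta
    for _ in range(iters):
        x = x + d
        r = r - A @ d
        rho_next = 1.0 / (2.0 * sigma - rho)
        d = rho_next * rho * d + (2.0 * rho_next / delta) * r
        rho = rho_next
    return x

# Toy test (illustrative): 1D Laplacian-like SPD tridiagonal matrix
# with known extreme eigenvalues 2 - 2 cos(k pi / (n+1)).
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
lmin = 2.0 - 2.0 * np.cos(np.pi / (n + 1))
lmax = 2.0 - 2.0 * np.cos(n * np.pi / (n + 1))
x = chebyshev_solve(A, np.ones(n), lmin, lmax, iters=200)
print("residual norm:", np.linalg.norm(np.ones(n) - A @ x))
```

Because sigma and rho are recomputed each iteration, stopping early still leaves an optimal iterate, in contrast to the first-order scheme whose parameters are tied to a prechosen p.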
5. Numerical examples sampling from Gaussians at different resolutions.
- Both examples use a cubic-element discretization of the cube [0, 1]^3, with trilinear interpolation from nodal values within each element.
- Both also use R = 1/4, though they differ in the number of nodes (or elements) in each coordinate direction; a sketch of a comparable multiresolution setup follows this list.
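The paper's trilinear FEM construction is not reproduced here. As a hypothetical stand-in, the sketch below builds a sparse precision matrix on an n x n x n lattice over the unit cube (a shifted Kronecker-sum 3D Laplacian) at two resolutions, just to show how problem size grows with the number of nodes per coordinate direction.

```python
import scipy.sparse as sp

def lattice_precision(n, eps=1e-2):
    """Sparse SPD precision matrix on an n x n x n lattice:
    a 3D Laplacian built as a Kronecker sum, plus eps*I so the
    matrix is nonsingular. A stand-in for the paper's trilinear
    FEM discretization, not a reproduction of it."""
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    I = sp.identity(n)
    A3 = (sp.kron(sp.kron(T, I), I)
          + sp.kron(sp.kron(I, T), I)
          + sp.kron(sp.kron(I, I), T))
    return (A3 + eps * sp.identity(n**3)).tocsc()

for n in (8, 16):   # two resolutions of the unit cube
    A = lattice_precision(n)
    print(f"n = {n:2d}: unknowns = {A.shape[0]}, nonzeros = {A.nnz}")
```

Doubling the nodes per coordinate direction multiplies the number of unknowns by eight, which is what makes comparing sampler performance across resolutions informative.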
6. Discussion.
- One of the motivations for undertaking this work was to understand the relationship between stochastic relaxation, as Geman and Geman labeled Gibbs sampling [8], and (classical) relaxation, the term Southwell used for early stationary iterative solvers [24].
- While performing this research the authors recognized the debt they owe to (the late) Gene Golub, who pioneered the first- and second-order Chebyshev acceleration for linear solvers [9] that they have built upon.
- The authors are pleased to demonstrate the connection between Gene's work and sampling algorithms from statistics by publishing in this journal, which Gene had wanted to remain titled the Journal on Scientific and Statistical Computing [22].