Abstract
This article introduces a general method for Bayesian computing in richly parameterized models, structured Markov chain Monte Carlo (SMCMC), that is based on a blocked hybrid of the Gibbs sampling and Metropolis-Hastings algorithms. SMCMC speeds algorithm convergence by using the structure that is present in the problem to suggest an appropriate Metropolis-Hastings candidate distribution. Although the approach is easiest to describe for hierarchical normal linear models, we show that its extension to both nonnormal and nonlinear cases is straightforward. After describing the method in detail we compare its performance (in terms of run time and autocorrelation in the samples) to other existing methods, including the single-site updating Gibbs sampler available in the popular BUGS software package. Our results suggest significant improvements in convergence for many problems using SMCMC, as well as broad applicability of the method, including previously intractable hierarchical nonlinear model settings.
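To make the "blocked hybrid of Gibbs and Metropolis-Hastings" idea concrete, the sketch below sets up a toy two-level normal model and updates all group-level parameters in a single block, with the model's structure supplying the candidate distribution. This is a minimal illustration under assumed notation (the model, the flat prior on `mu`, and all variable names are hypothetical), not the paper's actual SMCMC implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (not taken from the paper):
#   y_i ~ N(theta_i, sigma2),  theta_i ~ N(mu, tau2),  flat prior on mu.
sigma2, tau2 = 1.0, 0.5
y = rng.normal(0.0, np.sqrt(sigma2 + tau2), size=20)

def blocked_sampler(y, n_iter=2000):
    n = len(y)
    mu = y.mean()
    samples = np.empty((n_iter, 1 + n))
    for t in range(n_iter):
        # Blocked step: draw the whole theta vector jointly from its
        # full conditional, which factors into independent conjugate
        # normals in this conditionally conjugate setting.
        prec = 1.0 / sigma2 + 1.0 / tau2
        mean = (y / sigma2 + mu / tau2) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))
        # Metropolis-Hastings step for mu: the model structure suggests
        # a normal candidate centred at the full-conditional mean. Here
        # the candidate equals the full conditional, so the proposal is
        # always accepted (a Gibbs step is MH with acceptance prob. 1).
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / n))
        samples[t] = np.concatenate(([mu], theta))
    return samples

draws = blocked_sampler(y)
```

Updating `theta` as one block, rather than one component at a time as a single-site Gibbs sampler would, is what reduces autocorrelation between successive draws; in nonnormal or nonlinear settings the full conditional is no longer available in closed form, and a structure-based normal approximation would serve as the MH candidate instead.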
| Original language | English (US) |
|---|---|
| Pages (from-to) | 217-234 |
| Number of pages | 18 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 9 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jun 2000 |
Keywords
- Blocking
- Convergence acceleration
- Gibbs sampling
- Hierarchical model
- Metropolis-Hastings algorithm