Structured Markov Chain Monte Carlo

Daniel J. Sargent, James S. Hodges, Bradley P. Carlin

Research output: Contribution to journal › Article › peer-review
Abstract

This article introduces a general method for Bayesian computing in richly parameterized models, structured Markov chain Monte Carlo (SMCMC), that is based on a blocked hybrid of the Gibbs sampling and Metropolis-Hastings algorithms. SMCMC speeds algorithm convergence by using the structure that is present in the problem to suggest an appropriate Metropolis-Hastings candidate distribution. Although the approach is easiest to describe for hierarchical normal linear models, we show that its extension to both nonnormal and nonlinear cases is straightforward. After describing the method in detail, we compare its performance (in terms of run time and autocorrelation in the samples) to other existing methods, including the single-site updating Gibbs sampler available in the popular BUGS software package. Our results suggest significant improvements in convergence for many problems using SMCMC, as well as broad applicability of the method, including previously intractable hierarchical nonlinear model settings.
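
A minimal illustrative sketch of the blocked-update idea described in the abstract is given below; it is not the authors' code. For a simple one-way normal hierarchical model, it stacks the observations ("data cases") and the between-group shrinkage relations ("constraint cases") into a single linear model and draws all location parameters in one multivariate normal block, which is one way to obtain the kind of structured candidate distribution the abstract refers to. The function name `smcmc_oneway`, the inverse-gamma hyperparameters, and the simulated data are ours for illustration; the sketch assumes only NumPy.

```python
# Sketch of a blocked ("structured") update for a one-way normal hierarchical model:
#   y_ij | theta_i, sigma2 ~ N(theta_i, sigma2)
#   theta_i | mu, tau2     ~ N(mu, tau2)
# with a flat prior on mu and inverse-gamma priors on sigma2 and tau2.
# All names here are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def smcmc_oneway(y, groups, n_iter=2000, a=0.01, b=0.01):
    """y: 1-D array of observations; groups: integer group label (0..I-1) per observation."""
    y = np.asarray(y, dtype=float)
    groups = np.asarray(groups)
    I = groups.max() + 1          # number of groups
    N = y.size

    # "Data cases":       y = X1 @ beta + e,  e ~ N(0, sigma2 * I_N)
    # "Constraint cases": 0 = X2 @ beta + d,  d ~ N(0, tau2 * I_I)
    # where beta = (theta_1, ..., theta_I, mu).
    X1 = np.zeros((N, I + 1)); X1[np.arange(N), groups] = 1.0
    X2 = np.hstack([np.eye(I), -np.ones((I, 1))])
    X  = np.vstack([X1, X2])
    Y  = np.concatenate([y, np.zeros(I)])

    sigma2, tau2 = y.var(), y.var()
    draws = np.empty((n_iter, I + 3))

    for t in range(n_iter):
        # Blocked draw of all location parameters at once, using the stacked
        # linear-model structure; Gamma is the diagonal error covariance.
        gamma_inv = np.concatenate([np.full(N, 1.0 / sigma2),
                                    np.full(I, 1.0 / tau2)])
        prec = X.T @ (gamma_inv[:, None] * X)          # X' Gamma^{-1} X
        cov = np.linalg.inv(prec)
        mean = cov @ (X.T @ (gamma_inv * Y))
        # In this normal-linear case the draw is the exact full conditional and is
        # accepted with probability 1; in nonnormal or nonlinear models the same
        # draw would instead serve as a Metropolis-Hastings candidate.
        beta = rng.multivariate_normal(mean, cov)
        theta, mu = beta[:I], beta[I]

        # Variance updates from their inverse-gamma full conditionals.
        resid = y - theta[groups]
        sigma2 = 1.0 / rng.gamma(a + N / 2, 1.0 / (b + 0.5 * resid @ resid))
        dev = theta - mu
        tau2 = 1.0 / rng.gamma(a + I / 2, 1.0 / (b + 0.5 * dev @ dev))

        draws[t] = np.concatenate([beta, [sigma2, tau2]])
    return draws

# Example with simulated data: 5 groups of 20 observations each.
true_theta = rng.normal(0.0, 2.0, size=5)
g = np.repeat(np.arange(5), 20)
y = rng.normal(true_theta[g], 1.0)
out = smcmc_oneway(y, g)
print(out[1000:].mean(axis=0))   # posterior means of (theta_1..theta_5, mu, sigma2, tau2)
```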

Original language: English (US)
Pages (from-to): 217-234
Number of pages: 18
Journal: Journal of Computational and Graphical Statistics
Volume: 9
Issue number: 2
DOIs
State: Published - Jun 2000

Keywords

  • Blocking
  • Convergence acceleration
  • Gibbs sampling
  • Hierarchical model
  • Metropolis-Hastings algorithm
