TY - JOUR
T1 - Identifiability and convergence issues for Markov chain Monte Carlo fitting of spatial models
AU - Eberly, Lynn E.
AU - Carlin, Bradley P.
PY - 2000/9/15
Y1 - 2000/9/15
N2 - The marked increase in popularity of Bayesian methods in statistical practice over the last decade owes much to the simultaneous development of Markov chain Monte Carlo (MCMC) methods for the evaluation of requisite posterior distributions. However, along with this increase in computing power has come the temptation to fit models larger than the data can readily support, meaning that often the propriety of the posterior distributions for certain parameters depends on the propriety of the associated prior distributions. An important example arises in spatial modelling, wherein separate random effects for capturing unstructured heterogeneity and spatial clustering are of substantive interest, even though only their sum is well identified by the data. Increasing the informative content of the associated prior distributions offers an obvious remedy, but one that hampers parameter interpretability and may also significantly slow the convergence of the MCMC algorithm. In this paper we investigate the relationship among identifiability, Bayesian learning and MCMC convergence rates for a common class of spatial models, in order to provide guidance for prior selection and algorithm tuning. We are able to elucidate the key issues with relatively simple examples, and also illustrate the varying impacts of covariates, outliers and algorithm starting values on the resulting algorithms and posterior distributions. Copyright © 2000 John Wiley & Sons, Ltd.
AB - The marked increase in popularity of Bayesian methods in statistical practice over the last decade owes much to the simultaneous development of Markov chain Monte Carlo (MCMC) methods for the evaluation of requisite posterior distributions. However, along with this increase in computing power has come the temptation to fit models larger than the data can readily support, meaning that often the propriety of the posterior distributions for certain parameters depends on the propriety of the associated prior distributions. An important example arises in spatial modelling, wherein separate random effects for capturing unstructured heterogeneity and spatial clustering are of substantive interest, even though only their sum is well identified by the data. Increasing the informative content of the associated prior distributions offers an obvious remedy, but one that hampers parameter interpretability and may also significantly slow the convergence of the MCMC algorithm. In this paper we investigate the relationship among identifiability, Bayesian learning and MCMC convergence rates for a common class of spatial models, in order to provide guidance for prior selection and algorithm tuning. We are able to elucidate the key issues with relatively simple examples, and also illustrate the varying impacts of covariates, outliers and algorithm starting values on the resulting algorithms and posterior distributions. Copyright © 2000 John Wiley & Sons, Ltd.
UR - http://www.scopus.com/inward/record.url?scp=0034666162&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0034666162&partnerID=8YFLogxK
U2 - 10.1002/1097-0258(20000915/30)19:17/18<2279::AID-SIM569>3.0.CO;2-R
DO - 10.1002/1097-0258(20000915/30)19:17/18<2279::AID-SIM569>3.0.CO;2-R
M3 - Article
C2 - 10960853
AN - SCOPUS:0034666162
SN - 0277-6715
VL - 19
SP - 2279
EP - 2294
JO - Statistics in Medicine
JF - Statistics in Medicine
IS - 17-18
ER -