Abstract
We study the convergence properties of a collapsed Gibbs sampler for Bayesian vector autoregressions with predictors, or exogenous variables. The Markov chain generated by our algorithm is shown to be geometrically ergodic regardless of whether the number of observations in the underlying vector autoregression is small or large in comparison to its order and dimension. In a convergence complexity analysis, we also give conditions under which the geometric ergodicity is asymptotically stable as the number of observations tends to infinity. Specifically, the geometric convergence rate is shown to be bounded away from unity asymptotically, either almost surely or with probability tending to one, depending on what is assumed about the data generating process. This result is one of the first of its kind for practically relevant Markov chain Monte Carlo algorithms. Our convergence results hold under close to arbitrary model misspecification.
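For readers unfamiliar with the model class, the "vector autoregression with predictors" in the abstract refers to a VARX-type model. A minimal sketch of one common formulation is given below; the notation ($\nu$, $A_i$, $B$, $\Sigma$) is chosen for illustration and is not necessarily the paper's own.

```latex
% Generic VARX(p) model (illustrative notation, not taken from the paper):
% y_t \in R^k is the response vector, x_t \in R^m collects the exogenous predictors.
\[
  y_t = \nu + A_1 y_{t-1} + \cdots + A_p y_{t-p} + B x_t + \varepsilon_t,
  \qquad \varepsilon_t \sim \mathrm{N}_k(0, \Sigma), \quad t = 1, \dots, n.
\]
```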
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 691-721 |
| Number of pages | 31 |
| Journal | Electronic Journal of Statistics |
| Volume | 15 |
| Issue number | 1 |
| State | Published - Jan 1 2021 |
Bibliographical note
Funding Information: Substantial parts of the work were done while the author was at the University of Minnesota and Vienna University of Technology (TU Wien), and were partially supported by the FWF (Austrian Science Fund, https://www.fwf.ac.at/en/) [P30690-N35].
Publisher Copyright:
© 2021, Institute of Mathematical Statistics. All rights reserved.
Keywords
- Bayesian vector autoregression
- Convergence complexity analysis
- Geometric ergodicity
- Gibbs sampler
- Markov chain Monte Carlo