Modeling temporal gradients in regionally aggregated California asthma hospitalization data

Harrison Quick, Sudipto Banerjee, Brad Carlin

Research output: Contribution to journal › Article › peer-review



Advances in Geographic Information Systems (GIS) have led to an enormous recent burgeoning of spatial-temporal databases and associated statistical modeling. Here we depart from the rather rich literature in space-time modeling by considering the setting where space is discrete (e.g., data aggregated over regions) but time is continuous. Our major objective in this application is to carry out inference on gradients of a temporal process in our data set of monthly county-level asthma hospitalization rates in the state of California, while at the same time accounting for spatial similarities of the temporal process across neighboring counties. The use of continuous-time models here allows inference at a finer temporal resolution than that at which the data are sampled. Rather than use parametric forms to model time, we opt for a more flexible stochastic process embedded within a dynamic Markov random field framework. Through the matrix-valued covariance function we can ensure that the temporal process realizations are mean square differentiable, and may thus carry out inference on temporal gradients in a posterior predictive fashion. We use this approach to evaluate temporal gradients, where we are concerned with temporal changes in the residual and fitted rate curves after accounting for seasonality, spatiotemporal ozone levels, and several important spatially resolved sociodemographic covariates.
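The key idea in the abstract, that a process with a smooth enough covariance function is mean square differentiable, so its gradient is itself a Gaussian process whose cross-covariance with the data is the derivative of the kernel, can be illustrated with a minimal sketch. The snippet below is not the authors' spatial model; it assumes a simple one-dimensional squared-exponential covariance (whose realizations are mean square differentiable) and computes a plug-in kriging-style estimate of the temporal gradient at an unsampled point, mimicking inference at a finer resolution than the sampling grid. The function names and hyperparameter values are illustrative choices, not from the paper.

```python
import numpy as np

def sqexp(s, t, sigma2=1.0, ell=0.5):
    # Squared-exponential covariance k(s, t); smooth enough that
    # process realizations are mean square differentiable.
    return sigma2 * np.exp(-(s - t) ** 2 / (2 * ell ** 2))

def dsqexp_dstar(tstar, t, sigma2=1.0, ell=0.5):
    # d/d(tstar) k(tstar, t): cross-covariance between the gradient
    # process Y'(tstar) and an observation Y(t).
    return -sigma2 * (tstar - t) / ell ** 2 * np.exp(-(tstar - t) ** 2 / (2 * ell ** 2))

def posterior_gradient(tstar, t, y, sigma2=1.0, ell=0.5, tau2=1e-4):
    # Posterior-mean gradient E[Y'(tstar) | y] = k'_* (K + tau2 I)^{-1} y,
    # with a small nugget tau2 for numerical stability.
    K = sqexp(t[:, None], t[None, :], sigma2, ell) + tau2 * np.eye(len(t))
    kstar = dsqexp_dstar(tstar, t, sigma2, ell)
    return kstar @ np.linalg.solve(K, y)

# Toy curve sampled at 60 points; its true gradient is cos(t).
t = np.linspace(0, 2 * np.pi, 60)
y = np.sin(t)
grad_hat = posterior_gradient(np.pi / 4, t, y)  # estimate at an unsampled time
```

With a densely sampled smooth curve the estimate tracks the true derivative closely; in the paper's fully Bayesian setting the same derivative-of-the-kernel construction is applied within the posterior predictive distribution rather than as a plug-in estimate.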

Original language: English (US)
Pages (from-to): 154-176
Number of pages: 23
Journal: Annals of Applied Statistics
Issue number: 1
State: Published - Mar 2013


  • Gaussian process
  • Gradients
  • Markov chain Monte Carlo
  • Spatial process models
  • Spatially associated functional data


