We address three questions related to public reports of diabetes quality. First, does clinic quality evolve over time? Second, does the quality of reporting clinics converge to a common standard? Third, how persistent are provider quality rankings across time? Since current methods of public reporting rely on historical data, measures of clinic quality are most informative if relative clinic performance is persistent across time. We use data from the Minnesota Community Measurement spanning 2007–2012. We employ seemingly unrelated regression to measure quality improvement conditional on cohort effects and changes in quality metrics. Basic autoregressive models are used to measure quality persistence. There were striking differences in initial quality across cohorts of clinics, and early-reporting cohorts maintained higher quality in all years. This suggests that consumers can infer, on average, that non-reporting clinics have poorer quality than reporting clinics. Average quality, however, improves slowly in all cohorts, and quality dispersion declines over time both within and across cohorts. Relative clinic quality is highly persistent year to year, suggesting that publicly reported measures can inform consumers in their choice of clinics, even though they represent measured quality for a previous time period. Finally, changes in the definitions of measures can make it difficult to draw appropriate inferences from longitudinal public reports data.
Original language: English (US)
Number of pages: 12
Journal: International Journal of Health Economics and Management
State: Published - Mar 1 2015
Bibliographical note (Funding Information):
We gratefully acknowledge the Robert Wood Johnson Foundation for funding this research through the Aligning Forces for Quality Evaluation Project. We also thank Minnesota Community Measurement for the use of their data and for comments on our research.
© 2015, Springer Science+Business Media New York.
Keywords:
- Longitudinal analysis
- Public reporting
- Quality measurement