Bridging AIC and BIC: A New Criterion for Autoregression

Jie Ding, Vahid Tarokh, Yuhong Yang

Research output: Contribution to journal › Article


Abstract

To address order selection for an autoregressive model fitted to time series data, we propose a new information criterion. It combines the benefits of two well-known model selection techniques: the Akaike information criterion and the Bayesian information criterion. When the data are generated from a finite-order autoregression, the Bayesian information criterion is known to be consistent, and so is the new criterion. When the true order is infinite or suitably high relative to the sample size, the Akaike information criterion is known to be efficient, in the sense that its predictive performance is asymptotically equivalent to the best offered by the candidate models; in this case, the new criterion behaves in a similar manner. Unlike the two classical criteria, the proposed criterion adaptively achieves either consistency or efficiency depending on the underlying true model. In practice, when the observed time series comes without any prior information about the model specification, the proposed order selection criterion is more flexible and reliable than the classical approaches. Numerical results are presented, demonstrating the adaptivity of the proposed technique when applied to various data sets.
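As background for the two classical criteria the paper bridges, the following minimal sketch fits autoregressions of increasing order by least squares and selects the order minimizing AIC and BIC. This illustrates only the classical criteria, not the paper's bridge criterion (whose penalty is not given in the abstract); the simulated AR(2) process and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an illustrative AR(2) process: x_t = 0.75 x_{t-1} - 0.5 x_{t-2} + e_t
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + rng.standard_normal()

def ar_order_select(x, max_order):
    """Select an AR order 1..max_order by minimizing AIC and BIC.

    All orders are fit on the same effective sample (observations after
    the first max_order points) so the criteria are comparable.
    """
    n_eff = len(x) - max_order
    y = x[max_order:]
    aic, bic = [], []
    for k in range(1, max_order + 1):
        # Design matrix whose j-th column holds lag-j values x_{t-j}
        X = np.column_stack(
            [x[max_order - j : len(x) - j] for j in range(1, k + 1)]
        )
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ coef) ** 2)  # residual variance of AR(k)
        aic.append(n_eff * np.log(sigma2) + 2 * k)           # AIC penalty: 2k
        bic.append(n_eff * np.log(sigma2) + k * np.log(n_eff))  # BIC: k log n
    return 1 + int(np.argmin(aic)), 1 + int(np.argmin(bic))

aic_order, bic_order = ar_order_select(x, max_order=10)
```

Because BIC's penalty grows with the sample size while AIC's is fixed at 2 per parameter, BIC never selects a larger order than AIC on the same fits; the bridge criterion described in the abstract interpolates between these two behaviors adaptively.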

Original language: English (US)
Pages (from-to): 4024-4043
Number of pages: 20
Journal: IEEE Transactions on Information Theory
Volume: 64
Issue number: 6
DOIs
State: Published - Jun 2018

Keywords

  • Adaptivity
  • Akaike information criterion
  • Asymptotic efficiency
  • Bayesian information criterion
  • Bridge criterion
  • Consistency
  • Information criterion
  • Model selection
  • Parametricness index
