Minimax nonparametric classification - Part II: Model selection for adaptation

Yuhong Yang

Research output: Contribution to journal › Article › peer-review

12 Scopus citations

Abstract

We study nonparametric estimation of a conditional probability for classification based on a collection of finite-dimensional models. For flexibility, different types of models, linear or nonlinear, are allowed as long as each satisfies a dimensionality assumption. We show that with a suitable model selection criterion, the penalized maximum-likelihood estimator has risk bounded by an index of resolvability expressing a good tradeoff among approximation error, estimation error, and model complexity. The bound requires no assumption on the target conditional probability and can be used to demonstrate the adaptivity of estimators based on model selection. Examples are given with both splines and neural nets, and problems of high-dimensional estimation are considered. The resulting adaptive estimator is shown to behave optimally or near optimally over Sobolev classes (with unknown orders of interaction and smoothness) and over classes of functions with an integrable Fourier transform of the gradient. In terms of rates of convergence, the performance is the same as if one knew in advance which of these classes contains the true conditional probability. The corresponding classifier also converges optimally or nearly optimally, simultaneously over these classes.
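To make the tradeoff in the abstract concrete, a sketch of the standard form of an index-of-resolvability risk bound is given below. The notation ($f^*$, $\mathcal{F}_m$, $D_m$, $C_m$, $\kappa$) is assumed for illustration and is not taken from this page; see the paper itself for the precise statement.

```latex
% Given a list of finite-dimensional models {F_m}, a model dimension D_m,
% a descriptive complexity C_m (e.g., a codelength for the model index),
% and a loss d^2 (such as squared Hellinger distance), the index of
% resolvability of the true conditional probability f* at sample size n
% is typically defined as
\[
  \mathrm{Res}_n(f^*)
  = \min_{m} \Bigl(
      \underbrace{\inf_{f \in \mathcal{F}_m} d^2(f^*, f)}_{\text{approximation error}}
      + \underbrace{\frac{D_m}{n}}_{\text{estimation error}}
      + \underbrace{\frac{C_m}{n}}_{\text{model complexity}}
    \Bigr),
\]
% and the penalized maximum-likelihood estimator \hat{f} then satisfies a
% risk bound of the form
\[
  \mathbb{E}\, d^2\bigl(f^*, \hat{f}\bigr) \le \kappa \,\mathrm{Res}_n(f^*)
\]
% for a constant kappa not depending on f*.
```

A bound of this shape yields adaptivity automatically: whichever model in the list best balances the three terms determines the rate, without the statistician needing to know in advance which class contains $f^*$.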

Original language: English (US)
Pages (from-to): 2285-2292
Number of pages: 8
Journal: IEEE Transactions on Information Theory
Volume: 45
Issue number: 7
DOIs: yes
State: Published - 1999

