Model Selection for K-Nearest Neighbors Regression Using VC Bounds

Vladimir S. Cherkassky, Yunqian Ma, Jun Tang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We discuss analytic model selection for the k-nearest neighbors regression method using VC generalization bounds. Whereas existing implementations of k-nn regression estimate the model complexity as n/k, where n is the number of samples, we propose a new model complexity estimate. The proposed new complexity index, used as the VC-dimension in VC bounds, yields a new analytic method for model selection. Empirical results for low-dimensional and high-dimensional data sets indicate that the proposed approach provides accurate model selection that is consistently better than that obtained with the previously used complexity measure. In fact, the prediction accuracy of the proposed analytic method is similar to that of the resampling (cross-validation) approach for optimal selection of k.
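The abstract names the ingredients of the analytic procedure (an empirical risk for each candidate k, a complexity index plugged into a VC bound as the VC-dimension, and minimization of the penalized risk over k) but does not give the proposed complexity index or the exact bound. The sketch below is illustrative only: it uses the conventional complexity estimate h = n/k that the abstract argues against, together with a Vapnik-type penalization factor of the form popularized by Cherkassky and Mulier. The function names (knn_predict, vc_penalty, select_k), the exact penalty formula, and the use of resubstitution error as the empirical risk are all assumptions, not the paper's method.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k):
    """k-NN regression: average the targets of the k nearest neighbors."""
    # Pairwise squared Euclidean distances, shape (n_query, n_train).
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    nearest = np.argsort(d2, axis=1)[:, :k]
    return y_train[nearest].mean(axis=1)

def vc_penalty(h, n):
    """Vapnik-type penalization factor with p = h/n (assumed form, after
    Cherkassky & Mulier); returns inf when the bound is vacuous."""
    p = h / n
    inner = 1.0 - np.sqrt(p - p * np.log(p) + np.log(n) / (2.0 * n))
    return np.inf if inner <= 0.0 else 1.0 / inner

def select_k(X, y, k_values):
    """Pick k by minimizing penalized empirical risk, using the conventional
    complexity index h = n/k (the estimate the abstract contrasts against)."""
    n = len(y)
    best_k, best_risk = None, np.inf
    for k in k_values:
        r_emp = np.mean((y - knn_predict(X, y, X, k)) ** 2)  # resubstitution MSE
        pen = vc_penalty(n / k, n)
        risk = r_emp * pen if np.isfinite(pen) else np.inf
        if risk < best_risk:
            best_k, best_risk = k, risk
    return best_k

# Toy usage: noisy sine data, choose k over a small grid.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(100, 1))
y = np.sin(2.0 * X[:, 0]) + 0.2 * rng.standard_normal(100)
print(select_k(X, y, range(1, 31)))
```

Swapping in the paper's proposed complexity index would only change the h passed to vc_penalty; the selection loop over k stays the same.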

Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Pages: 1143-1148
Number of pages: 6
Volume: 2
State: Published - Sep 24, 2003
Event: International Joint Conference on Neural Networks 2003 - Portland, OR, United States
Duration: Jul 20, 2003 - Jul 24, 2003

Other

Other: International Joint Conference on Neural Networks 2003
Country/Territory: United States
City: Portland, OR
Period: 7/20/03 - 7/24/03
