Vapnik-Chervonenkis (VC) learning theory and its applications

V. Cherkassky, F. Mulier

Research output: Contribution to journal › Article › peer-review



Statistical learning theory (also known as Vapnik-Chervonenkis or VC theory) is a general mathematical framework for estimating dependencies from empirical data. Recent interest in VC theory has been motivated by the practical success of a constructive learning methodology, support vector machines (SVMs), which originated in VC theory. This special issue illustrates the growing importance of VC theory for the field of predictive learning from data.
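To give a concrete sense of the kind of result VC theory provides, the sketch below evaluates Vapnik's classical confidence term for classification: with probability at least 1 − η, the true risk is bounded by the empirical risk plus a term that depends on the sample size n and the VC dimension h. This is an illustrative sketch only; the function name and the choice of η = 0.05 are not from the article.

```python
import math

def vc_confidence(n, h, eta=0.05):
    """Confidence term in Vapnik's VC bound for classification:
    with probability >= 1 - eta,
        true risk <= empirical risk + vc_confidence(n, h, eta),
    where n is the sample size and h the VC dimension."""
    return math.sqrt((h * (math.log(2 * n / h) + 1) - math.log(eta / 4)) / n)

# The bound tightens as n grows relative to the VC dimension h,
# which is the sense in which VC theory guarantees predictive learning.
for n in (100, 1000, 10000):
    print(n, round(vc_confidence(n, h=10), 3))
```

As the loop shows, the confidence term shrinks roughly like sqrt(h log(n) / n), so models of fixed capacity generalize better as more data become available.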

Original language: English (US)
Pages (from-to): 985-987
Number of pages: 3
Journal: IEEE Transactions on Neural Networks
Issue number: 5
State: Published - 1999


