Abstract
Statistical learning theory (also known as Vapnik-Chervonenkis or VC theory) is a general mathematical framework for estimating dependencies from empirical data. Recent interest in VC theory has been motivated by the practical applications of a new constructive learning methodology, support vector machines (SVMs), which originated from VC theory. This special issue illustrates the growing importance of VC theory for the field of predictive learning from data.
Original language | English (US) |
---|---|
Pages (from-to) | 985-987 |
Number of pages | 3 |
Journal | IEEE Transactions on Neural Networks |
Volume | 10 |
Issue number | 5 |
State | Published - 1999 |