Motivated by the Golub–Heath–Wahba formula for ridge regression, we first present a new leave-one-out lemma for the kernel support vector machine (SVM) and related large-margin classifiers. We then use the lemma to design a novel and efficient algorithm, named “magicsvm,” for training the kernel SVM and related large-margin classifiers and computing the exact leave-one-out cross-validation error. With “magicsvm,” the computational cost of the leave-one-out analysis is of the same order as that of fitting a single SVM on the training data. We show through extensive simulations and benchmark examples that “magicsvm” is much faster than state-of-the-art SVM solvers. The same idea is also used to boost the computational speed of V-fold cross-validation for kernel classifiers.
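To illustrate the motivation, here is a minimal sketch of the Golub–Heath–Wahba identity for ridge regression that the abstract cites: the leave-one-out residual for observation i equals (y_i − ŷ_i)/(1 − H_ii), where H is the ridge hat matrix, so all n leave-one-out errors follow from a single fit. The data, variable names, and penalty value below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Illustrative synthetic data (assumed, not from the paper)
rng = np.random.default_rng(0)
n, p, lam = 30, 5, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Single ridge fit and its hat matrix H = X (X'X + lam I)^{-1} X'
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
yhat = H @ y

# Golub–Heath–Wahba shortcut: LOO residual_i = (y_i - yhat_i) / (1 - H_ii)
loo_shortcut = (y - yhat) / (1 - np.diag(H))

# Brute-force check: refit n times, holding out one point each time
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    beta_i = np.linalg.solve(
        X[mask].T @ X[mask] + lam * np.eye(p), X[mask].T @ y[mask]
    )
    loo_brute[i] = y[i] - X[i] @ beta_i

print(np.allclose(loo_shortcut, loo_brute))  # True
```

The paper's leave-one-out lemma extends this kind of single-fit shortcut to kernel SVMs and related large-margin classifiers, where no such closed-form identity was previously available.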
Bibliographical note
Funding Information:
Zou’s work is supported in part by NSF (grant nos. 1915-842 and 2015-120). We thank the editor, the AE, and two referees for their helpful comments and suggestions.
© 2021 American Statistical Association and the American Society for Quality.
- Cross validation
- Kernel learning
- Leave-one-out analysis
- Support vector machines