Fast and Exact Leave-One-Out Analysis of Large-Margin Classifiers

Boxiang Wang, Hui Zou

Research output: Contribution to journal › Article › peer-review



Motivated by the Golub–Heath–Wahba formula for ridge regression, we first present a new leave-one-out lemma for the kernel support vector machine (SVM) and related large-margin classifiers. We then use the lemma to design a novel and efficient algorithm, named “magicsvm,” for training the kernel SVM and related large-margin classifiers while computing the exact leave-one-out cross-validation error. With “magicsvm,” the computational cost of the leave-one-out analysis is of the same order as that of fitting a single SVM on the training data. Extensive simulations and benchmark examples show that “magicsvm” is much faster than state-of-the-art SVM solvers. The same idea is also used to speed up the V-fold cross-validation of kernel classifiers.
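The motivating Golub–Heath–Wahba identity states that for ridge regression the exact leave-one-out residual can be obtained from a single fit: with hat matrix H = X(XᵀX + λI)⁻¹Xᵀ, the held-out residual for observation i equals the full-fit residual divided by 1 − Hᵢᵢ. The paper's lemma extends this style of shortcut to kernel large-margin classifiers; the classical ridge case it builds on can be sketched as follows (a minimal NumPy illustration with synthetic data, not the authors' algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 5, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Ridge hat matrix H = X (X'X + lam*I)^{-1} X'
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
resid = y - H @ y

# Golub-Heath-Wahba shortcut: LOO residual_i = residual_i / (1 - H_ii)
loo_shortcut = resid / (1.0 - np.diag(H))

# Brute-force check: refit the ridge model n times, leaving one point out
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    beta = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    loo_brute[i] = y[i] - X[i] @ beta

# The shortcut reproduces the n refits exactly, at the cost of one fit
assert np.allclose(loo_shortcut, loo_brute)
```

The brute-force loop costs n separate fits, while the shortcut needs only the single full fit; the abstract's claim is the analogous cost reduction for kernel SVMs.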

Original language: English (US)
Pages (from-to): 291-298
Number of pages: 8
Issue number: 3
State: Published - Aug 17 2021

Bibliographical note

Funding Information:
Zou’s work is supported in part by NSF (grant nos. 1915-842 and 2015-120). We thank the editor, the AE, and two referees for their helpful comments and suggestions.

Publisher Copyright:
© 2021 American Statistical Association and the American Society for Quality.


Keywords:
  • Cross-validation
  • Kernel learning
  • Leave-one-out analysis
  • Support vector machines


