Abstract
In binary classification, margin-based techniques usually deliver high performance. As a result, a multicategory problem is often treated as a sequence of binary classifications. In the absence of a dominating class, this treatment may be suboptimal and can yield poor performance, as is the case for support vector machines (SVMs). We propose a novel multicategory generalization of ψ-learning that treats all classes simultaneously. The new generalization eliminates this potential problem while retaining the desirable properties of its binary counterpart. We develop a statistical learning theory for the proposed methodology and obtain fast convergence rates for both linear and nonlinear learning examples. We demonstrate the operational characteristics of the method through a simulation. Our results indicate that the proposed methodology can deliver accurate class prediction and is more robust against extreme observations than its SVM counterpart.
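To make the idea of treating all classes simultaneously concrete, the sketch below fits k linear decision functions jointly by minimizing a penalized empirical risk under a nonconvex ψ-type loss applied to a generalized margin. The specific ψ function, the margin construction f_y(x) − max_{j≠y} f_j(x), and the use of a generic local optimizer are illustrative assumptions, not the paper's exact formulation or algorithm.

```python
# A minimal sketch (not the authors' implementation) of simultaneous
# multicategory margin-based learning with a nonconvex psi-type loss.
# Assumptions (illustrative only):
#   - k linear decision functions f_j(x) = w_j . x + b_j, centered so they
#     sum to zero across classes,
#   - generalized margin u = f_y(x) - max_{j != y} f_j(x),
#   - a piecewise-linear psi loss, one common choice in the psi-learning
#     literature: 2 for u < 0, decreasing on [0, tau), 0 for u >= tau.
import numpy as np
from scipy.optimize import minimize


def psi_loss(u, tau=1.0):
    """Nonconvex psi loss approximating twice the 0-1 loss."""
    return np.where(u < 0, 2.0, np.where(u < tau, 2.0 * (1.0 - u / tau), 0.0))


def objective(theta, X, y, k, lam):
    """Penalized empirical psi-risk of k jointly trained linear classifiers."""
    n, d = X.shape
    W = theta.reshape(k, d + 1)             # row j holds [w_j, b_j]
    W = W - W.mean(axis=0, keepdims=True)   # sum-to-zero across classes
    scores = X @ W[:, :d].T + W[:, d]       # n x k matrix of f_j(x_i)
    f_true = scores[np.arange(n), y]
    rivals = scores.copy()
    rivals[np.arange(n), y] = -np.inf       # exclude the true class
    margins = f_true - rivals.max(axis=1)   # generalized functional margin
    return psi_loss(margins).mean() + lam * np.sum(W[:, :d] ** 2)


# Toy usage: three Gaussian classes in the plane, no dominating class.
rng = np.random.default_rng(0)
k, n_per = 3, 50
centers = np.array([[0.0, 2.0], [-2.0, -1.0], [2.0, -1.0]])
X = np.vstack([rng.normal(c, 0.7, size=(n_per, 2)) for c in centers])
y = np.repeat(np.arange(k), n_per)

# The psi risk is nonconvex, so a generic local optimizer run from several
# random starts stands in here for the specialized algorithms used in the
# psi-learning literature.
dim = k * (X.shape[1] + 1)
best = min(
    (minimize(objective, rng.normal(scale=0.1, size=dim),
              args=(X, y, k, 1e-3), method="Nelder-Mead",
              options={"maxiter": 5000, "xatol": 1e-4, "fatol": 1e-4})
     for _ in range(5)),
    key=lambda r: r.fun,
)
W = best.x.reshape(k, X.shape[1] + 1)
W = W - W.mean(axis=0, keepdims=True)
pred = np.argmax(X @ W[:, :-1].T + W[:, -1], axis=1)
print("training accuracy:", (pred == y).mean())
```

The joint objective couples all k decision functions through the shared margin term, in contrast with one-versus-rest schemes that solve k separate binary problems.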
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 500-509 |
| Number of pages | 10 |
| Journal | Journal of the American Statistical Association |
| Volume | 101 |
| Issue number | 474 |
| State | Published - Jun 2006 |
Bibliographical note
Funding Information: Yufeng Liu is Assistant Professor, Department of Statistics and Operations Research, Carolina Center for Genome Sciences, University of North Carolina, Chapel Hill, NC 27599 (E-mail: [email protected]). Xiaotong Shen is Professor, School of Statistics, University of Minnesota, Minneapolis, MN 55455 (E-mail: [email protected]). This research was supported in part by National Science Foundation grants IIS-0328802 and DMS-00-72635. The authors thank the editor, the associate editor, two anonymous referees, and Professor George Fisherman for their helpful comments and suggestions.
Keywords
- Generalization error
- Nonconvex minimization
- Supervised learning
- Support vectors