Distance weighted discrimination (DWD) is an appealing large margin classifier that has been shown to enjoy good theoretical properties and empirical success. The original DWD only handles binary classification with a linear classification boundary. Multiclass classification problems arise naturally in many fields, such as speech recognition, satellite imagery classification, and self-driving vehicles, to name a few. For such complex classification problems, it is desirable to have a flexible multicategory kernel extension of the binary DWD when the optimal decision boundary is highly nonlinear. To this end, we propose a new multicategory kernel DWD that is defined as a margin-vector optimization problem in a reproducing kernel Hilbert space. This formulation is shown to enjoy Fisher consistency. We develop an accelerated projected gradient descent algorithm to fit the multicategory kernel DWD. Simulations and benchmark data applications are used to demonstrate the highly competitive performance of our method compared with some popular state-of-the-art multiclass classifiers.
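The abstract mentions fitting the model by accelerated projected gradient descent. As a hedged illustration of that general technique (not the paper's actual DWD objective or solver), the sketch below applies Nesterov/FISTA-style acceleration to a toy box-constrained least-squares problem; the problem, step size, and iteration count are all assumptions for the example.

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n (a stand-in for the
    # feasible set; the paper's constraint set would differ).
    return np.clip(x, lo, hi)

def accelerated_pgd(grad, project, x0, step, n_iter=500):
    # Nesterov-accelerated projected gradient descent (FISTA-style momentum).
    x = x_prev = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x + ((t - 1) / t_next) * (x - x_prev)   # extrapolation step
        x_prev, x = x, project(y - step * grad(y))  # gradient step + projection
        t = t_next
    return x

# Toy problem (illustrative only): min ||Ax - b||^2  s.t.  0 <= x <= 1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: 2 * A.T @ (A @ x - b)
step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)  # 1/L, L = Lipschitz constant of grad
x_hat = accelerated_pgd(grad, project_box, np.zeros(5), step)
```

The momentum sequence `t` gives the well-known O(1/k^2) convergence rate for smooth convex objectives, versus O(1/k) for plain projected gradient descent.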
Funding Information:
Zou’s research was partially supported by National Science Foundation grant DMS-1505111.
- Distance weighted discrimination
- Fisher consistency
- Multicategory classification
- Nesterov’s acceleration
- Projected gradient descent
- Reproducing kernel Hilbert space