### Abstract

In this paper we propose a new support vector machine (SVM), the F∞-norm SVM, to perform automatic factor selection in classification. The F∞-norm SVM methodology is motivated by the feature selection problem in cases where the input features are generated by factors, and the model is best interpreted in terms of significant factors. This type of problem arises naturally when a set of dummy variables is used to represent a categorical factor and/or a set of basis functions of a continuous variable is included in the predictor set. In problems without such obvious group information, we propose to first create groups among features by clustering, and then apply the F∞-norm SVM. We show that the F∞-norm SVM is equivalent to a linear programming problem and can be efficiently solved using standard techniques. Analysis on simulated and real data shows that the F∞-norm SVM enjoys competitive performance when compared with the 1-norm and 2-norm SVMs.
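The stated equivalence to a linear program can be made concrete. Below is a minimal sketch (not the authors' code) of one standard LP formulation: hinge loss plus a penalty λ·Σ_g max_{j∈g}|β_j|, with each group's sup-norm represented by an auxiliary variable η_g and solved via `scipy.optimize.linprog`. The function name `finf_svm`, the toy data, and the value of λ are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def finf_svm(X, y, groups, lam=1.0):
    """Fit an F-infinity-norm SVM by linear programming (illustrative sketch).

    X: (n, p) features; y: labels in {-1, +1};
    groups: length-p array of group ids; lam: penalty weight.
    LP variables are ordered [beta0, beta (p), eta (G), xi (n)].
    """
    n, p = X.shape
    gids = np.unique(groups)
    G = len(gids)
    nv = 1 + p + G + n

    # Objective: sum of slacks xi_i + lam * sum of group sup-norms eta_g
    c = np.zeros(nv)
    c[1 + p : 1 + p + G] = lam
    c[1 + p + G :] = 1.0

    A, b = [], []
    # Hinge constraints: 1 - y_i (beta0 + x_i . beta) <= xi_i
    for i in range(n):
        row = np.zeros(nv)
        row[0] = -y[i]
        row[1 : 1 + p] = -y[i] * X[i]
        row[1 + p + G + i] = -1.0
        A.append(row)
        b.append(-1.0)
    # Sup-norm constraints: -eta_g <= beta_j <= eta_g for each j in group g
    for j in range(p):
        g = int(np.where(gids == groups[j])[0][0])
        for sgn in (+1.0, -1.0):
            row = np.zeros(nv)
            row[1 + j] = sgn
            row[1 + p + g] = -1.0
            A.append(row)
            b.append(0.0)

    # beta0 and beta are free; eta and xi are nonnegative
    bounds = [(None, None)] * (1 + p) + [(0, None)] * (G + n)
    res = linprog(c, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=bounds, method="highs")
    return res.x[0], res.x[1 : 1 + p]

# Toy example: two groups of two features; only the first group is informative
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=60))
beta0, beta = finf_svm(X, y, groups=np.array([0, 0, 1, 1]), lam=1.0)
```

Because the penalty charges each group only for its largest coefficient, an uninformative group tends to be shrunk toward zero as a unit, which is the factor-selection behavior the abstract describes.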

| Original language | English (US) |
|---|---|
| Pages (from-to) | 379-398 |
| Number of pages | 20 |
| Journal | Statistica Sinica |
| Volume | 18 |
| Issue number | 1 |
| State | Published - Jan 1 2008 |

### Keywords

- F∞ penalty
- Factor selection
- Feature selection
- L1 penalty
- Linear programming
- Support vector machine


## Cite this

The F∞-norm support vector machine. *Statistica Sinica*, *18*(1), 379-398.