An improved 1-norm SVM for simultaneous classification and variable selection

Abstract
We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss with an adaptively weighted 1-norm penalty, in which the weights are computed from a 2-norm SVM fit; hence the new algorithm is called the hybrid SVM. Simulation and real-data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in classification accuracy but also enjoys better feature selection performance.
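The two-stage idea described in the abstract can be sketched in NumPy: first fit a 2-norm (ridge-penalized) SVM, then use the reciprocals of its coefficient magnitudes as weights in a weighted 1-norm SVM. This is only an illustrative sketch under stated assumptions — the paper formulates the 1-norm SVM as a linear program, whereas here a simple subgradient step for the hinge loss plus a proximal (soft-thresholding) step stands in for the exact solver; all penalty levels, step sizes, and the `1e-8` stabilizer are illustrative choices, not values from the paper.

```python
import numpy as np

def hinge_subgrad(beta, X, y):
    # Subgradient of the mean hinge loss (1/n) * sum max(0, 1 - y_i x_i.beta)
    margins = y * (X @ beta)
    active = margins < 1
    return -(X[active] * y[active, None]).sum(axis=0) / len(y)

def fit_l2_svm(X, y, lam=0.1, lr=0.1, iters=500):
    # Stage 1: 2-norm SVM via subgradient descent on hinge loss + lam*||beta||^2
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        beta -= lr * (hinge_subgrad(beta, X, y) + 2.0 * lam * beta)
    return beta

def fit_weighted_l1_svm(X, y, weights, lam=0.05, lr=0.1, iters=1000):
    # Stage 2: adaptively weighted 1-norm SVM via proximal subgradient descent;
    # the per-feature soft-threshold lr*lam*weights[j] produces exact zeros.
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        beta -= lr * hinge_subgrad(beta, X, y)
        thr = lr * lam * weights
        beta = np.sign(beta) * np.maximum(np.abs(beta) - thr, 0.0)
    return beta

def hybrid_svm(X, y, lam2=0.1, lam1=0.05):
    # Weights: large where the 2-norm SVM coefficient is small, so noise
    # features are penalized heavily and driven to zero.
    beta2 = fit_l2_svm(X, y, lam2)
    weights = 1.0 / (np.abs(beta2) + 1e-8)
    return fit_weighted_l1_svm(X, y, weights, lam1)

# Synthetic demo: only the first two of six features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.sign(X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=200))
beta = hybrid_svm(X, y)
```

In this sketch the heavy weights on noise features make their soft-threshold exceed the gradient push, so their coefficients are zeroed, while the signal features keep coefficients of the correct sign.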
| Original language | English (US) |
|---|---|
| Pages (from-to) | 675-681 |
| Number of pages | 7 |
| Journal | Journal of Machine Learning Research |
| Volume | 2 |
| State | Published - 2007 |
| Event | 11th International Conference on Artificial Intelligence and Statistics (AISTATS 2007), San Juan, Puerto Rico, Mar 21–24, 2007 |