An improved 1-norm SVM for simultaneous classification and variable selection

Research output: Contribution to journal › Conference article › peer-review

36 Scopus citations


We propose a novel extension of the 1-norm support vector machine (SVM) for simultaneous feature selection and classification. The new algorithm penalizes the empirical hinge loss by the adaptively weighted 1-norm penalty in which the weights are computed by the 2-norm SVM. Hence the new algorithm is called the hybrid SVM. Simulation and real data examples show that the hybrid SVM not only often improves upon the 1-norm SVM in terms of classification accuracy but also enjoys better feature selection performance.
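As a rough illustration of the two-stage idea described above, the sketch below first fits a 2-norm SVM, uses the inverse magnitudes of its coefficients as adaptive weights, and then solves a weighted 1-norm SVM by proximal subgradient steps with soft-thresholding. This is a minimal numpy sketch, not the authors' implementation; the solver choice (subgradient descent plus a soft-threshold prox step), the weight formula `1 / (|b| + eps)`, and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(b, t):
    # Elementwise soft-thresholding: proximal operator of the weighted 1-norm.
    return np.sign(b) * np.maximum(np.abs(b) - t, 0.0)

def fit_l2_svm(X, y, lam=0.1, lr=0.01, n_iter=2000):
    # Stage 1: 2-norm SVM via subgradient descent on
    # mean hinge loss + lam * ||b||_2^2 (no intercept, for simplicity).
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        margins = y * (X @ b)
        viol = margins < 1                      # margin-violating samples
        grad = -(X[viol].T @ y[viol]) / n + 2.0 * lam * b
        b -= lr * grad
    return b

def fit_hybrid_svm(X, y, lam=0.1, eps=1e-3, lr=0.01, n_iter=2000):
    # Stage 2: adaptively weighted 1-norm SVM ("hybrid SVM" idea).
    # Weights are the inverse magnitudes of the 2-norm SVM coefficients,
    # so features the 2-norm SVM deems weak are penalized more heavily.
    n, p = X.shape
    w = 1.0 / (np.abs(fit_l2_svm(X, y)) + eps)  # adaptive weights (assumed form)
    b = np.zeros(p)
    for _ in range(n_iter):
        margins = y * (X @ b)
        viol = margins < 1
        grad = -(X[viol].T @ y[viol]) / n       # hinge-loss subgradient
        b = soft_threshold(b - lr * grad, lr * lam * w)
    return b
```

On toy data with a few informative features and many pure-noise features, the soft-thresholding step typically drives the noise coefficients to exactly zero while retaining the informative ones, which mirrors the simultaneous classification and variable selection behavior the abstract claims.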

Original language: English (US)
Pages (from-to): 675-681
Number of pages: 7
Journal: Journal of Machine Learning Research
State: Published - 2007
Event: 11th International Conference on Artificial Intelligence and Statistics, AISTATS 2007 - San Juan, Puerto Rico
Duration: Mar 21 2007 - Mar 24 2007


