Generalized SMO algorithm for SVM-based multitask learning

Research output: Contribution to journal › Article › peer-review

42 Scopus citations

Abstract

Exploiting additional information to improve traditional inductive learning is an active research area in machine learning. In many supervised-learning applications, training data can be naturally separated into several groups, and incorporating this group information into learning may improve generalization. Recently, Vapnik proposed a general approach to formalizing such problems, known as 'learning with structured data', along with its support vector machine (SVM) based optimization formulation called SVM+. Liang and Cherkassky showed the connection between SVM+ and multitask learning (MTL) approaches in machine learning, and proposed an SVM-based formulation for MTL classification called SVM+MTL. Training the SVM+MTL classifier requires solving a large quadratic programming problem that scales as O(n³) with sample size n, so there is a need for computationally efficient algorithms for implementing SVM+MTL. This brief generalizes Platt's sequential minimal optimization (SMO) algorithm to the SVM+MTL setting. Empirical results show that, for typical SVM+MTL problems, the proposed generalized SMO achieves a speed-up of more than 100 times over general-purpose optimization routines.
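For context, the update scheme being generalized is Platt's SMO for the standard soft-margin SVM dual: at each step a pair of dual variables is optimized analytically while all others are held fixed, so no general-purpose QP solver is needed. Below is a minimal sketch of the simplified SMO variant for an ordinary SVM with a linear kernel; it is illustrative only and is not the paper's SVM+MTL algorithm, whose dual has additional variables and constraints. All function names here are our own.

```python
import numpy as np

def smo_svm(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO for the standard soft-margin SVM dual (linear kernel).

    Illustrative sketch only: the paper generalizes this pairwise
    analytic-update idea to the SVM+MTL dual problem.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                      # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Prediction error on example i
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # KKT violation check for alpha_i
            if (y[i] * Ei < -tol and alpha[i] < C) or \
               (y[i] * Ei > tol and alpha[i] > 0):
                # Pick a second index j != i at random
                j = int(rng.integers(n - 1))
                if j >= i:
                    j += 1
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box bounds that keep 0 <= alpha <= C and sum(alpha*y) fixed
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative
                if eta >= 0:
                    continue
                # Analytic update of alpha_j, clipped to [L, H]
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-6:
                    continue
                # alpha_i moves to preserve the equality constraint
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Recompute the bias term
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # primal weight vector (linear kernel)
    return w, b, alpha
```

The pairwise update preserves the equality constraint sum(alpha_i * y_i) = 0 by construction, which is what makes the two-variable subproblem solvable in closed form; the SVM+MTL dual in the paper admits an analogous closed-form working-set update.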

Original language: English (US)
Article number: 6183517
Pages (from-to): 997-1003
Number of pages: 7
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 23
Issue number: 6
State: Published - Dec 1 2012

Keywords

  • Classification
  • SVM+
  • learning with structured data
  • multitask learning
  • quadratic optimization
  • sequential minimal optimization
  • support vector machine (SVM)

