Probabilistic joint feature selection for multi-task learning

Tao Xiong, Jinbo Bi, Bharat Rao, Vladimir S. Cherkassky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Scopus citations


We study the joint feature selection problem that arises when learning multiple related classification or regression tasks. By imposing an automatic relevance determination prior on the hypothesis class associated with each task and regularizing the variance of the hypothesis parameters across tasks, similar feature patterns across the different tasks are encouraged, and features relevant to all (or most) of the tasks are identified. Our analysis shows that the proposed probabilistic framework can be seen as a generalization of previous results on adaptive ridge regression to the multi-task learning setting. We give a detailed description of the proposed algorithms for simultaneous model construction and justify them in several respects. Our experimental results show that this approach outperforms a regularized multi-task learning approach and traditional methods in which individual tasks are solved independently, on both synthetic data and real-world data sets for lung cancer prognosis.
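The core idea, adaptive ridge regression generalized across tasks, can be illustrated with a short sketch. The code below is an assumption-laden illustration, not the paper's exact algorithm: each task solves a ridge regression whose per-feature penalty weights are shared by all tasks and re-estimated from the joint (row-wise) norms of the task coefficients, so features irrelevant to every task are driven toward zero together.

```python
import numpy as np

def multitask_adaptive_ridge(Xs, ys, lam=1.0, n_iter=50, eps=1e-6):
    """Illustrative multi-task adaptive ridge (hypothetical helper, not the
    paper's algorithm). Alternates between (a) per-task ridge solutions with a
    shared diagonal penalty and (b) updating that penalty from the joint
    coefficient norms, which encourages joint feature selection."""
    d = Xs[0].shape[1]
    c = np.ones(d)                       # shared per-feature penalty weights
    Ws = [np.zeros(d) for _ in Xs]
    for _ in range(n_iter):
        # (a) solve each task's ridge problem with the shared penalties
        for t, (X, y) in enumerate(zip(Xs, ys)):
            A = X.T @ X + lam * np.diag(c)
            Ws[t] = np.linalg.solve(A, X.T @ y)
        # (b) features with small joint norm across tasks get a large penalty
        W = np.stack(Ws)                 # shape: (tasks, features)
        joint_norms = np.linalg.norm(W, axis=0)
        c = 1.0 / (joint_norms + eps)
    return np.stack(Ws)
```

With two synthetic tasks that share the same two relevant features, the remaining (irrelevant) coefficients shrink to near zero in both tasks simultaneously, which is the joint-selection behavior the abstract describes.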

Original language: English (US)
Title of host publication: Proceedings of the 7th SIAM International Conference on Data Mining
Publisher: Society for Industrial and Applied Mathematics Publications
Number of pages: 11
ISBN (Print): 9780898716306
State: Published - 2007
Event: 7th SIAM International Conference on Data Mining - Minneapolis, MN, United States
Duration: Apr 26 2007 - Apr 28 2007

Publication series

Name: Proceedings of the 7th SIAM International Conference on Data Mining


Other: 7th SIAM International Conference on Data Mining
Country/Territory: United States
City: Minneapolis, MN


