Predictive learning with sparse heterogeneous data

Vladimir S Cherkassky, Feng Cai, Lichen Liang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation


Many applications of machine learning involve sparse and heterogeneous data. For example, estimating predictive (diagnostic) models from patients' data in clinical studies requires effective integration of genetic, clinical and demographic data. Typically, all heterogeneous inputs are suitably encoded and mapped onto a single feature vector, which is then used for estimating (training) a predictive model. This approach, known as standard inductive learning, is used in most application studies. More recently, several new learning methodologies have emerged. In particular, when training data can be naturally separated into several groups (i.e., is structured), learning (estimation) for each group can be viewed as a separate task, leading to the Multi-Task Learning framework. Similarly, a setting where training data is structured but the objective is to estimate a single predictive model (for all groups) leads to Learning with Structured Data and the SVM+ methodology recently proposed by Vapnik. This paper demonstrates the advantages and limitations of these new approaches for modeling heterogeneous data (relative to standard inductive SVM) via empirical comparisons on several publicly available medical data sets.
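To make the "standard inductive learning" setup concrete, the sketch below shows how heterogeneous patient inputs (genetic, clinical, demographic) might each be encoded and concatenated into a single flat feature vector, as the abstract describes. The field names, category sets, and scaling choices are hypothetical illustrations, not taken from the paper's data sets.

```python
def one_hot(value, categories):
    """Encode a categorical value as a one-hot list of floats."""
    return [1.0 if value == c else 0.0 for c in categories]

def encode_patient(patient):
    """Map one patient record onto a single flat feature vector
    by concatenating encodings of its heterogeneous fields."""
    features = []
    features.extend(patient["genetic_markers"])          # already numeric
    features.extend(one_hot(patient["diagnosis_group"],  # categorical -> one-hot
                            ["A", "B", "C"]))
    features.append(patient["age"] / 100.0)              # scaled numeric
    features.extend(one_hot(patient["sex"], ["F", "M"]))
    return features

# Hypothetical patient record
patient = {
    "genetic_markers": [0.2, 1.0, 0.0],
    "diagnosis_group": "B",
    "age": 57,
    "sex": "F",
}
x = encode_patient(patient)
print(len(x), x)
# → 9 [0.2, 1.0, 0.0, 0.0, 1.0, 0.0, 0.57, 1.0, 0.0]
```

A standard inductive SVM is then trained on such vectors, ignoring any group structure; the Multi-Task Learning and SVM+ settings discussed in the paper instead keep the group membership as side information during training.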

Original language: English (US)
Title of host publication: 2009 International Joint Conference on Neural Networks, IJCNN 2009
Number of pages: 8
State: Published - 2009
Event: 2009 International Joint Conference on Neural Networks, IJCNN 2009 - Atlanta, GA, United States
Duration: Jun 14 2009 - Jun 19 2009

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks


Other: 2009 International Joint Conference on Neural Networks, IJCNN 2009
Country/Territory: United States
City: Atlanta, GA


