Information content and analysis methods for multi-modal high-throughput biomedical data

Bisakha Ray, Mikael Henaff, Sisi Ma, Efstratios Efstathiadis, Eric R. Peskin, Marco Picone, Tito Poli, Constantin F. Aliferis, Alexander Statnikov

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

The spectrum of modern molecular high-throughput assaying includes diverse technologies such as microarray gene expression, miRNA expression, proteomics, and DNA methylation, among many others. Now that these technologies have matured and become increasingly accessible, the next frontier is to collect "multi-modal" data for the same set of subjects and conduct integrative, multi-level analyses. While multi-modal data does contain distinct biological information that can be useful for answering complex biological questions, its value for predicting clinical phenotypes and the contributions of each type of input remain unknown. We obtained 47 datasets/predictive tasks that in total span over 9 data modalities and executed analytic experiments for predicting various clinical phenotypes and outcomes. First, we analyzed each modality separately using uni-modal approaches based on several state-of-the-art supervised classification and feature selection methods. Then, we applied integrative multi-modal classification techniques. We found that gene expression is the most predictively informative modality. Other modalities such as protein expression, miRNA expression, and DNA methylation also provide highly predictive results, which are often statistically comparable to, but not superior to, gene expression data. Integrative multi-modal analyses generally do not increase predictive signal compared to gene expression data.
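The comparison the abstract describes (uni-modal classification per data type, then a multi-modal integration of the same subjects) can be illustrated with a minimal sketch. This is not the authors' actual pipeline: the modality names, dataset sizes, the linear-SVM classifier, the univariate feature selection, and early fusion by feature concatenation are all assumptions chosen for a compact, runnable example on synthetic data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for two modalities measured on the same subjects,
# e.g. gene expression (many features) and miRNA expression (fewer features).
n_subjects = 200
y = rng.integers(0, 2, size=n_subjects)            # binary clinical phenotype
gene_expr = rng.normal(size=(n_subjects, 2000))
mirna_expr = rng.normal(size=(n_subjects, 300))
# Inject a weak signal into a handful of gene-expression features.
gene_expr[:, :20] += y[:, None] * 0.5

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

def cv_auc(X, y):
    """Cross-validated AUC of a feature-selection + SVM pipeline on one data matrix."""
    model = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=50),
                          SVC(kernel="linear", C=1.0))
    return cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()

# Uni-modal analyses: each modality evaluated on its own.
print("gene expression AUC :", round(cv_auc(gene_expr, y), 3))
print("miRNA expression AUC:", round(cv_auc(mirna_expr, y), 3))

# A simple multi-modal integration by feature concatenation (early fusion).
combined = np.hstack([gene_expr, mirna_expr])
print("combined AUC        :", round(cv_auc(combined, y), 3))
```

Comparing the per-modality AUCs against the fused model's AUC mirrors the question posed in the study: whether combining modalities adds predictive signal beyond the single best modality.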

Original language: English (US)
Article number: 4411
Journal: Scientific Reports
Volume: 4
State: Published - Mar 21 2014
