Abstract
Supervised contrastive learning outperforms self-supervised learning on classification tasks by exploiting label information. However, it can collapse the embedding space, because positive samples are drawn at random from the same labeled group and pulled together. In this work, we theoretically guarantee that any pre-training method that preserves a mixture of sub-class distributions consistently outperforms supervised contrastive pre-training. Building on this theoretical analysis, we propose a new pre-training method that adopts an efficient Expectation-Maximization (EM) learning strategy. Finally, we empirically evaluate the proposed method on sepsis prediction using the PhysioNet/Computing in Cardiology Challenge dataset and show that it outperforms the state of the art from various perspectives.
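To make the baseline concrete, below is a minimal NumPy sketch of the standard supervised contrastive (SupCon) loss that the abstract refers to: every same-label pair in a batch is treated as a positive and pulled together, which is the behavior the paper argues can collapse sub-class structure within a class. This is an illustrative implementation of the general SupCon objective, not the authors' proposed EM-based method; the function name and temperature default are our own choices.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Batch supervised contrastive loss over L2-normalized embeddings.

    features: (N, D) array of unit-norm embeddings.
    labels:   (N,) array of class ids; all same-class pairs are positives.
    """
    n = features.shape[0]
    sim = features @ features.T / temperature            # pairwise similarities
    logits_mask = ~np.eye(n, dtype=bool)                 # exclude self-pairs
    # numerically stabilized log-softmax over all other samples
    sim_max = np.max(np.where(logits_mask, sim, -np.inf), axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * logits_mask
    log_prob = sim - sim_max - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label, different index
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask
    pos_counts = pos_mask.sum(axis=1)
    # average negative log-probability over each anchor's positives
    loss_per_anchor = -(log_prob * pos_mask).sum(axis=1) / np.maximum(pos_counts, 1)
    return loss_per_anchor[pos_counts > 0].mean()
```

Because the positive set is the entire labeled group, two samples from distinct sub-classes of the same label are still forced together; a method that instead maintains a mixture of sub-class distributions would restrict or reweight these positives.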
| Original language | English (US) |
|---|---|
| Title of host publication | Proceedings - 2023 IEEE 11th International Conference on Healthcare Informatics, ICHI 2023 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 101-110 |
| Number of pages | 10 |
| ISBN (Electronic) | 9798350302639 |
| DOIs | |
| State | Published - 2023 |
| Externally published | Yes |
| Event | 11th IEEE International Conference on Healthcare Informatics, ICHI 2023 - Houston, United States. Duration: Jun 26 2023 → Jun 29 2023 |
Publication series
| Name | Proceedings - 2023 IEEE 11th International Conference on Healthcare Informatics, ICHI 2023 |
|---|
Conference
| Conference | 11th IEEE International Conference on Healthcare Informatics, ICHI 2023 |
|---|---|
| Country/Territory | United States |
| City | Houston |
| Period | 6/26/23 → 6/29/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- Pre-training
- Self-supervised pre-training
- Supervised contrastive learning