Abstract
The deployment of Federated Learning (FL) systems poses various challenges, such as data heterogeneity and communication efficiency. We focus on a practical FL setup that has recently drawn attention, in which the data distribution on each device is not static but evolves over time. This setup, referred to as Continual Federated Learning (CFL), suffers from catastrophic forgetting, i.e., the undesired forgetting of previous knowledge after learning on new data, an issue not encountered in vanilla FL. In this work, we formally quantify catastrophic forgetting in a CFL setup, establish links to training optimization, and evaluate different episodic replay approaches for CFL on a large-scale, real-world NLP dataset. To the best of our knowledge, this is the first such study of episodic replay for CFL. We show that storing a small set of past data boosts performance and significantly reduces forgetting, providing evidence that carefully designed sampling strategies can lead to further improvements.
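The episodic replay idea described in the abstract, keeping a small fixed-size memory of past examples and mixing them into training on new data, is often implemented with reservoir sampling. The sketch below is illustrative only (the `ReplayBuffer` class and its methods are hypothetical names, not the paper's implementation); it assumes each example seen so far should have an equal chance of remaining in the buffer.

```python
import random


class ReplayBuffer:
    """Fixed-size episodic memory filled via reservoir sampling, so each
    example seen so far has equal probability of being retained.
    Illustrative sketch; not the implementation from the paper."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []      # stored past examples
        self.seen = 0         # total examples observed so far
        self.rng = random.Random(seed)

    def add(self, example):
        """Offer one new example to the buffer (reservoir sampling step)."""
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored example with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        """Draw up to k stored examples to mix into the current batch."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```

During a client's local training round, replayed examples drawn via `sample()` would be concatenated with the current batch before the gradient step, which is what counters forgetting of earlier data distributions.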
Original language | English (US) |
---|---|
Title of host publication | ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing, Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Electronic) | 9781728163277 |
DOIs | |
State | Published - 2023 |
Externally published | Yes |
Event | 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023 - Rhodes Island, Greece Duration: Jun 4 2023 → Jun 10 2023 |
Publication series
Name | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
---|---|
Volume | 2023-June |
ISSN (Print) | 1520-6149 |
Conference
Conference | 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023 |
---|---|
Country/Territory | Greece |
City | Rhodes Island |
Period | 6/4/23 → 6/10/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.