Quantifying Catastrophic Forgetting in Continual Federated Learning

Christophe Dupuy, Jimit Majmudar, Jixuan Wang, Tanya G. Roosta, Rahul Gupta, Clement Chung, Jie Ding, Salman Avestimehr

Research output: Contribution to journal › Conference article › peer-review



The deployment of Federated Learning (FL) systems poses various challenges such as data heterogeneity and communication efficiency. We focus on a practical FL setup that has recently drawn attention, where the data distribution on each device is not static but dynamically evolves over time. This setup, referred to as Continual Federated Learning (CFL), suffers from catastrophic forgetting, i.e., the undesired forgetting of previous knowledge after learning on new data, an issue not encountered with vanilla FL. In this work, we formally quantify catastrophic forgetting in a CFL setup, establish links to training optimization, and evaluate different episodic replay approaches for CFL on a large-scale real-world NLP dataset. To the best of our knowledge, this is the first such study of episodic replay for CFL. We show that storing a small set of past data boosts performance and significantly reduces forgetting, providing evidence that carefully designed sampling strategies can lead to further improvements.
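The abstract describes episodic replay: each device stores a small set of past examples and mixes them into training on new data to mitigate forgetting. The sketch below illustrates one common way to maintain such a memory, a fixed-capacity buffer filled by reservoir sampling; the class and function names, the `replay_fraction` parameter, and the choice of reservoir sampling are illustrative assumptions, not the paper's specific method.

```python
import random


class ReplayBuffer:
    """Fixed-capacity episodic memory filled by reservoir sampling, so every
    example seen so far has an equal chance of being retained.
    (Illustrative sketch; the paper's actual sampling strategy may differ.)"""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.num_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.num_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Keep the new example with probability capacity / num_seen,
            # evicting a uniformly chosen stored example.
            j = self.rng.randrange(self.num_seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))


def mixed_batch(current_batch, buffer, replay_fraction=0.5):
    """Augment the current round's batch with replayed past examples
    before a local training step (hypothetical helper)."""
    k = int(len(current_batch) * replay_fraction)
    return list(current_batch) + buffer.sample(k)
```

In a CFL round, each client would feed its incoming examples through `add`, then train on `mixed_batch` instead of the fresh data alone, trading a small storage cost for reduced forgetting.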

Bibliographical note

Publisher Copyright:
© 2023 IEEE.
