SwiftAgg+: Achieving Asymptotically Optimal Communication Loads in Secure Aggregation for Federated Learning

Tayyebeh Jahani-Nezhad, Mohammad Ali Maddah-Ali, Songze Li, Giuseppe Caire

Research output: Contribution to journal › Article › peer-review



We propose SwiftAgg+, a novel secure aggregation protocol for federated learning systems, in which a central server aggregates, in a privacy-preserving manner, the local models of $N \in \mathbb{N}$ distributed users, each of size $L \in \mathbb{N}$, trained on their local data. SwiftAgg+ significantly reduces communication overheads without compromising security, and achieves the optimal communication loads within diminishing gaps. Specifically, in the presence of at most $D=o(N)$ dropout users, SwiftAgg+ achieves a per-user communication load of $\left(1+\mathcal{O}\left(\frac{1}{N}\right)\right)L$ symbols and a server communication load of $\left(1+\mathcal{O}\left(\frac{1}{N}\right)\right)L$ symbols, with a worst-case information-theoretic security guarantee against any subset of up to $T=o(N)$ semi-honest users who may also collude with the curious server. Moreover, SwiftAgg+ allows for a flexible trade-off between the communication loads and the number of active communication links. In particular, for $T < N-D$ and any $K\in \mathbb{N}$, SwiftAgg+ achieves a server communication load of $\left(1+\frac{T}{K}\right)L$ symbols and a per-user communication load of up to $\left(1+\frac{T+D}{K}\right)L$ symbols, where the number of pair-wise active connections in the network is $\frac{N}{2}(K+T+D+1)$.
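As an illustrative sketch (not part of the paper), the stated trade-off can be evaluated numerically from the closed-form expressions in the abstract: a server load of $(1+T/K)L$, a per-user load of up to $(1+(T+D)/K)L$, and $\frac{N}{2}(K+T+D+1)$ pairwise active connections. The function name and sample parameter values below are hypothetical.

```python
def swiftaggplus_loads(N, L, T, D, K):
    """Hypothetical helper evaluating the communication-load formulas
    stated in the abstract (all loads in symbols)."""
    assert T < N - D and K >= 1  # regime required by the trade-off result
    server_load = (1 + T / K) * L            # (1 + T/K)·L
    per_user_load = (1 + (T + D) / K) * L    # up to (1 + (T+D)/K)·L
    active_links = N * (K + T + D + 1) / 2   # N/2·(K+T+D+1) pairwise links
    return server_load, per_user_load, active_links

# Example (assumed parameters): N=100 users, model size L=1000 symbols,
# T=5 colluding users, D=5 dropouts. Larger K lowers the loads toward L
# at the cost of more active links.
for K in (1, 10, 90):
    s, u, c = swiftaggplus_loads(100, 1000, 5, 5, K)
    print(f"K={K}: server={s:.1f}, per-user={u:.1f}, links={c:.0f}")
```

Note how $K=1$ gives the fewest links but the largest loads, while $K$ close to $N-T-D$ drives both loads toward the model size $L$, matching the asymptotic optimality claim.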

Original language: English (US)
Pages (from-to): 977-989
Number of pages: 13
Journal: IEEE Journal on Selected Areas in Communications
Issue number: 4
State: Published - Apr 1 2023

Bibliographical note

Publisher Copyright:
© 1983-2012 IEEE.


Keywords

  • Federated learning
  • dropout resiliency
  • optimal communication load
  • secret sharing
  • secure aggregation


