FedBCD: A Communication-Efficient Collaborative Learning Framework for Distributed Features

Yang Liu, Xinwei Zhang, Yan Kang, Liping Li, Tianjian Chen, Mingyi Hong, Qiang Yang

Research output: Contribution to journal › Article › peer-review



We introduce a novel federated learning framework that allows multiple parties, each holding a different set of attributes about the same users, to jointly build models without exposing their raw data or model parameters. Conventional federated learning approaches are inefficient for such cross-silo problems because they require the exchange of messages for gradient updates at every iteration, and they raise security concerns over sharing such messages during learning. We propose a Federated Stochastic Block Coordinate Descent (FedBCD) algorithm that allows each party to conduct multiple local updates before each communication, effectively reducing communication overhead. Under a practical security model, we show that parties cannot infer others' exact raw data ('deep leakage') from the collections of messages exchanged in our framework, regardless of the number of communication rounds performed. Further, we provide convergence guarantees and empirical evaluations on a variety of tasks and datasets, demonstrating significant improvements in efficiency.
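
As a concrete illustration of the multiple-local-updates idea described above, the sketch below runs FedBCD-style rounds for two parties holding vertically partitioned features of a logistic-regression model: the parties exchange intermediate messages once per round, then each performs Q local block-coordinate updates in parallel. This is a minimal sketch, not the paper's exact protocol; the Party class, the fedbcd_round helper, the logistic loss, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Party:
    """One party holding a block of features and its own parameter block.
    (Illustrative; the paper's protocol details may differ.)"""
    def __init__(self, features, lr=0.1):
        self.X = features                       # this party's feature block
        self.theta = np.zeros(features.shape[1])
        self.lr = lr

    def partial_logit(self):
        # Intermediate message exchanged instead of raw data or parameters.
        return self.X @ self.theta

    def local_updates(self, others_logit, y, num_local_steps):
        # Q local block-coordinate updates using the (stale) message
        # from the other party; no communication inside this loop.
        for _ in range(num_local_steps):
            logit = self.X @ self.theta + others_logit
            grad = self.X.T @ (sigmoid(logit) - y) / len(y)
            self.theta -= self.lr * grad

def fedbcd_round(party_a, party_b, y, Q=5):
    # One communication round: exchange partial logits once,
    # then both parties update their blocks locally in parallel.
    msg_a, msg_b = party_a.partial_logit(), party_b.partial_logit()
    party_a.local_updates(msg_b, y, Q)
    party_b.local_updates(msg_a, y, Q)

# Toy usage: two parties with disjoint feature columns, shared labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 6))
y = (X.sum(axis=1) > 0).astype(float)
a, b = Party(X[:, :3]), Party(X[:, 3:])
for _ in range(20):
    fedbcd_round(a, b, y, Q=5)
```

With Q = 1 this reduces to the conventional scheme of one gradient update per message exchange; larger Q trades extra local computation for fewer communication rounds, which is the efficiency gain the abstract refers to.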

Original language: English (US)
Pages (from-to): 4277-4290
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
State: Published - 2022

Keywords:
  • cross-silo federated learning
  • data privacy
  • distributed features
  • federated learning
  • federated stochastic block coordinate descent

