DeTrust-FL: Privacy-Preserving Federated Learning in Decentralized Trust Setting

Runhua Xu, Nathalie Baracaldo, Yi Zhou, Ali Anwar, Swanand Kadhe, Heiko Ludwig

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Scopus citations

Abstract

Federated learning has emerged as a privacy-preserving machine learning approach in which multiple parties train a single model without sharing their raw training data. It typically relies on multi-party computation techniques to provide strong privacy guarantees: by ensuring that an untrusted or curious aggregator cannot obtain the isolated replies of individual parties, these techniques prevent potential inference attacks. Until recently, some of these secure aggregation techniques were thought sufficient to fully protect against inference attacks by a curious aggregator. However, recent research has demonstrated that a curious aggregator can successfully launch a disaggregation attack to learn information about the model updates of a target party. This paper presents DeTrust-FL, an efficient privacy-preserving federated learning framework that addresses the lack of transparency enabling isolation attacks, such as disaggregation attacks, during secure aggregation, by ensuring that parties' model updates are included in the aggregated model in a private and secure manner. DeTrust-FL proposes a decentralized trust consensus mechanism and incorporates a recently proposed decentralized functional encryption scheme in which all parties agree on a participation matrix before collaboratively generating decryption key fragments, thereby gaining control and trust over the secure aggregation process in a decentralized setting. Our experimental evaluation demonstrates that DeTrust-FL outperforms state-of-the-art FE-based secure multi-party aggregation solutions in training time and in the volume of data transferred. In contrast to existing approaches, it achieves this without any trust dependency on external trusted entities.
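The role of the agreed participation set can be illustrated with a toy masking-based secure aggregation sketch. This is an illustrative stand-in, not the decentralized functional encryption scheme DeTrust-FL actually uses; the function names, the SHA-256-derived pairwise masks, and the scalar "model updates" below are all assumptions made for the example:

```python
# Toy sketch: pairwise-masked aggregation over a pre-agreed participant set.
# NOT the DMCFE construction from the paper -- purely illustrative.
import hashlib

PRIME = 2**61 - 1  # toy modulus for mask arithmetic

def pairwise_mask(pid, participants, round_seed):
    """Mask that cancels only when ALL agreed participants are summed."""
    mask = 0
    for other in participants:
        if other == pid:
            continue
        lo, hi = min(pid, other), max(pid, other)
        digest = hashlib.sha256(f"{round_seed}:{lo}:{hi}".encode()).digest()
        share = int.from_bytes(digest, "big") % PRIME
        # The lower id adds the shared value, the higher id subtracts it,
        # so each pairwise share appears once with each sign in the full sum.
        mask = (mask + share) % PRIME if pid == lo else (mask - share) % PRIME
    return mask

def encrypt_update(pid, update, participants, round_seed):
    """A party's ciphertext: its update blinded by its pairwise masks."""
    return (update + pairwise_mask(pid, participants, round_seed)) % PRIME

def aggregate(ciphertexts):
    """The aggregator can only recover the SUM over the agreed set."""
    return sum(ciphertexts) % PRIME

# --- one toy round with three parties ---
participants = (1, 2, 3)            # the agreed "participation matrix" row
seed = "round-0"
updates = {1: 10, 2: 20, 3: 30}     # stand-ins for model-update scalars
cts = {p: encrypt_update(p, updates[p], participants, seed)
       for p in participants}

full = aggregate(cts.values())      # masks cancel: equals 10 + 20 + 30
partial = aggregate([cts[1], cts[2]])  # party 3 dropped: residual masks remain
```

Because the mask structure is fixed by the set all parties agreed on, summing any strict subset of ciphertexts leaves unmatched masks in the result, so a curious aggregator cannot quietly isolate one party's update by dropping the others; this is the intuition (only the intuition) behind binding decryption to an agreed participation matrix.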

Original language: English (US)
Title of host publication: Proceedings - 2022 IEEE 15th International Conference on Cloud Computing, CLOUD 2022
Editors: Claudio Agostino Ardagna, Nimanthi Atukorala, Rajkumar Buyya, Carl K. Chang, Rong N. Chang, Ernesto Damiani, Gargi Banerjee Dasgupta, Fabrizio Gagliardi, Christoph Hagleitner, Dejan Milojicic, Tuan M Hoang Trong, Robert Ward, Fatos Xhafa, Jia Zhang
Publisher: IEEE Computer Society
Pages: 417-426
Number of pages: 10
ISBN (Electronic): 9781665481373
DOIs
State: Published - 2022
Externally published: Yes
Event: 15th IEEE International Conference on Cloud Computing, CLOUD 2022 - Barcelona, Spain
Duration: Jul 10 2022 - Jul 16 2022

Publication series

Name: IEEE International Conference on Cloud Computing, CLOUD
Volume: 2022-July
ISSN (Print): 2159-6182
ISSN (Electronic): 2159-6190

Conference

Conference: 15th IEEE International Conference on Cloud Computing, CLOUD 2022
Country/Territory: Spain
City: Barcelona
Period: 7/10/22 - 7/16/22

Bibliographical note

Publisher Copyright:
© 2022 IEEE.

Keywords

  • decentralized functional encryption
  • decentralized trust
  • federated learning
  • privacy-enhanced computing
  • secure multi-party aggregation
