Poster: Exploiting Data Heterogeneity for Performance and Reliability in Federated Learning

Yuanli Wang, Dhruv Kumar, Abhishek Chandra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Federated Learning [1] enables distributed devices to collaboratively learn a shared machine learning model without uploading their private training data. It has received significant attention recently and has been used in mobile applications such as search suggestion [2] and object detection [3]. Federated Learning differs from conventional distributed machine learning in two ways: 1) System heterogeneity: federated learning is usually performed on devices with highly dynamic and heterogeneous network, compute, and power availability. 2) Data heterogeneity (or statistical heterogeneity): data is produced by different users on different devices, and therefore may follow different statistical distributions (non-IID).
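The training loop the abstract describes can be illustrated with a minimal Federated Averaging (FedAvg [1]) sketch. This is an illustrative toy, not the poster's method: the linear-regression task, client data, learning rate, and round counts are all assumptions chosen for clarity. Clients with shifted feature distributions stand in for non-IID data; only model parameters, never raw data, reach the server.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few epochs of gradient descent on one client's private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    """One communication round: each client trains locally; the server
    averages the returned models, weighted by local dataset size."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(w_global, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

# Non-IID setup: each client draws features from a differently shifted
# distribution, mimicking data produced by different users/devices.
clients = []
for shift in (-1.0, 0.0, 1.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    y = X @ w_true + rng.normal(0, 0.01, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
print(w)  # approaches w_true without any client sharing raw data
```

With multiple local epochs per round, heterogeneous clients drift toward their own local optima between averaging steps; here all local optima coincide (same underlying model), so FedAvg still converges, but under stronger statistical heterogeneity this drift is exactly what degrades accuracy and motivates the poster's focus.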

Original language: English (US)
Title of host publication: Proceedings - 2020 IEEE/ACM Symposium on Edge Computing, SEC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 164-166
Number of pages: 3
ISBN (Electronic): 9781728159430
DOIs
State: Published - Nov 2020
Event: 5th IEEE/ACM Symposium on Edge Computing, SEC 2020 - Virtual, San Jose, United States
Duration: Nov 11 2020 - Nov 13 2020

Publication series

Name: Proceedings - 2020 IEEE/ACM Symposium on Edge Computing, SEC 2020

Conference

Conference: 5th IEEE/ACM Symposium on Edge Computing, SEC 2020
Country/Territory: United States
City: Virtual, San Jose
Period: 11/11/20 - 11/13/20

Bibliographical note

Funding Information:
∗This research was supported in part by NSF under grant CNS-1717834.

Publisher Copyright:
© 2020 IEEE.

