REFL: Resource-Efficient Federated Learning

Ahmed M. Abdelmoniem*, Atal Narayan Sahu, Marco Canini, Suhaib A. Fahmy

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations


Federated Learning (FL) enables distributed training by learners using local data, thereby enhancing privacy and reducing communication. However, it presents numerous challenges relating to the heterogeneity of the data distribution, device capabilities, and participant availability as deployments scale, which can impact both model convergence and bias. Existing FL schemes use random participant selection to improve the fairness of the selection process; however, this can result in inefficient use of resources and lower quality training. In this work, we systematically address the question of resource efficiency in FL, showing the benefits of intelligent participant selection, and incorporation of updates from straggling participants. We demonstrate how these factors enable resource efficiency while also improving trained model quality.
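The abstract's two ideas — selecting participants deliberately rather than uniformly at random, and folding in late ("straggler") updates instead of discarding them — can be illustrated with a minimal sketch. Everything below (the availability score, the `alpha**staleness` discount, the scalar model) is an illustrative assumption, not REFL's actual algorithm:

```python
def select_participants(clients, k):
    # Hypothetical availability-aware selection: prefer the k clients with
    # the highest predicted availability score instead of uniform sampling.
    return sorted(clients, key=lambda c: c["availability"], reverse=True)[:k]

def aggregate(global_model, fresh, stale, alpha=0.5):
    """FedAvg-style weighted mean of client deltas (scalar model for brevity).

    Fresh (on-time) updates get full weight; each straggler update is
    discounted by alpha**staleness, an assumed damping rule.
    """
    total_w, total = 0.0, 0.0
    for delta in fresh:
        total += delta
        total_w += 1.0
    for delta, staleness in stale:
        w = alpha ** staleness
        total += w * delta
        total_w += w
    return global_model + total / total_w if total_w else global_model

# One illustrative round: two on-time deltas plus one straggler 2 rounds late.
model = aggregate(0.0, fresh=[1.0, 3.0], stale=[(2.0, 2)], alpha=0.5)
# weights 1 + 1 + 0.25 = 2.25; weighted sum 1 + 3 + 0.5 = 4.5; step = 2.0
```

The discounting choice matters: dropping stragglers wastes the resources they already spent, while counting stale updates at full weight can pull the model toward outdated gradients; an exponential staleness discount is one common compromise.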

Original language: English (US)
Title of host publication: Proceedings of the 18th European Conference on Computer Systems, EuroSys 2023
Publisher: Association for Computing Machinery, Inc
Number of pages: 18
ISBN (Electronic): 9781450394871
State: Published - May 8 2023
Event: 18th European Conference on Computer Systems, EuroSys 2023 - Rome, Italy
Duration: May 8 2023 - May 12 2023


Bibliographical note

Publisher Copyright:
© 2023 Copyright held by the owner/author(s). Publication rights licensed to ACM.

ASJC Scopus subject areas

  • Information Systems
  • Hardware and Architecture


