Abstract
We present a theoretical study of server-side optimization in federated learning. Our results are the first to show that the widely used heuristic of scaling the client updates with an extra parameter is beneficial in the context of Federated Averaging (FedAvg) with local passes over the client data. Each local pass is performed without replacement using Random Reshuffling, which is a key reason we can show improved complexities. In particular, we prove that whenever the local stepsizes are small and the update direction is given by FedAvg in conjunction with Random Reshuffling over all clients, one can take a big leap in the obtained direction and improve the rates for convex, strongly convex, and non-convex objectives. Notably, in the non-convex regime this enhances the rate of convergence from O(ε⁻³) to O(ε⁻²). This result is new even for Random Reshuffling performed on a single node. In contrast, if the local stepsizes are large, we prove that the noise of client sampling can be controlled by using a small server-side stepsize. To the best of our knowledge, this is the first time that local steps provably help to overcome the communication bottleneck. Together, our results on the advantages of large and small server-side stepsizes give a formal justification for the practice of adaptive server-side optimization in federated learning. Moreover, we consider a variant of our algorithm that supports partial client participation, which makes the method more practical.
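The scheme described in the abstract can be illustrated with a minimal sketch: each client runs one local epoch of SGD without replacement (Random Reshuffling) starting from the server model, and the server then rescales the averaged client update by a separate server-side stepsize. This is an illustrative toy implementation on quadratic losses, not the authors' code; all names (`local_lr`, `server_lr`, `fedavg_rr`, etc.) are assumptions made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: client i holds per-sample losses f_ij(x) = 0.5 * (a_ij @ x - b_ij)^2.
num_clients, samples_per_client, dim = 5, 8, 3
A = rng.normal(size=(num_clients, samples_per_client, dim))
b = rng.normal(size=(num_clients, samples_per_client))

def local_pass(x, Ai, bi, local_lr):
    """One epoch of SGD over a client's data without replacement (Random Reshuffling)."""
    for j in rng.permutation(len(bi)):           # fresh permutation each epoch
        grad = (Ai[j] @ x - bi[j]) * Ai[j]       # gradient of one sample's loss
        x = x - local_lr * grad
    return x

def fedavg_rr(rounds=200, local_lr=0.01, server_lr=1.0):
    x = np.zeros(dim)
    for _ in range(rounds):
        # Each client starts from the current server model and does a local pass.
        updates = [local_pass(x, A[i], b[i], local_lr) - x for i in range(num_clients)]
        # Server takes a (possibly large) step in the averaged update direction.
        x = x + server_lr * np.mean(updates, axis=0)
    return x

def full_loss(x):
    """Average loss over all samples of all clients."""
    return 0.5 * np.mean((A.reshape(-1, dim) @ x - b.ravel()) ** 2)

x_plain = fedavg_rr(server_lr=1.0)
x_big_step = fedavg_rr(server_lr=3.0)  # "big leap" paired with small local stepsizes
print(full_loss(np.zeros(dim)), full_loss(x_plain), full_loss(x_big_step))
```

With small `local_lr`, setting `server_lr > 1` mimics the "big leap" regime the abstract analyzes, while a `server_lr < 1` would correspond to damping client-sampling noise when local stepsizes are large.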
Original language | English (US) |
---|---|
Title of host publication | DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning |
Publisher | Association for Computing Machinery, Inc |
Pages | 85-104 |
Number of pages | 20 |
ISBN (Electronic) | 9798400704475 |
DOIs | |
State | Published - Dec 8 2023 |
Event | 4th International Workshop on Distributed Machine Learning, DistributedML 2023 - Paris, France Duration: Dec 8 2023 → … |
Publication series
Name | DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning |
---|---|
Conference
Conference | 4th International Workshop on Distributed Machine Learning, DistributedML 2023 |
---|---|
Country/Territory | France |
City | Paris |
Period | 12/8/23 → … |
Bibliographical note
Publisher Copyright: © 2023 Owner/Author.
Keywords
- distributed optimization
- federated learning
ASJC Scopus subject areas
- Computer Networks and Communications
- Computer Science Applications
- Hardware and Architecture