Tighter Theory for Local SGD on Identical and Heterogeneous Data

Ahmed Khaled, Konstantin Mishchenko, Peter Richtarik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We provide a new analysis of local SGD, removing unnecessary assumptions and elaborating on the difference between two data regimes: identical and heterogeneous. In both cases, we improve the existing theory and provide values of the optimal stepsize and optimal number of local iterations. Our bounds are based on a new notion of variance that is specific to local SGD methods with different data. The tightness of our results is guaranteed by recovering known statements when we plug in H = 1, where H is the number of local steps. The empirical evidence further validates the severe impact of data heterogeneity on the performance of local SGD.
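The local SGD scheme analyzed in the abstract can be illustrated with a minimal sketch: each of M workers runs H local SGD steps, after which the local iterates are averaged in one communication round. The objective, noise model, and all parameter values below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def local_sgd(grad, x0, n_workers=4, n_rounds=20, n_local=10, stepsize=0.1, seed=0):
    """Minimal local SGD sketch: every worker takes `n_local` (H) local SGD
    steps, then all local iterates are averaged (one communication round)."""
    rng = np.random.default_rng(seed)
    # Each worker starts from the same point x0.
    x = np.tile(np.asarray(x0, dtype=float), (n_workers, 1))
    for _ in range(n_rounds):
        for m in range(n_workers):
            for _ in range(n_local):
                # Stochastic gradient = true gradient + small noise
                # (mimicking the identical-data regime).
                g = grad(x[m]) + 0.01 * rng.standard_normal(x[m].shape)
                x[m] -= stepsize * g
        # Synchronization step: replace every local iterate by the average.
        x[:] = x.mean(axis=0)
    return x[0]

# Example: minimize f(x) = ||x||^2 / 2, whose gradient is x itself.
x_final = local_sgd(lambda v: v, np.ones(3))
```

Setting `n_local=1` recovers ordinary parallel (minibatch) SGD with averaging after every step, which mirrors the H = 1 consistency check mentioned in the abstract.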
Original language: English (US)
Title of host publication: NeurIPS 2019 Federated Learning Workshop
Number of pages: 10
State: Published - 2020

Bibliographical note

KAUST Repository Item: Exported on 2021-09-02


