PERMUTATION COMPRESSORS FOR PROVABLY FASTER DISTRIBUTED NONCONVEX OPTIMIZATION

Research output: Contribution to conference › Paper › peer-review


Abstract

We study the MARINA method of Gorbunov et al. (2021), the current state-of-the-art distributed nonconvex optimization method in terms of theoretical communication complexity. The theoretical superiority of this method can be largely attributed to two sources: the use of a carefully engineered biased stochastic gradient estimator, which leads to a reduction in the number of communication rounds, and the reliance on independent stochastic communication compression operators, which leads to a reduction in the number of transmitted bits within each communication round. In this paper we i) extend the theory of MARINA to support a much wider class of potentially correlated compressors, extending the reach of the method beyond the classical independent-compressors setting, ii) show that a new quantity, for which we coin the name Hessian variance, allows us to significantly refine the original analysis of MARINA without any additional assumptions, and iii) identify a special class of correlated compressors based on the idea of random permutations, for which we coin the term PermK. Its use leads to an O(√n) (resp. O(1 + d/√n)) improvement in the theoretical communication complexity of MARINA in the low Hessian variance regime when d ≥ n (resp. d ≤ n), where n is the number of workers and d is the number of parameters describing the model we are learning. We corroborate our theoretical results with carefully engineered synthetic experiments on minimizing the average of nonconvex quadratics, and with autoencoder training on the MNIST dataset.
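As a rough illustration of the permutation idea described above, the following is a minimal sketch (not the authors' code) of a PermK-style correlated compressor for the case d ≥ n with n dividing d: a random permutation shared by all workers partitions the d coordinates into n disjoint blocks, and worker i transmits only its block, scaled by n so that the average of the compressed vectors matches the average of the originals when the workers hold the same vector. The function name `perm_k` and its signature are our assumptions for illustration.

```python
import numpy as np

def perm_k(x_list, rng):
    """Sketch of a PermK-style compressor (assumes d >= n and n | d).

    A shared random permutation splits the d coordinates into n
    disjoint blocks; worker i keeps only its block, scaled by n.
    Because the blocks are disjoint and cover all coordinates, the
    compressors are correlated across workers, and each worker
    transmits only d/n coordinates per round.
    """
    n = len(x_list)
    d = x_list[0].shape[0]
    assert d % n == 0, "sketch assumes n divides d"
    perm = rng.permutation(d)              # shared randomness across workers
    block = d // n
    compressed = []
    for i, x in enumerate(x_list):
        idx = perm[i * block:(i + 1) * block]
        c = np.zeros(d)
        c[idx] = n * x[idx]                # keep block i, scale by n
        compressed.append(c)
    return compressed
```

For example, if all n workers hold the same vector x, averaging the n compressed vectors recovers x exactly, since each coordinate is kept by exactly one worker with scale n.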

Original language: English (US)
State: Published - 2022
Event: 10th International Conference on Learning Representations, ICLR 2022 - Virtual, Online
Duration: Apr 25, 2022 to Apr 29, 2022

Conference

Conference: 10th International Conference on Learning Representations, ICLR 2022
City: Virtual, Online
Period: 04/25/22 to 04/29/22

Bibliographical note

Funding Information:
The work of Rafał Szlendak was performed during a Summer research internship in the Optimization and Machine Learning Lab at KAUST led by Peter Richtárik. Rafał Szlendak is an undergraduate student at the University of Warwick, United Kingdom.

Publisher Copyright:
© 2022 ICLR 2022 - 10th International Conference on Learning Representations. All rights reserved.

ASJC Scopus subject areas

  • Language and Linguistics
  • Computer Science Applications
  • Education
  • Linguistics and Language
