FedNL: Making Newton-Type Methods Applicable to Federated Learning

Mher Safaryan*, Rustem Islamov, Xun Qian, Peter Richtárik

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

14 Scopus citations

Abstract

Inspired by recent work of Islamov et al. (2021), we propose a family of Federated Newton Learn (FedNL) methods, which we believe is a marked step in the direction of making second-order methods applicable to FL. In contrast to the aforementioned work, FedNL employs a different Hessian learning technique which i) enhances privacy as it does not rely on the training data being revealed to the coordinating server, ii) makes it applicable beyond generalized linear models, and iii) provably works with general contractive compression operators for compressing the local Hessians, such as Top-K or Rank-R, which are vastly superior in practice. Notably, we do not need to rely on error feedback for our methods to work with contractive compressors. Moreover, we develop FedNL-PP, FedNL-CR and FedNL-LS, which are variants of FedNL that support partial participation, and globalization via cubic regularization and line search, respectively, and FedNL-BC, a variant that can further benefit from bidirectional compression of gradients and models, i.e., smart uplink gradient and smart downlink model compression. We prove local convergence rates that are independent of the condition number, the number of training data points, and compression variance. Our communication-efficient Hessian learning technique provably learns the Hessian at the optimum. Finally, we perform a variety of numerical experiments showing that our FedNL methods have state-of-the-art communication complexity when compared to key baselines.
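The Hessian learning idea in the abstract — each device maintains a local Hessian estimate and sends only a compressed correction toward the true Hessian — can be sketched as follows. This is a minimal single-device illustration under our own assumptions, not the authors' implementation: the update rule H ← H + α·C(∇²f(x) − H) with a contractive compressor C follows the generic pattern the abstract describes, and the function names and the toy quadratic "Hessian" are ours.

```python
import numpy as np

def top_k(M, k):
    """Top-K contractive compressor: keep the k largest-magnitude
    entries of the matrix M and zero out the rest."""
    flat = M.flatten()
    idx = np.argsort(np.abs(flat))[-k:]  # indices of the k largest entries
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(M.shape)

def hessian_learning_step(H, local_hessian, alpha, k):
    """One Hessian-learning update in the FedNL style (a sketch):
    H <- H + alpha * C(local_hessian - H), with C = Top-K.
    Only the compressed correction would be communicated."""
    return H + alpha * top_k(local_hessian - H, k)

# Toy demo: with a fixed target Hessian, the estimate H is driven
# toward it even though each step transmits only k entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
true_H = A @ A.T                  # symmetric PSD stand-in for a local Hessian
H = np.zeros((5, 5))
for _ in range(200):
    H = hessian_learning_step(H, true_H, alpha=1.0, k=5)
print(np.linalg.norm(H - true_H))  # → essentially zero
```

With a stationary target and step size 1, each Top-K step zeroes the largest remaining entries of the error, so the estimate matches the target exactly after a few rounds; in the actual method the target Hessian moves with the iterate, which is why the paper's local convergence analysis is needed.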

Original language: English (US)
Pages: 18959-19010
Number of pages: 52
State: Published - 2022
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: Jul 17, 2022 – Jul 23, 2022

Conference

Conference: 39th International Conference on Machine Learning, ICML 2022
Country/Territory: United States
City: Baltimore
Period: 07/17/22 – 07/23/22

Bibliographical note

Publisher Copyright:
Copyright © 2022 by the author(s)

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
