Faster On-Device Training Using New Federated Momentum Algorithm

Zhouyuan Huo, Qian Yang, Bin Gu, Heng Huang, Lawrence Carin

Research output: Contribution to journal › Article › peer-review



Mobile crowdsensing has gained significant attention in recent years and has become a critical paradigm for emerging Internet of Things applications. Sensing devices continuously generate large quantities of data, which provide tremendous opportunities to develop innovative intelligent applications. To train machine learning models on these data without compromising user privacy, federated learning has become a promising solution. However, there is little understanding of whether federated learning algorithms are guaranteed to converge. We reconsider model averaging in federated learning and formulate it as a gradient-based method with biased gradients. This novel perspective facilitates analysis of its convergence rate and opens a new direction for further acceleration. We prove for the first time that the federated averaging algorithm is guaranteed to converge for non-convex problems, without imposing additional assumptions. We further propose a novel accelerated federated learning algorithm and provide a convergence guarantee. Simulated federated learning experiments are conducted to train deep neural networks on benchmark datasets, and experimental results show that our proposed method converges faster than previous approaches.
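The abstract's key observation, that federated averaging can be rewritten as a gradient-based server update with a biased pseudo-gradient, can be sketched in a few lines. The sketch below is an illustrative toy, not the authors' exact algorithm: the update `w - mean(local_models)` plays the role of the pseudo-gradient, and a server-side momentum buffer (heavy-ball style) accelerates it, in the spirit of the proposed federated momentum method. All function names, learning rates, and the quadratic client objectives are hypothetical choices for this example.

```python
import numpy as np

def local_sgd(w, grad_fn, lr=0.1, steps=5):
    """Run a few SGD steps on one client's objective (hypothetical grad_fn)."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return w

def server_round(w, client_grads, momentum_buf, beta=0.9, server_lr=1.0):
    """One communication round with server-side momentum on the pseudo-gradient."""
    # Each client starts from the current global model and runs local SGD.
    local_models = [local_sgd(w, g) for g in client_grads]
    # Federated averaging as a gradient step: the difference between the
    # global model and the average of local models acts as a (biased) gradient.
    g = w - np.mean(local_models, axis=0)
    # Heavy-ball momentum on the pseudo-gradient.
    momentum_buf = beta * momentum_buf + g
    w_new = w - server_lr * momentum_buf
    return w_new, momentum_buf

# Toy example: two clients minimize quadratics 0.5*||w - t||^2 with
# different optima t, so their gradients are simply w - t.
targets = [np.array([1.0, -1.0]), np.array([3.0, 1.0])]
grads = [lambda w, t=t: w - t for t in targets]

w = np.zeros(2)
buf = np.zeros(2)
for _ in range(100):
    w, buf = server_round(w, grads, buf)
# w approaches the average of the client optima, roughly [2.0, 0.0].
```

With plain averaging (`beta=0`) the same loop still converges, but the momentum buffer accumulates the pseudo-gradient across rounds and typically reaches the neighborhood of the solution in fewer communication rounds, which is the kind of acceleration the abstract claims.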
Original language: English (US)
Journal: arXiv preprint
State: Published - Feb 6 2020
Externally published: Yes


  • cs.LG
  • cs.DC
  • stat.ML


