Variance reduction in stochastic particle-optimization sampling

Jianyi Zhang, Yang Zhao, Ruiyi Zhang, Lawrence Carin, Changyou Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Stochastic particle-optimization sampling (SPOS) is a recently developed scalable Bayesian sampling framework that unifies stochastic gradient MCMC (SG-MCMC) and Stein variational gradient descent (SVGD) algorithms based on Wasserstein gradient flows. With a rigorous non-asymptotic convergence theory developed, SPOS avoids the particle-collapsing pitfall of SVGD. However, the variance-reduction effect in SPOS has remained unclear. In this paper, we address this gap by presenting several variance-reduction techniques for SPOS. Specifically, we propose three variants of variance-reduced SPOS, called SAGA particle-optimization sampling (SAGA-POS), SVRG particle-optimization sampling (SVRG-POS), and a variant of SVRG-POS that avoids full gradient computations, denoted SVRG-POS+. Importantly, we provide non-asymptotic convergence guarantees for these algorithms in terms of the 2-Wasserstein metric and analyze their complexities. The results show that our algorithms yield better convergence rates than existing variance-reduced variants of stochastic Langevin dynamics, though more space is required to store the particles during training. Our theory aligns well with experimental results on both synthetic and real datasets.
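The SVRG-style estimator underlying SVRG-POS can be illustrated with a short sketch. The toy Gaussian model, the function names, and the step sizes below are illustrative assumptions, not the paper's implementation, and the SPOS particle-interaction (repulsion) term is omitted for brevity.

```python
import numpy as np

def avg_grad(theta, xs):
    # Average gradient of log N(x | theta, I) over data points xs (toy model).
    return np.mean(xs - theta, axis=0)

def svrg_grad(theta, snapshot, full_grad, batch):
    # SVRG estimator: unbiased for avg_grad(theta, data), with variance
    # that shrinks as theta approaches the snapshot point.
    return avg_grad(theta, batch) - avg_grad(snapshot, batch) + full_grad

rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=(1000, 2))  # synthetic data centered at 1

particles = rng.normal(size=(10, 2))
snapshot = particles.copy()  # in full SVRG this is refreshed every epoch;
                             # SVRG-POS+ avoids the full-gradient pass
full_grads = np.stack([avg_grad(s, data) for s in snapshot])

step = 1e-2
for t in range(500):
    batch = data[rng.choice(len(data), size=32, replace=False)]
    for i in range(len(particles)):
        g = svrg_grad(particles[i], snapshot[i], full_grads[i], batch)
        # Langevin-style update: variance-reduced drift plus injected noise.
        particles[i] += step * g + np.sqrt(2 * step) * rng.normal(size=2)
```

Because the control variate `avg_grad(snapshot, batch) - full_grad` cancels much of the minibatch noise near the snapshot, each particle's update behaves closer to a full-gradient Langevin step at minibatch cost, which is the variance-reduction effect the paper quantifies.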

Original language: English (US)
Title of host publication: 37th International Conference on Machine Learning, ICML 2020
Editors: Hal Daumé III, Aarti Singh
Publisher: International Machine Learning Society (IMLS)
Pages: 11244-11253
Number of pages: 10
ISBN (Electronic): 9781713821120
State: Published - 2020
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual, Online
Duration: Jul 13 2020 - Jul 18 2020

Publication series

Name: 37th International Conference on Machine Learning, ICML 2020
Volume: PartF168147-15

Conference

Conference: 37th International Conference on Machine Learning, ICML 2020
City: Virtual, Online
Period: 07/13/20 - 07/18/20

Bibliographical note

Funding Information:
The research performed at Duke University was supported in part by DARPA, DOE, NSF and ONR.

Publisher Copyright:
© 2020 by the Authors. All rights reserved.


ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Human-Computer Interaction
  • Software
