Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates

Adil Salim, Dmitry Kovalev, Peter Richtárik

Research output: Contribution to conference › Paper › peer-review

10 Scopus citations

Abstract

We propose a new algorithm, the Stochastic Proximal Langevin Algorithm (SPLA), for sampling from a log-concave distribution. Our method generalizes the Langevin algorithm to potentials expressed as the sum of one stochastic smooth term and multiple stochastic nonsmooth terms. In each iteration, our splitting technique requires access only to a stochastic gradient of the smooth term and a stochastic proximal operator for each of the nonsmooth terms. We establish nonasymptotic sublinear and linear convergence rates under convexity and strong convexity of the smooth term, respectively, expressed in terms of the KL divergence and the Wasserstein distance. We illustrate the efficiency of our sampling technique through numerical simulations on a Bayesian learning task.
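To make the splitting idea concrete, here is a minimal sketch of a proximal Langevin iteration as described in the abstract: a gradient step on the smooth term, injected Gaussian noise, then sequential proximal steps on each nonsmooth term. The toy potential, step size, and update order are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def spla_step(x, grad_f, proxes, gamma, rng):
    """One sketch iteration of a proximal Langevin scheme.

    Gradient step on the smooth term, Langevin noise, then sequential
    proximal steps on the nonsmooth terms. (Reconstructed from the
    abstract; the paper's exact update may differ.)
    """
    y = x - gamma * grad_f(x) + np.sqrt(2.0 * gamma) * rng.standard_normal(x.shape)
    for prox in proxes:
        y = prox(y, gamma)
    return y

# Toy potential: U(x) = 0.5*||x||^2 + lam*||x||_1 (smooth quadratic + L1 term).
grad_f = lambda x: x  # gradient of the smooth quadratic term

def prox_l1(x, gamma, lam=0.5):
    # Proximal operator of gamma*lam*||.||_1: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - gamma * lam, 0.0)

rng = np.random.default_rng(0)
x = np.zeros(2)
samples = []
for _ in range(2000):
    x = spla_step(x, grad_f, [prox_l1], gamma=0.1, rng=rng)
    samples.append(x.copy())
samples = np.array(samples)
print(samples.mean(axis=0))  # roughly centered near zero by symmetry of U
```

With several nonsmooth terms, the loop over `proxes` applies each stochastic proximal operator in turn, which is the splitting the abstract refers to.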

Original language: English (US)
State: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: Dec 8, 2019 to Dec 14, 2019

Conference

Conference: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Country/Territory: Canada
City: Vancouver
Period: 12/8/19 to 12/14/19

Bibliographical note

Publisher Copyright:
© 2019 Neural information processing systems foundation. All rights reserved.

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
