Convergence of Stein Variational Gradient Descent under a Weaker Smoothness Condition

Research output: Contribution to conference › Paper › peer-review

Abstract

Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form π(x) ∝ exp(−V(x)). In the existing theory of Langevin-type algorithms and SVGD, the potential function V is often assumed to be L-smooth. However, this restrictive condition excludes a large class of potential functions, such as polynomials of degree greater than 2. Our paper studies the convergence of the SVGD algorithm in the population limit for distributions with (L0, L1)-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. (2019a) for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the KL divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
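
For context, the object of the analysis is the standard SVGD particle update. The sketch below is a minimal NumPy illustration, not the authors' code: the paper studies the population (infinite-particle) limit, whereas this uses finitely many particles, and the potential, kernel bandwidth, step size, and particle count are illustrative choices. The example potential V(x) = 0.25·||x||^4 is a degree-4 polynomial, so it is not L-smooth, but (assuming the Zhang et al. (2019a) definition ||∇²V(x)|| ≤ L0 + L1·||∇V(x)||) it is (L0, L1)-smooth.

import numpy as np

# Hypothetical example potential: V(x) = 0.25 * ||x||^4 (degree-4 polynomial).
# Its Hessian norm (about 3*||x||^2) is unbounded, so V is not L-smooth, yet it is
# bounded by 3 + 3*||grad V(x)|| since ||grad V(x)|| = ||x||^3, so V is (L0, L1)-smooth.
def grad_V(x):
    return np.dot(x, x) * x                        # gradient of 0.25*||x||^4, x of shape (d,)

def rbf_kernel(X, h):
    # Pairwise RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradient w.r.t. x_i.
    diffs = X[:, None, :] - X[None, :, :]          # (n, n, d), diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / h)     # (n, n)
    gradK = -2.0 / h * diffs * K[..., None]        # (n, n, d), grad_{x_i} k(x_i, x_j)
    return K, gradK

def svgd_step(X, step, h=1.0):
    # One SVGD iteration on particles X of shape (n, d), targeting pi ∝ exp(-V).
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    score = -np.apply_along_axis(grad_V, 1, X)     # grad log pi(x_j) = -grad V(x_j)
    # phi[i] = (1/n) * sum_j [ k(x_j, x_i) * score[j] + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ score + gradK.sum(axis=0)) / n
    return X + step * phi

# Illustrative usage: 200 particles in 2D, a small constant step size.
X = np.random.default_rng(0).normal(size=(200, 2))
for _ in range(500):
    X = svgd_step(X, step=1e-2)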

Original language: English (US)
Pages: 3693-3717
Number of pages: 25
State: Published - 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: Apr 25 2023 → Apr 27 2023

Conference

Conference: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023
Country/Territory: Spain
City: Valencia
Period: 04/25/23 → 04/27/23

Bibliographical note

Publisher Copyright:
Copyright © 2023 by the author(s)

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
