Abstract
Stein Variational Gradient Descent (SVGD) is an important alternative to Langevin-type algorithms for sampling from probability distributions of the form π(x) ∝ exp(−V(x)). In the existing theory of Langevin-type algorithms and SVGD, the potential function V is often assumed to be L-smooth. This restrictive condition, however, excludes a large class of potential functions, such as polynomials of degree greater than 2. Our paper studies the convergence of the SVGD algorithm in the population limit for distributions with (L0, L1)-smooth potentials. This relaxed smoothness assumption was introduced by Zhang et al. (2019a) for the analysis of gradient clipping algorithms. With the help of trajectory-independent auxiliary conditions, we provide a descent lemma establishing that the algorithm decreases the KL divergence at each iteration, and we prove a complexity bound for SVGD in the population limit in terms of the Stein Fisher information.
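For context: (L0, L1)-smoothness in the sense of Zhang et al. (2019a) requires ‖∇²V(x)‖ ≤ L0 + L1‖∇V(x)‖, which a potential like V(x) = x⁴ satisfies even though it is not L-smooth. The paper analyzes SVGD in the population limit; as an illustrative companion only, the sketch below implements the standard finite-particle SVGD update of Liu and Wang with an RBF kernel. The kernel choice, bandwidth h, step size, and the Gaussian toy target are assumptions made here for illustration, not the paper's setting.

```python
import numpy as np

def rbf_kernel(x, y, h=1.0):
    """RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)) and its gradient in x."""
    diff = x - y
    k = np.exp(-np.dot(diff, diff) / (2.0 * h**2))
    grad_k = -(diff / h**2) * k  # gradient of k with respect to the first argument
    return k, grad_k

def svgd_step(particles, grad_log_pi, h=1.0, step=0.1):
    """One SVGD update: move every particle along the kernelized Stein direction."""
    n = len(particles)
    updated = np.copy(particles)
    for i in range(n):
        phi = np.zeros_like(particles[i])
        for j in range(n):
            k, grad_k = rbf_kernel(particles[j], particles[i], h)
            # Driving term (kernel-weighted score of pi) plus repulsive term
            # (kernel gradient), which keeps the particles spread out.
            phi += k * grad_log_pi(particles[j]) + grad_k
        updated[i] = particles[i] + step * phi / n
    return updated

# Toy target (illustrative assumption): standard Gaussian, V(x) = ||x||^2 / 2,
# so grad log pi(x) = -x.
rng = np.random.default_rng(0)
particles = 3.0 * rng.normal(size=(50, 2))
for _ in range(200):
    particles = svgd_step(particles, lambda x: -x)
```

In the population limit studied by the paper, the empirical average over particles is replaced by an expectation over the current distribution, and the descent lemma tracks the decrease of the KL divergence to π along these updates.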
Original language | English (US)
---|---
Pages | 3693-3717
Number of pages | 25
State | Published - 2023
Event | 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration | Apr 25 2023 → Apr 27 2023
Conference
Conference | 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023
---|---
Country/Territory | Spain
City | Valencia
Period | 04/25/23 → 04/27/23
Bibliographical note
Publisher Copyright: Copyright © 2023 by the author(s)
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability