Abstract
Recent advances in the scalability and flexibility of variational inference have made it successful at uncovering hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel update rule for the variational parameters with a high signal-to-variance ratio. We discuss model-selection issues that arise with existing evidence-lower-bound-based variational inference procedures, and show how the flexibility of our new formulation can be leveraged to address them. Empirical evidence is provided to validate our claims.
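For context, the two special cases named in the abstract have standard forms. The following is a brief sketch using conventional notation (model $p(x, z)$, variational posterior $q(z \mid x)$, $K$ importance samples, Rényi order $\alpha$), not the paper's own generalized bound:

$$
\mathcal{L}_K(x) = \mathbb{E}_{z_1,\dots,z_K \sim q(z \mid x)}\!\left[\log \frac{1}{K}\sum_{k=1}^{K}\frac{p(x, z_k)}{q(z_k \mid x)}\right],
\qquad
\mathcal{L}_{\alpha}(x) = \frac{1}{1-\alpha}\,\log\,\mathbb{E}_{z \sim q(z \mid x)}\!\left[\left(\frac{p(x, z)}{q(z \mid x)}\right)^{1-\alpha}\right].
$$

Setting $K = 1$ in the importance-weighted bound, or letting $\alpha \to 1$ in the Rényi bound, recovers the standard evidence lower bound (ELBO); a bound that subsumes both therefore generalizes along these two axes simultaneously.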
| Original language | English (US) |
|---|---|
| Title of host publication | 35th International Conference on Machine Learning, ICML 2018 |
| Publisher | International Machine Learning Society (IMLS) |
| Pages | 1419-1435 |
| Number of pages | 17 |
| ISBN (Print) | 9781510867963 |
| State | Published - Jan 1 2018 |
| Externally published | Yes |