Variational inference and model selection with generalized evidence bounds

Chenyang Tao, Liqun Chen, Ruiyi Zhang, Ricardo Henao, Lawrence Carin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

15 Scopus citations

Abstract

Recent advances in the scalability and flexibility of variational inference have made it successful at unravelling hidden patterns in complex data. In this work we propose a new variational bound formulation, yielding an estimator that extends beyond the conventional variational bound. It naturally subsumes the importance-weighted and Rényi bounds as special cases, and it is provably sharper than these counterparts. We also present an improved estimator for variational learning, and advocate a novel high signal-to-variance ratio update rule for the variational parameters. We discuss model-selection issues associated with existing evidence-lower-bound-based variational inference procedures, and show how to leverage the flexibility of our new formulation to address them. Empirical evidence is provided to validate our claims.
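To make the bounds mentioned above concrete, the sketch below estimates the K-sample importance-weighted bound, one of the special cases the paper's generalized formulation subsumes, on a toy conjugate-Gaussian model where the true log-evidence is known in closed form. The model, the mismatched variational distribution, and all parameter values are illustrative assumptions, not from the paper; with K = 1 the estimator reduces to the standard ELBO, and larger K yields a provably tighter bound.

```python
import numpy as np

# Toy model (assumed for illustration, not from the paper):
#   prior       p(z)   = N(0, 1)
#   likelihood  p(x|z) = N(z, 1)
# so the evidence is p(x) = N(x; 0, 2) and the exact posterior is N(x/2, 1/2).

def log_normal(x, mean, var):
    """Log-density of a univariate Gaussian N(mean, var)."""
    return -0.5 * (np.log(2.0 * np.pi * var) + (x - mean) ** 2 / var)

def iw_bound(x, k, rng, mu_q, var_q):
    """K-sample importance-weighted bound: log( (1/K) sum_k w_k ),
    where w_k = p(x, z_k) / q(z_k | x) and z_k ~ q.  K = 1 gives the ELBO."""
    z = rng.normal(mu_q, np.sqrt(var_q), size=k)
    log_w = (log_normal(z, 0.0, 1.0)        # log p(z)
             + log_normal(x, z, 1.0)        # log p(x|z)
             - log_normal(z, mu_q, var_q))  # - log q(z|x)
    m = log_w.max()                         # log-sum-exp for numerical stability
    return m + np.log(np.mean(np.exp(log_w - m)))

x = 1.0
true_log_evidence = log_normal(x, 0.0, 2.0)
rng = np.random.default_rng(0)
# Deliberately mismatched q(z|x) = N(x/2 + 0.5, 1.0), so the ELBO has a visible gap.
mu_q, var_q = x / 2.0 + 0.5, 1.0
mean_elbo = np.mean([iw_bound(x, 1, rng, mu_q, var_q) for _ in range(1000)])
mean_iw100 = np.mean([iw_bound(x, 100, rng, mu_q, var_q) for _ in range(1000)])
```

Averaged over replicates, the K = 1 estimate sits below the true log-evidence by roughly the KL gap, while the K = 100 estimate is much closer to it, illustrating the sharpening-with-K behavior that motivates generalized evidence bounds.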
Original language: English (US)
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Publisher: International Machine Learning Society (IMLS)
Pages: 1419-1435
Number of pages: 17
ISBN (Print): 9781510867963
State: Published - Jan 1 2018
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2021-02-09
