Abstract
Recent advances in stochastic gradient techniques have made it possible to estimate posterior distributions from large datasets via Markov Chain Monte Carlo (MCMC). However, when the target posterior is multimodal, mixing is often poor, resulting in inadequate exploration of the posterior distribution. A framework based on Hamiltonian Monte Carlo is proposed to improve the sampling efficiency of stochastic gradient MCMC. It leverages a generalized kinetic function, which delivers superior mixing at stationarity, especially for multimodal distributions. Techniques are also discussed to overcome the practical issues introduced by this generalization. The proposed approach is shown to better explore complex multimodal posterior distributions, as demonstrated on multiple applications and in comparison with other stochastic gradient MCMC methods.
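The abstract describes replacing the standard Gaussian kinetic energy of Hamiltonian Monte Carlo with a generalized kinetic function inside a stochastic gradient sampler. The sketch below is a hypothetical illustration of that idea, not the paper's algorithm: it plugs a monomial kinetic energy K(p) = |p|^a / a into an SGHMC-style update on a toy bimodal posterior. The exponent `a`, step size, friction, noise model, and the toy target are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's exact method): a stochastic-gradient
# HMC-style sampler whose momentum uses a generalized kinetic energy
# K(p) = |p|**a / a rather than the usual Gaussian K(p) = p**2 / 2.

def grad_log_post(theta, minibatch):
    """Stochastic gradient of a toy bimodal log-posterior,
    log p(theta) ~ -(theta**2 - 4)**2 / 4, with minibatch-style noise added."""
    full_grad = -theta * (theta ** 2 - 4)
    return full_grad + np.std(minibatch) * np.random.randn()

def grad_kinetic(p, a=1.5):
    """Gradient of the generalized (monomial) kinetic energy K(p) = |p|**a / a."""
    return np.sign(p) * np.abs(p) ** (a - 1)

def sg_hmc_generalized(n_samples=5000, step=0.05, friction=1.0, a=1.5):
    theta, p = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        minibatch = np.random.randn(32)  # stand-in for data subsampling
        # SGHMC-style momentum update: stochastic gradient, friction on dK/dp,
        # and injected Gaussian noise to compensate for the friction term.
        p += (step * grad_log_post(theta, minibatch)
              - step * friction * grad_kinetic(p, a)
              + np.sqrt(2 * friction * step) * np.random.randn())
        # Position moves along the gradient of the generalized kinetic energy.
        theta += step * grad_kinetic(p, a)
        samples.append(theta)
    return np.array(samples)

if __name__ == "__main__":
    draws = sg_hmc_generalized()
    print("sample mean:", draws.mean(), "(modes of the toy target sit near +/- 2)")
```

With `a = 2` the update reduces to a standard Gaussian-momentum SGHMC step; values of `a` below 2 flatten the momentum distribution's tails, which is one way a non-Gaussian kinetic function can help the sampler traverse between well-separated modes.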
Field | Value
---|---
Original language | English (US)
Title of host publication | 34th International Conference on Machine Learning, ICML 2017
Publisher | International Machine Learning Society (IMLS)
Pages | 6083-6092
Number of pages | 10
ISBN (Print) | 9781510855144
State | Published - Jan 1 2017
Externally published | Yes