Abstract
The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs based on random feature expansions that we train using stochastic variational inference. This yields a practical learning framework which significantly advances the state-of-the-art in inference for DGPs, and enables accurate quantification of uncertainty. We extensively showcase the scalability and performance of our proposal on several datasets with up to 8 million observations, and various DGP architectures with up to 30 hidden layers.
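The random feature expansions the abstract refers to approximate each GP layer's kernel by an explicit finite feature map, turning the layer into a Bayesian linear model that scales to large datasets. As a minimal illustrative sketch (not the authors' code; function names, defaults, and the RBF choice are assumptions), here are random Fourier features for an RBF kernel, the standard building block for such expansions:

```python
import numpy as np

def rff_features(X, n_features, lengthscale=1.0, variance=1.0, rng=None):
    """Map inputs X of shape (n, d) to random Fourier features whose inner
    products approximate an RBF kernel with the given hyperparameters."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Spectral frequencies drawn from the RBF kernel's spectral density
    Omega = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    proj = X @ Omega
    # Concatenated cos/sin features give an unbiased estimate of the kernel
    return np.sqrt(variance / n_features) * np.hstack([np.cos(proj), np.sin(proj)])

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Exact RBF (squared-exponential) kernel matrix, for comparison."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

# With enough features, Phi @ Phi.T converges to the exact kernel matrix.
X = np.random.default_rng(0).normal(size=(5, 3))
Phi = rff_features(X, n_features=5000, rng=1)
K_approx = Phi @ Phi.T
K_exact = rbf_kernel(X)
```

In a DGP built this way, each layer applies such a feature map followed by a weight matrix with a variational posterior, and the next layer consumes the result; stochastic variational inference then trains all layers jointly on minibatches.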
Original language | English (US) |
---|---|
Title of host publication | 34th International Conference on Machine Learning, ICML 2017 |
Publisher | International Machine Learning Society (IMLS) |
Pages | 1467-1482 |
Number of pages | 16 |
ISBN (Electronic) | 9781510855144 |
State | Published - 2017 |
Event | 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia |
Duration | Aug 6 2017 → Aug 11 2017 |
Publication series
Name | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Volume | 2 |
Conference
Conference | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Country/Territory | Australia |
City | Sydney |
Period | 8/6/17 → 8/11/17 |
Bibliographical note
Publisher Copyright: © 2017 by the author(s).
ASJC Scopus subject areas
- Computational Theory and Mathematics
- Human-Computer Interaction
- Software