Abstract
We introduce a novel generative formulation of deep probabilistic models implementing "soft" constraints on their function dynamics. In particular, we develop a flexible methodological framework where the modeled functions and derivatives of a given order are subject to inequality or equality constraints. We then characterize the posterior distribution over model and constraint parameters through stochastic variational inference. As a result, the proposed approach allows for accurate and scalable uncertainty quantification on the predictions and on all parameters. We demonstrate the application of equality constraints in the challenging problem of parameter inference in ordinary differential equation models, while we showcase the application of inequality constraints on the problem of monotonic regression of count data. The proposed approach is extensively tested in several experimental settings, leading to highly competitive results in challenging modeling applications, while offering high expressiveness, flexibility and scalability.
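To make the soft-constraint idea concrete, the sketch below fits a monotonic regression of count data — the inequality-constraint use case named in the abstract — but deliberately simplifies: instead of the paper's deep probabilistic model with stochastic variational inference, it uses a point-estimate polynomial Poisson regression where the constraint f'(x) ≥ 0 enters as a hinge penalty at collocation points. All names (`basis`, `loss_and_grad`, the penalty weight `mu`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic count data with a monotonically increasing rate.
x = np.linspace(0.0, 1.0, 60)
y = rng.poisson(np.exp(1.0 + 2.0 * x))

def basis(t):
    """Cubic polynomial features phi(t) and their derivatives phi'(t)."""
    phi = np.stack([np.ones_like(t), t, t**2, t**3], axis=1)
    dphi = np.stack([np.zeros_like(t), np.ones_like(t), 2 * t, 3 * t**2], axis=1)
    return phi, dphi

phi, _ = basis(x)
xc = np.linspace(0.0, 1.0, 25)   # collocation points where f' >= 0 is enforced softly
_, dphi_c = basis(xc)

def loss_and_grad(w, mu=50.0):
    """Poisson NLL (log link) plus a hinge penalty on negative derivatives."""
    f = phi @ w
    lam = np.exp(f)                      # Poisson rate
    loss = np.sum(lam - y * f)           # negative log-likelihood, up to a constant
    grad = phi.T @ (lam - y)
    # Soft inequality constraint: penalize f'(xc) < 0 instead of enforcing it hard.
    df = dphi_c @ w
    viol = df < 0
    loss += mu * np.sum(-df[viol])
    grad += mu * (-dphi_c[viol]).sum(axis=0)
    return loss, grad

# Plain gradient descent on the penalized objective (a MAP-style stand-in
# for the variational inference used in the paper).
w = np.zeros(4)
for _ in range(5000):
    _, g = loss_and_grad(w)
    w -= 1e-4 * g
```

Because the constraint is soft, small derivative violations remain possible; raising `mu` trades likelihood fit for stricter monotonicity, which mirrors how a "soft" constraint differs from a hard projection onto monotone functions.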
| Original language | English (US) |
| --- | --- |
| Title of host publication | 35th International Conference on Machine Learning, ICML 2018 |
| Editors | Jennifer Dy, Andreas Krause |
| Publisher | International Machine Learning Society (IMLS) |
| Pages | 5089-5101 |
| Number of pages | 13 |
| ISBN (Electronic) | 9781510867963 |
| State | Published - 2018 |
| Event | 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden. Duration: Jul 10 2018 → Jul 15 2018 |
Publication series
| Name | 35th International Conference on Machine Learning, ICML 2018 |
| --- | --- |
| Volume | 7 |
Conference
| Conference | 35th International Conference on Machine Learning, ICML 2018 |
| --- | --- |
| Country/Territory | Sweden |
| City | Stockholm |
| Period | 07/10/18 → 07/15/18 |
Bibliographical note
Publisher Copyright: © 2018 by the author(s).
ASJC Scopus subject areas
- Computational Theory and Mathematics
- Human-Computer Interaction
- Software