Abstract
To assess the difference between real and synthetic data, Generative Adversarial Networks (GANs) are trained using a distribution discrepancy measure. Three widely employed measures are information-theoretic divergences, integral probability metrics, and Hilbert space discrepancy metrics. We elucidate the theoretical connections between these three popular GAN training criteria and propose a novel procedure, called χ²-GAN, that is conceptually simple, stable in training, and resistant to mode collapse. Our procedure naturally generalizes to address the problem of simultaneous matching of multiple distributions. Further, we propose a resampling strategy that significantly improves sample quality, by repurposing the trained critic function via an importance weighting mechanism. Experiments show that the proposed procedure improves stability and convergence, and yields state-of-the-art results on a wide range of generative modeling tasks.
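The importance-weighting resampling idea from the abstract can be sketched as follows: generator samples are redrawn with probability proportional to an exponentiated critic score, so outputs the critic rates as more "real" are kept more often. This is an illustrative sketch only; the function name, the choice of `exp` as the weighting, and the toy critic are assumptions, not the paper's exact formulation.

```python
import numpy as np

def importance_resample(samples, critic_scores, n_keep, seed=None):
    """Resample generator outputs with probability proportional to
    exp(critic score). Illustrative sketch; the paper's exact
    weighting mechanism may differ."""
    rng = np.random.default_rng(seed)
    # Subtract the max score for numerical stability before exponentiating.
    logits = critic_scores - critic_scores.max()
    weights = np.exp(logits)
    probs = weights / weights.sum()
    idx = rng.choice(len(samples), size=n_keep, replace=True, p=probs)
    return samples[idx]

# Toy usage: scalar "samples" with a hypothetical critic that
# assigns higher scores to values near zero.
rng = np.random.default_rng(0)
samples = rng.normal(size=1000)
scores = -samples**2  # stand-in critic: higher = rated more "real"
kept = importance_resample(samples, scores, n_keep=500, seed=1)
# The resampled set concentrates where the critic score is high.
```

The softmax-style normalization turns raw critic scores into a valid sampling distribution; any monotone positive transform of the scores would serve the same illustrative purpose.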
Original language | English (US) |
---|---|
Title of host publication | 35th International Conference on Machine Learning, ICML 2018 |
Editors | Andreas Krause, Jennifer Dy |
Publisher | International Machine Learning Society (IMLS) |
Pages | 7787-7796 |
Number of pages | 10 |
ISBN (Electronic) | 9781510867963 |
State | Published - 2018 |
Externally published | Yes |
Event | 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden |
Duration | Jul 10 2018 → Jul 15 2018 |
Publication series
Name | 35th International Conference on Machine Learning, ICML 2018 |
---|---|
Volume | 11 |
Conference
Conference | 35th International Conference on Machine Learning, ICML 2018 |
---|---|
Country/Territory | Sweden |
City | Stockholm |
Period | 07/10/18 → 07/15/18 |
Bibliographical note
Publisher Copyright: © 2018 by the Authors. All rights reserved.
ASJC Scopus subject areas
- Computational Theory and Mathematics
- Human-Computer Interaction
- Software