Straight-Through Estimator as Projected Wasserstein Gradient Flow

Pengyu Cheng, Chang Liu, Chunyuan Li, Dinghan Shen, Ricardo Henao, Lawrence Carin

Research output: Contribution to journal › Article › peer-review



The Straight-Through (ST) estimator is a widely used technique for back-propagating gradients through discrete random variables. However, this effective method lacks theoretical justification. In this paper, we show that ST can be interpreted as the simulation of the projected Wasserstein gradient flow (pWGF). Based on this understanding, a theoretical foundation is established to justify the convergence properties of ST. Further, another pWGF estimator variant is proposed, which exhibits superior performance on distributions with infinite support, e.g., Poisson distributions. Empirically, we show that ST and our proposed estimator, when applied to different types of discrete structures (including both Bernoulli and Poisson latent variables), exhibit comparable or even better performance relative to other state-of-the-art methods. Our results uncover the origin of the widespread adoption of the ST estimator and represent a helpful step towards exploring alternative gradient estimators for discrete variables.
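To make the technique under discussion concrete, the following is a minimal sketch of the Straight-Through trick for a single Bernoulli latent variable: the forward pass draws a hard 0/1 sample, while the backward pass ignores the non-differentiable threshold and propagates the gradient as if the sample were the underlying sigmoid probability. All names and the toy loss are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def st_sample(logit, noise):
    """Forward pass: draw a hard 0/1 sample from Bernoulli(sigmoid(logit)).

    The thresholding step is non-differentiable.
    """
    p = 1.0 / (1.0 + np.exp(-logit))       # Bernoulli probability
    hard = float(noise < p)                # hard, non-differentiable sample
    return p, hard

def st_grad(dL_dhard, p):
    """Backward pass (Straight-Through): treat the hard sample as if it were
    the probability p, so dL/dlogit ~= dL/dhard * dp/dlogit,
    with dp/dlogit = p * (1 - p) for the sigmoid.
    """
    return dL_dhard * p * (1.0 - p)

logit = 0.3
p, hard = st_sample(logit, rng.uniform())
# Toy downstream loss L = (hard - 1)^2, so dL/dhard = 2 * (hard - 1).
grad_logit = st_grad(2.0 * (hard - 1.0), p)
```

In autodiff frameworks the same idea is usually written by substituting the identity (or sigmoid) Jacobian for the threshold in the backward pass; the paper's contribution is showing that this heuristic substitution simulates a projected Wasserstein gradient flow.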
Original language: English (US)
Journal: arXiv preprint
State: Published - Oct 5, 2019
Externally published: Yes

Bibliographical note

Accepted at the NeurIPS 2018 Bayesian Deep Learning Workshop


  • cs.LG
  • stat.ML


