Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence between mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte Carlo stochastic integration, or approximated or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback-Leibler divergence and the α-divergences of mixtures. We illustrate the versatility of the method by reporting experiments on approximating the Kullback-Leibler divergence and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
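To make the Monte Carlo baseline mentioned above concrete (this is the costly estimator the closed-form bounds are designed to avoid, not the paper's bounding method itself), here is a minimal Python sketch of stochastic integration of the Kullback-Leibler divergence between two univariate Gaussian mixtures. All mixture parameters, function names and the sample size are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def mixture_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    return np.sum([w * stats.norm.pdf(x, mu, s)
                   for w, mu, s in zip(weights, means, stds)], axis=0)

def sample_mixture(n, weights, means, stds, rng):
    """Draw n samples: pick a component by weight, then sample from it."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[idx], np.asarray(stds)[idx])

rng = np.random.default_rng(0)
# Illustrative two-component mixtures m and m'
w1, mu1, s1 = [0.4, 0.6], [-1.0, 2.0], [0.5, 1.0]
w2, mu2, s2 = [0.7, 0.3], [0.0, 3.0], [1.0, 0.8]

x = sample_mixture(100_000, w1, mu1, s1, rng)
# KL(m || m') ~= (1/n) * sum_i log(m(x_i) / m'(x_i)), with x_i ~ m
kl_mc = np.mean(np.log(mixture_pdf(x, w1, mu1, s1) /
                       mixture_pdf(x, w2, mu2, s2)))
print(f"Monte Carlo KL estimate: {kl_mc:.4f}")
```

The estimate converges at the usual O(1/sqrt(n)) Monte Carlo rate, which is why deterministic closed-form lower and upper bounds of the kind proposed here are attractive in practice.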
Acknowledgements: The authors gratefully thank the referees for their comments. This work was carried out while Ke Sun was visiting Frank Nielsen at Ecole Polytechnique, Palaiseau, France.