Estimating Total Correlation with Mutual Information Bounds

Pengyu Cheng, Weituo Hao, Lawrence Carin

Research output: Contribution to journal › Article › peer-review


Abstract

Total correlation (TC) is a fundamental concept in information theory that measures the statistical dependency among multiple random variables. Recently, TC has proven effective as a regularizer in many machine learning tasks where the correlation among random variables must be minimized or maximized. However, obtaining precise TC values is challenging, especially when the closed-form distributions of the variables are unknown. In this paper, we introduce several sample-based variational TC estimators. Specifically, we connect TC with mutual information (MI) and construct two calculation paths that decompose TC into MI terms. In our experiments, we estimate the true TC values with the proposed estimators in different simulation scenarios and analyze the properties of the TC estimators.
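For context, a minimal sketch of how TC can decompose into MI terms along two calculation paths, as the abstract describes. These are standard information-theoretic identities; the notation and the path names below are assumed for illustration, not quoted from the paper:

\mathrm{TC}(x_{1:n}) \;=\; \sum_{i=2}^{n} I\bigl(x_i \,;\, x_{1:i-1}\bigr)  % sequential ("line-like") path

\mathrm{TC}(x_{1:n}) \;=\; I\bigl(x_{1:k} \,;\, x_{k+1:n}\bigr) + \mathrm{TC}(x_{1:k}) + \mathrm{TC}(x_{k+1:n}), \quad 1 \le k < n  % recursive split ("tree-like") path

Replacing each MI term with a sample-based variational MI bound then yields a sample-based TC estimator; using MI lower (or upper) bounds throughout gives a TC estimate biased in the corresponding direction.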
Original language: English (US)
Journal: arXiv preprint
State: Published - Nov 9 2020
Externally published: Yes

Bibliographical note

Accepted by NeurIPS 2020 Workshop DL-IG

Keywords

  • cs.IT
  • math.IT

