Entropic causality and greedy minimum entropy coupling

Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath, Babak Hassibi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Scopus citations

Abstract

We study the problem of identifying the causal relationship between two discrete random variables from observational data. We recently proposed a novel framework called entropic causality that works in a very general functional model but makes the assumption that the unobserved exogenous variable has small entropy in the true causal direction. This framework requires the solution of a minimum entropy coupling problem: Given marginal distributions of m discrete random variables, each on n states, find the joint distribution with minimum entropy that respects the given marginals. This corresponds to minimizing a concave function of n^m variables over a convex polytope defined by nm linear constraints, called a transportation polytope. Unfortunately, it was recently shown that this minimum entropy coupling problem is NP-hard, even for 2 variables with n states. Even representing points (joint distributions) over this space can require exponential complexity (in n, m) if done naively. In our recent work we introduced an efficient greedy algorithm to find an approximate solution for this problem. In this paper we analyze this algorithm and establish two results: our algorithm always finds a local minimum, and it is within an additive approximation error of the unknown global optimum.
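To make the coupling problem concrete, here is a minimal sketch of a greedy coupling heuristic in the spirit the abstract describes: repeatedly match the largest remaining mass in each marginal, assign their minimum as a joint probability, and subtract it from every marginal. The function name and implementation details below are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
import heapq

def greedy_min_entropy_coupling(marginals):
    """Greedy coupling sketch (illustrative, not the paper's exact pseudocode).

    marginals: list of m probability vectors, each over n states.
    Returns a sparse joint distribution as {state-tuple: probability},
    consistent with the given marginals.
    """
    # One max-heap per marginal, stored as (-probability, state index).
    heaps = [[(-p, i) for i, p in enumerate(m) if p > 0] for m in marginals]
    for h in heaps:
        heapq.heapify(h)

    joint = {}
    while all(heaps):
        # Pop the largest remaining mass from every marginal.
        tops = [heapq.heappop(h) for h in heaps]
        probs = [-neg_p for neg_p, _ in tops]
        states = tuple(i for _, i in tops)

        # Assign the minimum of these masses to this joint state.
        mass = min(probs)
        joint[states] = joint.get(states, 0.0) + mass

        # Return any leftover mass to the corresponding heaps.
        for h, p, (_, i) in zip(heaps, probs, tops):
            remainder = p - mass
            if remainder > 1e-12:  # tolerance for float round-off
                heapq.heappush(h, (-remainder, i))
    return joint
```

Each iteration zeroes out at least one marginal entry, so the loop runs at most m·n times and produces a joint distribution with at most m·n nonzero atoms, avoiding the naive exponential n^m representation.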
Original language: English (US)
Title of host publication: 2017 IEEE International Symposium on Information Theory (ISIT)
Publisher: IEEE
Pages: 1465-1469
Number of pages: 5
ISBN (Print): 9781509040964
DOIs
State: Published - Aug 15 2017
Externally published: Yes
