A Bregman Matrix and the Gradient of Mutual Information for Vector Poisson and Gaussian Channels

Liming Wang, David Edwin Carlson, Miguel R.D. Rodrigues, Robert Calderbank, Lawrence Carin

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

A generalization of Bregman divergence is developed and utilized to unify vector Poisson and Gaussian channel models from the perspective of the gradient of mutual information. The gradient is taken with respect to the measurement matrix in a compressive-sensing setting, and mutual information is considered for both signal recovery and classification. Existing gradient-of-mutual-information results for scalar Poisson models are recovered as special cases, as are known results for the vector Gaussian model. The Bregman-divergence generalization yields a Bregman matrix, and this matrix induces numerous matrix-valued metrics. The metrics associated with the Bregman matrix are detailed, as are its other properties. The Bregman matrix is also utilized to connect the relative entropy and the mismatched minimum mean squared error. Two applications are considered: 1) compressive sensing with a Poisson measurement model and 2) compressive topic modeling for analysis of a document corpus (word-count data). In both settings, the developed theory is used to optimize the compressive measurement matrix for signal recovery and classification.
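For context, the following is a minimal numerical sketch, not taken from the paper, of the known vector Gaussian special case that the abstract says is recovered: for Y = ΦX + N with Gaussian X and N, the gradient of I(X;Y) with respect to the measurement matrix Φ equals Σ_N⁻¹ΦE, where E is the MMSE matrix (the Palomar-Verdú result). All dimensions, covariances, and variable names below are illustrative assumptions.

```python
# Hedged sketch: verify, for a Gaussian signal and Gaussian noise, that
# grad_Phi I(X; Phi X + N) = Sigma_N^{-1} Phi E, with E the MMSE matrix.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                         # assumed signal / measurement dimensions
Phi = rng.standard_normal((m, n))   # measurement matrix
A = rng.standard_normal((n, n))
Sigma_X = A @ A.T + np.eye(n)       # signal covariance (positive definite)
Sigma_N = 0.5 * np.eye(m)           # noise covariance

def mutual_info(Phi):
    """I(X; Phi X + N) in nats for jointly Gaussian X and N."""
    S = Phi @ Sigma_X @ Phi.T + Sigma_N
    return 0.5 * (np.linalg.slogdet(S)[1] - np.linalg.slogdet(Sigma_N)[1])

# MMSE matrix for the Gaussian model: E = Sigma_X - Sigma_X Phi^T S^{-1} Phi Sigma_X
S = Phi @ Sigma_X @ Phi.T + Sigma_N
E = Sigma_X - Sigma_X @ Phi.T @ np.linalg.solve(S, Phi @ Sigma_X)

# Analytic gradient from the gradient-of-mutual-information result
grad_analytic = np.linalg.solve(Sigma_N, Phi @ E)    # Sigma_N^{-1} Phi E

# Finite-difference check of the same gradient
grad_fd = np.zeros_like(Phi)
eps = 1e-6
for i in range(m):
    for j in range(n):
        P, M = Phi.copy(), Phi.copy()
        P[i, j] += eps
        M[i, j] -= eps
        grad_fd[i, j] = (mutual_info(P) - mutual_info(M)) / (2 * eps)

print(np.max(np.abs(grad_analytic - grad_fd)))       # small (~1e-8): formulas agree
```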
Original language: English (US)
Pages (from-to): 2611-2629
Number of pages: 19
Journal: IEEE Transactions on Information Theory
Volume: 60
Issue number: 5
DOIs
State: Published - Jan 1 2014
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2021-02-09

