Abstract
By developing data augmentation methods unique to the negative binomial (NB) distribution, we unite seemingly disjoint count and mixture models under the NB process framework. We develop fundamental properties of the models and derive efficient Gibbs sampling inference. We show that the gamma-NB process can be reduced to the hierarchical Dirichlet process with normalization, highlighting its unique theoretical, structural and computational advantages. A variety of NB processes with distinct sharing mechanisms are constructed and applied to topic modeling, with connections to existing algorithms, showing the importance of inferring both the NB dispersion and probability parameters.
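To make the augmentation scheme the abstract alludes to concrete, below is a minimal Gibbs-sampling sketch for a single shared NB(r, p) likelihood. It assumes (for illustration only) a Gamma(a0, 1/b0) prior on the dispersion r and a Beta(c0, d0) prior on the probability p; the hyperparameter names are hypothetical, and this is not the paper's full hierarchical model. The key step is augmenting each count with a Chinese restaurant table (CRT) draw, which renders r conditionally conjugate and yields closed-form updates for both parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_crt(m, r):
    # l ~ CRT(m, r): the sum of m Bernoulli draws with success
    # probabilities r/(r+i-1), i = 1..m (the "table count" augmentation).
    if m == 0:
        return 0
    return int((rng.random(m) < r / (r + np.arange(m))).sum())

def gibbs_nb(counts, n_iter=1000, a0=0.01, b0=0.01, c0=1.0, d0=1.0):
    # Gibbs sampler for counts m_j ~ NB(r, p) under the assumed priors
    # r ~ Gamma(a0, 1/b0) and p ~ Beta(c0, d0); returns the last draw.
    counts = np.asarray(counts)
    n, r, p = len(counts), 1.0, 0.5
    for _ in range(n_iter):
        # Augment each count with its CRT table count: the augmented
        # likelihood for r is Poisson, hence conjugate to the gamma prior.
        l = sum(sample_crt(m, r) for m in counts)
        r = rng.gamma(a0 + l, 1.0 / (b0 - n * np.log(1.0 - p)))
        # p is conditionally conjugate under the beta prior.
        p = rng.beta(c0 + counts.sum(), d0 + n * r)
    return r, p

# Synthetic check: numpy parameterizes the NB by the *failure* probability,
# so NB(r=3, p=0.4) in the convention above is negative_binomial(3, 0.6).
data = rng.negative_binomial(3, 0.6, size=500)
print(gibbs_nb(data))
```

Because both conditionals are standard distributions after augmentation, each sweep is a pair of closed-form draws, which is what makes jointly inferring the NB dispersion and probability parameters tractable.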
Field | Value
---|---
Original language | English (US)
Title of host publication | Advances in Neural Information Processing Systems
Pages | 2546-2554
Number of pages | 9
State | Published - Dec 1 2012
Externally published | Yes