Abstract
A logistic regression classification algorithm is developed for problems in which the feature vectors may have missing data (features). Single or multiple imputation of the missing data is avoided by performing analytic integration with an estimated conditional density function (conditioned on the non-missing data). Conditional density functions are estimated using a Gaussian mixture model (GMM), with parameter estimation performed using both expectation maximization (EM) and variational Bayesian EM (VB-EM). Using widely available real data, we demonstrate the general advantage of VB-EM GMM estimation over the EM algorithm for handling incomplete data. Moreover, it is demonstrated that the approach proposed here is generally superior to standard imputation procedures.
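As a rough illustration of the idea, the sketch below shows how a fitted GMM yields a conditional density over the missing features given the observed ones, and how a logistic regression output can be averaged over that conditional. It is a minimal sketch only: it assumes a full-covariance `GaussianMixture` from scikit-learn, and it approximates the expectation by Monte Carlo sampling from the conditional GMM rather than the analytic integration used in the paper. The function names (`conditional_gmm_params`, `predict_proba_missing`) and the weight/bias parameters `w`, `b` are illustrative, not from the original work.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture


def conditional_gmm_params(gmm, x, obs_idx, mis_idx):
    """Per-component conditional Gaussians p(x_mis | x_obs) for a fitted full-covariance GMM."""
    weights, means, covs = [], [], []
    for k in range(gmm.n_components):
        mu, S = gmm.means_[k], gmm.covariances_[k]
        S_oo = S[np.ix_(obs_idx, obs_idx)]
        S_mo = S[np.ix_(mis_idx, obs_idx)]
        S_mm = S[np.ix_(mis_idx, mis_idx)]
        S_oo_inv = np.linalg.inv(S_oo)
        # Standard Gaussian conditioning formulas per mixture component
        cond_mu = mu[mis_idx] + S_mo @ S_oo_inv @ (x[obs_idx] - mu[obs_idx])
        cond_S = S_mm - S_mo @ S_oo_inv @ S_mo.T
        # Component responsibility given only the observed coordinates
        w_k = gmm.weights_[k] * multivariate_normal.pdf(x[obs_idx], mu[obs_idx], S_oo)
        weights.append(w_k)
        means.append(cond_mu)
        covs.append(cond_S)
    weights = np.asarray(weights)
    return weights / weights.sum(), means, covs


def predict_proba_missing(w, b, gmm, x, n_samples=500, seed=0):
    """Approximate E[sigmoid(w^T x + b)] over p(x_mis | x_obs) by sampling (stand-in for analytic integration)."""
    rng = np.random.default_rng(seed)
    mis_idx = np.where(np.isnan(x))[0]
    obs_idx = np.where(~np.isnan(x))[0]
    if mis_idx.size == 0:
        return 1.0 / (1.0 + np.exp(-(w @ x + b)))
    pis, mus, Ss = conditional_gmm_params(gmm, x, obs_idx, mis_idx)
    probs = []
    for k in rng.choice(len(pis), size=n_samples, p=pis):
        x_full = x.copy()
        x_full[mis_idx] = rng.multivariate_normal(mus[k], Ss[k])
        probs.append(1.0 / (1.0 + np.exp(-(w @ x_full + b))))
    return float(np.mean(probs))


# Toy usage with synthetic data (hypothetical): fit the GMM on complete rows,
# then classify a test point with one missing feature.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 3))
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X_train)
w, b = np.array([0.8, -0.5, 0.3]), 0.1          # illustrative logistic regression parameters
x_test = np.array([0.4, np.nan, -1.2])           # second feature is missing
print(predict_proba_missing(w, b, gmm, x_test))
```

The paper replaces the sampling step with closed-form integration of the logistic output against the conditional GMM, and estimates the GMM itself with EM or VB-EM directly from incomplete training data; the sketch above only conveys the conditioning-and-averaging structure.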
| Original language | English (US) |
|---|---|
| Title of host publication | ICML 2005 - Proceedings of the 22nd International Conference on Machine Learning |
| Pages | 977-984 |
| Number of pages | 8 |
| State | Published - Dec 1 2005 |
| Externally published | Yes |