Abstract
Learning multiple tasks across heterogeneous domains is challenging because the feature spaces of different tasks may differ. We assume that the data in all tasks are generated from a common latent domain via sparse domain transforms and propose a latent probit model (LPM) to jointly learn these transforms and a probit classifier shared in the common domain. To learn meaningful task relatedness and avoid overfitting in classification, we impose sparsity on the domain transform matrices as well as on the common classifier parameters. We derive theoretical bounds on the estimation error of the classifier parameters in terms of the sparsity of the domain transform matrices. An expectation-maximization (EM) algorithm is derived for learning the LPM. The effectiveness of the approach is demonstrated on several real datasets.
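
The abstract does not spell out the model equations. Purely as an illustration, and under assumed notation that is not taken from the paper (latent features `Z`, per-task sparse transforms `A_m`, shared probit weights `beta`, and all dimensions chosen arbitrarily), the following Python sketch simulates the kind of generative setup described: heterogeneous task features produced from a common latent domain through sparse transforms, with labels coming from a single probit classifier in that latent domain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): a latent common domain of
# dimension d, and two tasks with heterogeneous feature dimensions in p.
d, p = 10, [15, 25]
n = [100, 120]        # samples per task
sparsity = 0.2        # fraction of nonzero entries in each transform

# Shared probit classifier in the latent common domain, assumed sparse.
beta = np.zeros(d)
beta[rng.choice(d, size=3, replace=False)] = rng.normal(size=3)

tasks = []
for p_m, n_m in zip(p, n):
    # Sparse domain transform A_m mapping the latent domain to task m's
    # observed feature space (the symbol A_m is an assumption, not the paper's).
    A_m = rng.normal(size=(p_m, d)) * (rng.random((p_m, d)) < sparsity)
    Z = rng.normal(size=(n_m, d))                       # latent representations
    X = Z @ A_m.T + 0.1 * rng.normal(size=(n_m, p_m))   # observed task features
    # Probit link: labels depend only on the shared latent classifier.
    y = (Z @ beta + rng.normal(size=n_m) > 0).astype(int)
    tasks.append((X, y))

for m, (X, y) in enumerate(tasks):
    print(f"task {m}: X shape {X.shape}, positive rate {y.mean():.2f}")
```

In the paper's setting only the task features and labels would be observed; the transforms, latent representations, and classifier would be estimated jointly (there via an EM algorithm), which this sketch does not attempt.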
| Field | Value |
| --- | --- |
| Original language | English (US) |
| Title of host publication | Proceedings of the 29th International Conference on Machine Learning, ICML 2012 |
| Pages | 1463-1470 |
| Number of pages | 8 |
| State | Published - Oct 10 2012 |
| Externally published | Yes |