Abstract
We study online algorithms for selective sampling that use regularized least squares (RLS) as the base classifier. These algorithms typically perform well in practice, and some of them have formal guarantees on their mistake and query rates. We refine and extend these guarantees in various ways, proposing algorithmic variants that exhibit better empirical behavior while enjoying performance guarantees under much more general conditions. We also show a simple way of coupling a generic gradient-based classifier with a specific RLS-based selective sampler, obtaining hybrid algorithms with combined performance guarantees.
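As a rough illustration of the setting the abstract describes, the sketch below shows a generic online RLS learner with a margin-based query rule: it predicts on every instance but asks for the true label only when the RLS margin is small relative to the instance's uncertainty under the current correlation matrix. The class name `RLSSelectiveSampler`, the parameter `kappa`, and the specific query condition are illustrative assumptions, not the paper's exact algorithms.

```python
# Minimal sketch of RLS-based selective sampling (assumed design, not the
# paper's algorithm): predict with regularized least squares, query the label
# only when the squared margin is small relative to x^T A^{-1} x.
import numpy as np


class RLSSelectiveSampler:
    def __init__(self, dim, reg=1.0, kappa=1.0):
        self.A = reg * np.eye(dim)   # regularized correlation matrix of queried instances
        self.b = np.zeros(dim)       # sum of label-weighted queried instances
        self.kappa = kappa           # query-aggressiveness parameter (assumed)

    def predict(self, x):
        w = np.linalg.solve(self.A, self.b)      # RLS weight vector
        margin = float(w @ x)
        yhat = np.sign(margin) if margin != 0 else 1.0
        return yhat, margin

    def step(self, x, label_oracle):
        """Process one instance; query the oracle only when uncertain."""
        yhat, margin = self.predict(x)
        novelty = float(x @ np.linalg.solve(self.A, x))   # x^T A^{-1} x
        if margin ** 2 <= self.kappa * novelty:           # small-margin query rule
            y = label_oracle(x)                           # ask for the true label
            self.A += np.outer(x, x)                      # update RLS only on queried points
            self.b += y * x
            return yhat, True
        return yhat, False


if __name__ == "__main__":
    # Toy usage: linearly separable stream; count queries and mistakes.
    rng = np.random.default_rng(0)
    true_w = rng.normal(size=5)
    sampler = RLSSelectiveSampler(dim=5, reg=1.0, kappa=1.0)
    queries, mistakes = 0, 0
    for _ in range(1000):
        x = rng.normal(size=5)
        y = np.sign(true_w @ x)
        yhat, queried = sampler.step(x, lambda x_: np.sign(true_w @ x_))
        queries += queried
        mistakes += (yhat != y)
    print(f"queries: {queries}, mistakes: {mistakes}")
```

The hybrid algorithms mentioned in the abstract couple a gradient-based classifier with an RLS-based sampler of this general flavor; the sketch only conveys the selective-sampling mechanism, not that coupling.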
Original language | English (US)
---|---
Title of host publication | Proceedings of the 28th International Conference on Machine Learning, ICML 2011
Pages | 433-440
Number of pages | 8
State | Published - Oct 7 2011
Externally published | Yes