Robust bounds for classification via selective sampling

Nicolò Cesa-Bianchi, Claudio Gentile, Francesco Orabona

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

We introduce a new algorithm for binary classification in the selective sampling protocol. Our algorithm uses Regularized Least Squares (RLS) as base classifier, and for this reason it can be efficiently run in any RKHS. Unlike previous margin-based semisupervised algorithms, our sampling condition hinges on a simultaneous upper bound on bias and variance of the RLS estimate under a simple linear label noise model. This fact allows us to prove performance bounds that hold for an arbitrary sequence of instances. In particular, we show that our sampling strategy approximates the margin of the Bayes optimal classifier to any desired accuracy ε by asking Õ(d/ε²) queries (in the RKHS case d is replaced by a suitable spectral quantity). While these are the standard rates in the fully supervised i.i.d. case, the best previously known result in our harder setting was Õ(d³/ε⁴). Preliminary experiments show that some of our algorithms also exhibit a good practical performance. Copyright 2009.
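The abstract describes the general recipe: maintain an RLS estimate over the queried examples and decide whether to ask for a label by comparing the margin of the estimate against a measure of its uncertainty. The Python sketch below illustrates that style of selective sampling loop under stated assumptions; it is not the paper's exact algorithm. The query condition (squared margin compared to a variance-style quantity times a log factor), the function name selective_sampling_rls, and the synthetic data generator are illustrative choices made here for concreteness.

```python
import numpy as np

def selective_sampling_rls(stream, dim, reg=1.0):
    """Minimal sketch of margin-based selective sampling with an RLS
    base classifier. The query rule below is an illustrative stand-in,
    not the exact condition from the paper.

    `stream` yields (x, y) pairs with x in R^dim and y in {-1, +1};
    the label y is only used when the algorithm decides to query.
    """
    A_inv = np.eye(dim) / reg        # inverse of the regularized correlation matrix
    b = np.zeros(dim)                # sum of y_s * x_s over queried rounds
    queries, mistakes = 0, 0

    for t, (x, y) in enumerate(stream, start=1):
        w = A_inv @ b                # current RLS weight vector
        margin = w @ x               # margin estimate on the new instance
        r = x @ A_inv @ x            # variance proxy of the RLS estimate at x

        y_hat = 1 if margin >= 0 else -1
        mistakes += int(y_hat != y)  # counted here only for reporting

        # Hypothetical query condition: ask for the label when the margin
        # is small relative to the uncertainty of the current estimate.
        if margin ** 2 <= r * np.log(t + 1):
            queries += 1
            # Sherman-Morrison rank-one update of A_inv with x x^T
            Ax = A_inv @ x
            A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)
            b += y * x

    return queries, mistakes


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, T = 10, 2000
    u = rng.normal(size=d); u /= np.linalg.norm(u)   # unknown target direction
    xs = rng.normal(size=(T, d))
    xs /= np.linalg.norm(xs, axis=1, keepdims=True)
    ys = np.where(xs @ u >= 0, 1, -1)                # noiseless linear labels
    q, m = selective_sampling_rls(zip(xs, ys), dim=d)
    print(f"queries: {q}/{T}, mistakes: {m}")
```

A kernelized variant would follow the same pattern, replacing the explicit weight vector with coefficients over the queried instances; the loop structure and the margin-versus-uncertainty comparison stay the same.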
Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series
DOIs
State: Published - Sep 15 2009
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2023-09-25
