Black-Box Reductions for Parameter-free Online Learning in Banach Spaces

Ashok Cutkosky, Francesco Orabona

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

53 Scopus citations

Abstract

We introduce several new black-box reductions that significantly improve the design of adaptive and parameter-free online learning algorithms by simplifying analysis, improving regret guarantees, and sometimes even improving runtime. We reduce parameter-free online learning to online exp-concave optimization, we reduce optimization in a Banach space to one-dimensional optimization, and we reduce optimization over a constrained domain to unconstrained optimization. All of our reductions run as fast as online gradient descent. We use our new techniques to improve upon the previously best regret bounds for parameter-free learning, and do so for arbitrary norms.
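To make the abstract's dimension-reduction idea concrete, here is a minimal illustrative sketch (not the paper's exact algorithms): a high-dimensional iterate is decomposed as x_t = y_t · z_t, where the magnitude y_t comes from a one-dimensional parameter-free learner and the direction z_t from online gradient descent projected onto the unit ball. The specific magnitude learner (a simple coin-betting scheme) and the fixed OGD step size below are assumptions chosen for brevity.

```python
import numpy as np

class OneDimCoinBetting:
    """Illustrative parameter-free 1-D learner via coin betting.

    Assumes the scalar gradients fed to it lie in [-1, 1]; inputs are
    clipped to keep the wealth nonnegative.
    """
    def __init__(self, initial_wealth=1.0):
        self.wealth = initial_wealth
        self.coin_sum = 0.0  # running sum of coin outcomes c_t = -g_t
        self.t = 0

    def predict(self):
        self.t += 1
        beta = self.coin_sum / self.t  # betting fraction in [-1, 1]
        return beta * self.wealth

    def update(self, g, y):
        c = -max(min(g, 1.0), -1.0)    # clip, then flip sign: coin outcome
        self.wealth += c * y           # wealth evolves multiplicatively
        self.coin_sum += c

class UnitBallOGD:
    """Online gradient descent projected onto the unit ball."""
    def __init__(self, dim, lr=0.1):
        self.z = np.zeros(dim)
        self.lr = lr

    def predict(self):
        return self.z

    def update(self, grad):
        self.z -= self.lr * grad
        norm = np.linalg.norm(self.z)
        if norm > 1.0:
            self.z /= norm             # project back onto the unit ball

def run_reduction(grads):
    """Combine the two learners: play x_t = y_t * z_t each round."""
    dim = len(grads[0])
    magnitude = OneDimCoinBetting()
    direction = UnitBallOGD(dim)
    iterates = []
    for g in grads:
        y = magnitude.predict()
        z = direction.predict()
        iterates.append(y * z)
        # The 1-D learner sees the scalar surrogate gradient <g, z>;
        # the direction learner sees the full gradient.
        magnitude.update(float(np.dot(g, z)), y)
        direction.update(np.asarray(g, dtype=float))
    return iterates
```

The design choice the sketch illustrates: the regret of the combined algorithm splits into the 1-D learner's regret on the surrogate gradients plus the direction learner's regret on the unit ball, which is how the reduction keeps the per-round cost at the level of online gradient descent.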
Original language: English (US)
Title of host publication: Proceedings of Machine Learning Research
Publisher: ML Research Press
Pages: 1493-1529
Number of pages: 37
State: Published - Jan 1 2018
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2023-09-25

