Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems

Filip Hanzely, Dmitry Kovalev, Peter Richtarik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose an accelerated version of stochastic variance reduced coordinate descent – ASVRCD. Like other variance reduced coordinate descent methods such as SEGA or SVRCD, our method can deal with problems that include a non-separable and non-smooth regularizer, while accessing only a random block of partial derivatives in each iteration. However, ASVRCD incorporates Nesterov’s momentum, which offers superior iteration complexity guarantees over both SEGA and SVRCD. As a by-product of our theory, we show that a variant of Katyusha [1] is a specific case of ASVRCD, recovering the optimal oracle complexity for the finite-sum objective.
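
To make the abstract's ingredients concrete, below is a minimal, illustrative Python sketch, not the paper's algorithm verbatim. It combines an SVRCD-style unbiased gradient estimator, built from a single partial derivative and a control variate h that is occasionally refreshed to the full gradient, with a standard Nesterov-style three-sequence coupling. The toy quadratic objective and all parameter choices (eta, theta, rho) are assumptions made for demonstration; the paper's method additionally handles a non-separable, non-smooth regularizer through a proximal step, which is omitted here.

    import numpy as np

    # Toy smooth quadratic f(x) = 0.5 x^T A x - b^T x (illustrative, not from the paper).
    rng = np.random.default_rng(0)
    d = 30
    M = rng.standard_normal((d, d))
    A = M.T @ M / d + np.eye(d)            # symmetric positive definite
    b = rng.standard_normal(d)
    x_star = np.linalg.solve(A, b)         # exact minimizer, for reference
    L = np.linalg.eigvalsh(A).max()        # smoothness constant

    def full_grad(x):
        return A @ x - b                   # full gradient (computed only on refresh)

    def partial(x, i):
        return A[i] @ x - b[i]             # single partial derivative, (grad f(x))_i

    p = 1.0 / d        # uniform coordinate sampling probability (assumption)
    rho = 1.0 / d      # probability of refreshing the control variate (assumption)
    eta = p / (4 * L)  # conservative step size (illustrative, not the tuned value)
    theta = 0.2        # momentum coupling parameter (illustrative)

    y = np.zeros(d)
    z = y.copy()
    h = full_grad(y)   # control variate tracking the gradient

    for k in range(50_000):
        x = theta * z + (1 - theta) * y    # Nesterov extrapolation point
        i = rng.integers(d)
        g = h.copy()
        g[i] += (partial(x, i) - h[i]) / p  # unbiased: E[g] = grad f(x)
        y = x - eta * g                     # gradient-type step
        z = z - (eta / theta) * g           # momentum sequence update
        if rng.random() < rho:
            h = full_grad(x)                # occasional full refresh (SVRCD-style)

    print(f"||y - x*|| = {np.linalg.norm(y - x_star):.3e}")

Each iteration touches one partial derivative; the variance of the estimator g shrinks as both x and h approach the optimum, which is the variance reduction effect, while the y/z coupling supplies the acceleration.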
Original language: English (US)
Title of host publication: International Conference on Machine Learning (ICML)
Publisher: arXiv
State: Published - 2020

Bibliographical note

KAUST Repository Item: Exported on 2021-09-16
Acknowledgements: The authors would like to express their gratitude to Konstantin Mishchenko. In particular, Konstantin introduced us to the product space objective (13), and the idea that SAGA is a special case of SEGA was born during a discussion with him.
