We propose ASVRCD, an accelerated version of stochastic variance reduced coordinate descent. Like other variance reduced coordinate descent methods such as SEGA or SVRCD, our method can handle problems with a non-separable, non-smooth regularizer while accessing only a random block of partial derivatives in each iteration. Moreover, ASVRCD incorporates Nesterov's momentum, which yields better iteration complexity guarantees than both SEGA and SVRCD. As a by-product of our theory, we show that a variant of Katyusha is a special case of ASVRCD, recovering the optimal oracle complexity for the finite-sum objective.
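To illustrate the kind of estimator the abstract refers to, below is a minimal sketch of a SEGA-style variance-reduced coordinate gradient estimator on a small least-squares problem. This is not the paper's ASVRCD: the momentum coupling is omitted, and the problem, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (NOT the paper's ASVRCD): a SEGA-style unbiased coordinate
# gradient estimator on f(x) = 0.5 * ||Ax - b||^2. Each iteration queries a
# single random partial derivative, yet the estimator's variance vanishes
# as the method converges. Problem size and step size are assumptions.
rng = np.random.default_rng(0)
n, d = 40, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # reference solution

def grad_i(x, i):
    # i-th partial derivative of f at x
    return A[:, i] @ (A @ x - b)

x = np.zeros(d)                        # iterate
h = np.zeros(d)                        # running gradient table (SEGA state)
L = np.linalg.norm(A, 2) ** 2          # smoothness constant of f
step = 1.0 / (4 * d * L)               # conservative step size (assumed tuning)

for k in range(30000):
    i = rng.integers(d)                # one random partial derivative per step
    gi = grad_i(x, i)
    g = h.copy()
    g[i] += d * (gi - h[i])            # unbiased: E[g] = grad f(x)
    h[i] = gi                          # refresh the gradient table entry
    x = x - step * g

print(np.linalg.norm(x - x_star))      # distance to the least-squares solution
```

ASVRCD adds a Nesterov-style extrapolation step on top of such an estimator; the coupling of the momentum sequences is what delivers the accelerated rate claimed in the abstract.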
Original language: English (US)
Title of host publication: International Conference on Machine Learning (ICML)
State: Published - 2020
Bibliographical note: KAUST Repository Item: Exported on 2021-09-16
Acknowledgements: The authors would like to express their gratitude to Konstantin Mishchenko. In particular, Konstantin introduced us to the product space objective (13), and the idea that SAGA is a special case of SEGA arose during a discussion with him.