Unifying Framework for Accelerated Randomized Methods in Convex Optimization

Pavel Dvurechensky, Alexander Gasnikov, Alexander Tyurin, Vladimir Zholobov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value and partial or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and an accelerated random derivative-free method and, using our framework, provide their versions for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
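To illustrate the flavor of the methods the abstract lists, the sketch below implements a standard accelerated random (block-)coordinate descent in the spirit of Fercoq–Richtárik's APPROX scheme on an unconstrained quadratic. This is a generic textbook variant with an exact oracle and uniform coordinate sampling, not the paper's framework; the function name, the quadratic test objective, and the iteration budget are all illustrative choices.

```python
import numpy as np

def accelerated_rcd(A, b, iters=10000, seed=0):
    """Minimize f(x) = 0.5 * x^T A x - b^T x for symmetric positive
    definite A by accelerated random coordinate descent (APPROX-style,
    serial, uniform sampling). Coordinate-wise Lipschitz constants of
    the partial derivatives are L_i = A[i, i]."""
    rng = np.random.default_rng(seed)
    n = len(b)
    L = np.diag(A).copy()
    x = np.zeros(n)                # primal sequence
    z = np.zeros(n)                # momentum sequence
    theta = 1.0 / n                # theta_0 = 1/n for serial sampling
    for _ in range(iters):
        y = (1 - theta) * x + theta * z
        i = rng.integers(n)        # sample one coordinate uniformly
        g = A[i] @ y - b[i]        # i-th partial derivative at y
        dz = -g / (n * theta * L[i])
        x = y.copy()
        x[i] += n * theta * dz     # equals y_i - g / L[i]
        z = z.copy()
        z[i] += dz
        # update the acceleration parameter: theta_{k+1} solves
        # theta_{k+1}^2 = (1 - theta_{k+1}) * theta_k^2
        theta = (np.sqrt(theta**4 + 4 * theta**2) - theta**2) / 2
    return x
```

Each iteration touches a single coordinate of the gradient, which is what makes such methods attractive when full gradients are expensive; the paper's framework additionally handles inexact oracles, non-Euclidean (e.g., entropy) proximal setups, and derivative-free estimators of the partial derivatives.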
Original language: English (US)
Title of host publication: Foundations of Modern Statistics
Publisher: Springer International Publishing
Number of pages: 51
ISBN (Print): 9783031301131
State: Published - Jul 17 2023

Bibliographical note

KAUST Repository Item: Exported on 2023-09-05
Acknowledgements: The authors are very grateful to Yu. Nesterov and V. Spokoiny for fruitful discussions. Our interest in this field was initiated by the paper [48]. The research was supported by the Ministry of Science and Higher Education of the Russian Federation (Goszadaniye) No. 075-00337-20-03, project No. 0714-2020-0005.
