Abstract
In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value and the partial or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and accelerated random derivative-free methods and, using our framework, provide their versions for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
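To make the inexact-oracle setting concrete, the sketch below shows a plain (non-accelerated) random directional search driven only by noisy function values, i.e., the kind of zeroth-order information the derivative-free methods in the chapter work with. This is a minimal illustration, not the chapter's actual scheme: the noise model, the step size `step`, and the finite-difference parameter `h` are illustrative assumptions.

```python
import numpy as np

def inexact_value_oracle(f, noise_level, rng):
    """Zeroth-order oracle returning f(x) corrupted by bounded noise.

    Models inexact function-value information; the uniform-noise model
    here is just one illustrative choice.
    """
    def oracle(x):
        return f(x) + noise_level * rng.uniform(-1.0, 1.0)
    return oracle

def random_directional_search(oracle, x0, n_iters=1000, step=0.1, h=1e-3, seed=0):
    """Non-accelerated random directional search sketch.

    At each iteration, a random direction u on the unit sphere is drawn,
    the directional derivative is estimated by a finite difference of two
    inexact function values, and x is updated along -u.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)                       # uniform direction on the sphere
        g_hat = (oracle(x + h * u) - oracle(x)) / h  # inexact directional derivative
        x -= step * g_hat * u
    return x

# Example: minimize a smooth convex quadratic with a noisy value oracle.
f = lambda x: 0.5 * np.dot(x, x)
rng = np.random.default_rng(1)
x_approx = random_directional_search(inexact_value_oracle(f, 1e-6, rng), x0=np.ones(5))
```

Note how the finite-difference estimate `g_hat` inherits the oracle noise amplified by `1/h`; controlling this trade-off between discretization bias and noise amplification is exactly the kind of inexactness the framework's convergence rate theorems account for.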
| Original language | English (US) |
| --- | --- |
| Title of host publication | Foundations of Modern Statistics |
| Publisher | Springer International Publishing |
| Pages | 511-561 |
| Number of pages | 51 |
| ISBN (Print) | 9783031301131 |
| DOIs | |
| State | Published - Jul 17 2023 |
Bibliographical note
Acknowledgements: The authors are very grateful to Yu. Nesterov and V. Spokoiny for fruitful discussions. Our interest in this field was initiated by the paper [48]. The research was supported by the Ministry of Science and Higher Education of the Russian Federation (Goszadaniye) No. 075-00337-20-03, project No. 0714-2020-0005.