Abstract
Optimization problems under affine constraints appear in various areas of machine learning. We consider the task of minimizing a smooth strongly convex function F(x) under the affine constraint Kx = b, given an oracle that provides evaluations of the gradient of F and multiplications by K and its transpose. We derive lower bounds on the number of gradient computations and matrix multiplications needed to reach a given accuracy, and then propose an accelerated primal-dual algorithm that achieves these lower bounds. Ours is the first optimal algorithm for this class of problems.
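The abstract does not spell out the algorithm itself. As an illustration only, a plain (non-accelerated) primal-dual gradient iteration for min F(x) subject to Kx = b, using exactly the oracle described above (gradients of F and multiplications by K and its transpose), can be sketched on a hypothetical quadratic F; all names and step sizes below are assumptions for the sketch, not the paper's method:

```python
import numpy as np

# Illustrative sketch only, NOT the paper's accelerated optimal algorithm:
# a basic primal-dual (Arrow-Hurwicz-style) gradient iteration for
#     min_x F(x)  subject to  K x = b,
# using only the oracle from the abstract: gradients of F and
# multiplications by K and K^T. The problem instance is hypothetical.
rng = np.random.default_rng(0)
n, m = 5, 3
K = rng.standard_normal((m, n))
c = rng.standard_normal(n)
b = K @ rng.standard_normal(n)      # choose b so that {x : Kx = b} is nonempty

# F(x) = 0.5 * ||x - c||^2 is 1-strongly convex and 1-smooth.
grad_F = lambda x: x - c

x = np.zeros(n)                     # primal iterate
y = np.zeros(m)                     # dual iterate (multiplier for Kx = b)
tau, sigma = 0.1, 1.0               # step sizes, small enough w.r.t. ||K||

for _ in range(20000):
    x = x - tau * (grad_F(x) + K.T @ y)   # primal descent on the Lagrangian
    y = y + sigma * (K @ x - b)           # dual ascent on the residual

print(np.linalg.norm(K @ x - b))    # constraint residual, near zero
```

This basic scheme converges only linearly under strong convexity; the point of the paper is precisely that an accelerated variant can match the lower bounds on both oracle counts, which this sketch does not.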
Original language | English (US)
---|---
Title of host publication | The 25th International Conference on Artificial Intelligence and Statistics
Publisher | arXiv
State | Published - 2022
Bibliographical note
KAUST Repository Item: Exported on 2022-09-14. Acknowledgements: AS was supported by KAUST and by a Simons-Berkeley Research Fellowship.