Abstract
We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner. Our method is formulated with general stochastic operators, which allow us to model various strategies for reducing the computational complexity. For example, MURANA supports sparse activation of the gradients, and also reduction of the communication load via compression of the update vectors. This versatility allows MURANA to cover many existing randomization mechanisms within a unified framework, which also makes it possible to design new methods as special cases.
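To make the abstract's ingredients concrete, here is a minimal, self-contained sketch of the problem class it describes: minimizing an average of smooth functions plus a regularizer, using a sampled (stochastic) gradient passed through an unbiased random sparsification (compression) operator, followed by a proximal step. This is an illustrative toy, not the MURANA algorithm itself; the problem data, rand-k operator, and step size are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of: minimize (1/n) * sum_i f_i(x) + R(x), with
# f_i(x) = ||A_i x - b_i||^2 / 2 (smooth) and R(x) = lam * ||x||_1.
# Illustrative only -- this is NOT the MURANA method from the paper.
n, d = 10, 20
A = rng.standard_normal((n, 5, d))
b = rng.standard_normal((n, 5))
lam = 0.1

def grad_i(x, i):
    """Gradient of the i-th smooth component f_i."""
    return A[i].T @ (A[i] @ x - b[i])

def rand_k(v, k):
    """Rand-k sparsification: keep k random coordinates, rescaled by
    d/k so the operator is unbiased (E[rand_k(v)] = v)."""
    mask = np.zeros_like(v)
    idx = rng.choice(v.size, size=k, replace=False)
    mask[idx] = v.size / k
    return v * mask

def prox_l1(x, t):
    """Proximal operator of t * lam * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)

def objective(x):
    smooth = np.mean([np.sum((A[i] @ x - b[i]) ** 2) / 2 for i in range(n)])
    return smooth + lam * np.sum(np.abs(x))

x = np.zeros(d)
step = 1e-2
for _ in range(2000):
    i = rng.integers(n)                  # sparse activation: one component
    g = rand_k(grad_i(x, i), k=5)        # compress the stochastic gradient
    x = prox_l1(x - step * g, step)      # proximal (forward-backward) step
```

The rand-k operator models the communication-reduction mechanism mentioned in the abstract: only k of d coordinates need to be transmitted per step, and the d/k rescaling keeps the update unbiased at the cost of higher variance.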
| Original language | English (US) |
| --- | --- |
| Pages | 81-96 |
| Number of pages | 16 |
| State | Published - 2022 |
| Event | 3rd Annual Conference on Mathematical and Scientific Machine Learning, MSML 2022 - Beijing, China. Duration: Aug 15 2022 → Aug 17 2022 |
Conference
| Conference | 3rd Annual Conference on Mathematical and Scientific Machine Learning, MSML 2022 |
| --- | --- |
| Country/Territory | China |
| City | Beijing |
| Period | 08/15/22 → 08/17/22 |
Bibliographical note
Publisher Copyright: © 2022 L. Condat & P. Richtárik.
Keywords
- communication
- compression
- convex optimization
- distributed optimization
- randomized algorithm
- sampling
- stochastic gradient
- variance reduction
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability