Abstract
We propose mS2GD: a method incorporating a mini-batching scheme to improve the theoretical complexity and practical performance of semi-stochastic gradient descent (S2GD). We consider the problem of minimizing a strongly convex function represented as the sum of an average of a large number of smooth convex functions and a simple nonsmooth convex regularizer. Our method first performs a deterministic step (computation of the gradient of the objective function at the starting point), followed by a large number of stochastic steps. The process is repeated a few times, with the last iterate becoming the new starting point. The novelty of our method lies in the introduction of mini-batching into the computation of the stochastic steps. In each step, instead of choosing a single function, we sample b functions, compute their gradients, and form the search direction from this mini-batch. We analyze the complexity of the method and show that it benefits from two speedup effects. First, we prove that as long as b is below a certain threshold, we can reach any predefined accuracy with less overall work than without mini-batching. Second, our mini-batching scheme admits a simple parallel implementation, and is hence suitable for further acceleration by parallelization.
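To make the scheme described in the abstract concrete, below is a minimal NumPy sketch of an outer loop (full-gradient snapshot) wrapped around mini-batched, variance-reduced proximal steps. The interfaces `grad_fi` and `prox_h`, and the uniform draw of the number of inner steps, are illustrative assumptions; the paper samples the number of inner steps from a particular distribution and proves complexity bounds for that choice, which this sketch does not reproduce.

```python
import numpy as np

def ms2gd(grad_fi, prox_h, x0, n, b, m, k, stepsize, rng=None):
    """Sketch of mS2GD: mini-batch semi-stochastic gradient descent.

    Minimizes (1/n) * sum_i f_i(x) + h(x), where h is a simple
    nonsmooth convex regularizer handled via its proximal operator.

    grad_fi(i, x) -- gradient of the i-th smooth component f_i at x
    prox_h(x, t)  -- proximal operator of t*h evaluated at x
    x0            -- starting point
    n             -- number of component functions
    b             -- mini-batch size
    m             -- maximum number of inner stochastic steps per epoch
    k             -- number of outer iterations
    stepsize      -- fixed step size
    """
    rng = rng or np.random.default_rng()
    y = x0.copy()
    for _ in range(k):
        # Deterministic step: full gradient of the smooth part at the snapshot y.
        g = np.mean([grad_fi(i, y) for i in range(n)], axis=0)
        x = y.copy()
        # The paper draws the number of inner steps from a specific
        # distribution; a uniform draw is used here purely for illustration.
        t = rng.integers(1, m + 1)
        for _ in range(t):
            batch = rng.choice(n, size=b, replace=False)
            # Variance-reduced direction built from the sampled mini-batch.
            v = g + np.mean(
                [grad_fi(i, x) - grad_fi(i, y) for i in batch], axis=0
            )
            # Proximal (forward-backward) step handles the regularizer h.
            x = prox_h(x - stepsize * v, stepsize)
        y = x  # the last iterate becomes the new starting point
    return y
```

For an l1-regularized problem, for instance, `prox_h` would be the soft-thresholding operator; note that the inner-loop gradient evaluations over a mini-batch are independent of one another, which is what makes the scheme amenable to the parallel implementation mentioned in the abstract.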
| Original language | English (US) |
| --- | --- |
| Article number | 7347336 |
| Pages (from-to) | 242-255 |
| Number of pages | 14 |
| Journal | IEEE Journal on Selected Topics in Signal Processing |
| Volume | 10 |
| Issue number | 2 |
| DOIs | |
| State | Published - Mar 2016 |
| Externally published | Yes |
Bibliographical note
Funding Information: The work of J. Konečný was supported by a Google Doctoral Fellowship in Optimization Algorithms. The work of J. Liu was supported by a Gotshall Fellowship from Lehigh University, Bethlehem, PA, USA. The work of P. Richtárik was supported by the Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/K02325X/1, "Accelerated Coordinate Descent Methods for Big Data Optimization."
Publisher Copyright:
© 2015 IEEE.
Keywords
- Empirical risk minimization
- mini-batches
- proximal methods
- semi-stochastic gradient descent
- sparse data
- stochastic gradient descent
- variance reduction
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering