## Abstract

We propose a novel stochastic gradient method, semi-stochastic coordinate descent, for the problem of minimizing a strongly convex function represented as the average of a large number of smooth convex functions: f(x) = (1/n)∑_{i} f_{i}(x). Our method first performs a deterministic step (computation of the gradient of f at the starting point), followed by a large number of stochastic steps. The process is repeated a few times, with the last stochastic iterate becoming the new starting point at which the deterministic step is taken. The novelty of our method lies in how the stochastic steps are performed. In each such step, we pick a random function f_{i} and a random coordinate j, both using non-uniform distributions, and update a single coordinate of the decision vector only, based on the computation of the jth partial derivative of f_{i} at two different points. Each random step of the method constitutes an unbiased estimate of the gradient of f; moreover, the squared norm of the steps goes to zero in expectation, meaning that the stochastic estimate of the gradient progressively improves. The computational complexity of the method is the sum of two terms: O(n log(1/ε)) evaluations of gradients ∇f_{i} and O(κ̂ log(1/ε)) evaluations of partial derivatives ∇_{j}f_{i}, where κ̂ is a novel condition number.
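The abstract's description can be illustrated with a minimal sketch: an outer loop that computes the full gradient at a reference point, and an inner loop that samples a function index i and a coordinate j from non-uniform distributions and updates only coordinate j using the difference of two partial derivatives. This is a hypothetical reconstruction, not the paper's exact algorithm: the function names, step size, and the norm-based choice of sampling distributions are illustrative assumptions, shown here on a regularized least-squares objective.

```python
import numpy as np

# Illustrative sketch (not the paper's exact method) of a semi-stochastic
# coordinate descent loop for min_x f(x) = (1/n) * sum_i f_i(x), with
# f_i(x) = 0.5*(a_i @ x - b_i)**2 + 0.5*mu*||x||^2  (strongly convex).
def s2cd_sketch(A, b, mu=0.1, h=0.02, outer=30, inner=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape

    # One simple (assumed) choice of non-uniform sampling distributions:
    q = np.linalg.norm(A, axis=1) ** 2
    q = q / q.sum()                      # distribution over functions i
    p = np.linalg.norm(A, axis=0) ** 2
    p = p / p.sum()                      # distribution over coordinates j

    def grad_f(x):                       # full (deterministic) gradient of f
        return A.T @ (A @ x - b) / n + mu * x

    def partial_fi(x, i, j):             # j-th partial derivative of f_i at x
        return A[i, j] * (A[i] @ x - b[i]) + mu * x[j]

    y = np.zeros(d)
    for _ in range(outer):
        gy = grad_f(y)                   # deterministic step: full gradient at y
        x = y.copy()
        for _ in range(inner):
            j = rng.choice(d, p=p)       # random coordinate (non-uniform)
            i = rng.choice(n, p=q)       # random function (non-uniform)
            # The j-th coordinate of an unbiased gradient estimate:
            # E[ e_j/p_j * (gy[j] + (D_j f_i(x) - D_j f_i(y)) / (n*q_i)) ] = grad f(x).
            g_j = gy[j] + (partial_fi(x, i, j) - partial_fi(y, i, j)) / (n * q[i])
            x[j] -= h * g_j              # update a single coordinate only
        y = x                            # last stochastic iterate restarts the epoch
    return y
```

Note that at the start of each inner loop x equals y, so the correction term vanishes and the estimate equals the full gradient; as x drifts, the variance of the estimate shrinks across epochs, which is the mechanism behind the "squared norm of the steps goes to zero in expectation" claim.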

| Original language | English (US) |
|---|---|
| Pages (from-to) | 993-1005 |
| Number of pages | 13 |
| Journal | Optimization Methods and Software |
| Volume | 32 |
| Issue number | 5 |
| DOIs | |
| State | Published - Sep 3 2017 |

### Bibliographical note

Funding Information: This work was supported by the Engineering and Physical Sciences Research Council [grant number EP/K02325X/1].

Publisher Copyright:

© 2017 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.

## Keywords

- Stochastic gradient
- coordinate descent
- empirical risk minimization

## ASJC Scopus subject areas

- Software
- Control and Optimization
- Applied Mathematics