Stochastic block BFGS: Squeezing more curvature out of data

Robert M. Gower, Donald Goldfarb, Peter Richtárik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

32 Scopus citations

Abstract

We propose a novel limited-memory stochastic block BFGS update for incorporating enriched curvature information in stochastic approximation methods. In our method, the estimate of the inverse Hessian matrix maintained by the method is updated at each iteration using a sketch of the Hessian, i.e., a randomly generated compressed form of the Hessian. We propose several sketching strategies, present a new quasi-Newton method that uses stochastic block BFGS updates combined with the variance reduction approach SVRG to compute batch stochastic gradients, and prove linear convergence of the resulting method. Numerical tests on large-scale logistic regression problems reveal that our method is more robust and substantially outperforms current state-of-the-art methods.
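For intuition, here is a minimal NumPy sketch (not the authors' implementation) of a single block BFGS inverse-Hessian update driven by a Hessian sketch. It assumes the update takes the standard block quasi-Newton form H⁺ = (I − DΛYᵀ) H (I − YΛDᵀ) + DΛDᵀ with Λ = (DᵀY)⁻¹, where D is the random sketch matrix and Y = ∇²f(w)D is the Hessian action on the sketch; the function name and the quadratic test problem below are illustrative assumptions.

```python
import numpy as np

def block_bfgs_update(H, Y, D):
    """One block BFGS update of the inverse-Hessian estimate H.

    Assumed form (a sketch, not the paper's code):
        H+ = (I - D @ Lam @ Y.T) @ H @ (I - Y @ Lam @ D.T) + D @ Lam @ D.T,
        with Lam = inv(D.T @ Y).

    D : (n, q) random sketch matrix (e.g., Gaussian directions).
    Y : (n, q) Hessian action on the sketch, Y = hess(w) @ D, which in the
        stochastic setting would come from subsampled Hessian-vector products.
    The result satisfies the block secant condition H+ @ Y = D.
    """
    n = D.shape[0]
    Lam = np.linalg.inv(D.T @ Y)   # (q, q); D.T @ Y is symmetric since the Hessian is
    E = np.eye(n) - D @ Lam @ Y.T
    return E @ H @ E.T + D @ Lam @ D.T

# Quick check on a quadratic f(w) = 0.5 * w.T @ A @ w, whose Hessian is A:
rng = np.random.default_rng(0)
n, q = 50, 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite Hessian
D = rng.standard_normal((n, q))    # Gaussian sketch
Y = A @ D                          # Hessian-sketch product
H_plus = block_bfgs_update(np.eye(n), Y, D)
assert np.allclose(H_plus @ Y, D)  # block secant condition holds
```

In a limited-memory variant of this kind, one would store the low-rank factors (D, Y, Λ) from the last few updates and apply H⁺ implicitly, rather than forming the dense n × n matrix as done above for clarity.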
Original language: English (US)
Title of host publication: 33rd International Conference on Machine Learning, ICML 2016
Publisher: International Machine Learning Society (IMLS)
Pages: 2774-2783
Number of pages: 10
ISBN (Print): 9781510829008
State: Published - Jan 1 2016

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2023-09-25
