Large scale constrained linear regression revisited: Faster algorithms via preconditioning

Di Wang, Jinhui Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations


In this paper, we revisit the large-scale constrained linear regression problem and propose faster methods based on some recent developments in sketching and optimization. Our algorithms combine (accelerated) mini-batch SGD with a new method called two-step preconditioning to achieve an approximate solution with a time complexity lower than that of the state-of-the-art techniques for the low precision case. Our idea can also be extended to the high precision case, which gives an alternative implementation to the Iterative Hessian Sketch (IHS) method with significantly improved time complexity. Experiments on benchmark and synthetic datasets suggest that our methods indeed outperform existing ones considerably in both the low and high precision cases.
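To make the high-precision approach concrete, the following is a minimal sketch of the classical Iterative Hessian Sketch (IHS) idea for unconstrained least squares, which the paper gives an alternative implementation of. This is not the authors' algorithm: the function name, Gaussian sketching matrix, and parameter choices are illustrative assumptions, shown only to convey how sketching the Hessian (but not the gradient) yields a fast, convergent iteration.

```python
import numpy as np

def iterative_hessian_sketch(A, b, sketch_size, n_iter=15, rng=None):
    """Illustrative IHS iteration for min_x ||Ax - b||^2 (hypothetical sketch,
    not the paper's two-step preconditioning method).

    Each step draws a fresh Gaussian sketch S_t to compress A before forming
    the approximate Hessian (S_t A)^T (S_t A); the gradient uses the full,
    unsketched data, so the iterates converge to the true LS solution."""
    rng = np.random.default_rng(rng)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iter):
        # Gaussian sketch: m x n with m << n, scaled so E[S^T S] = I
        S = rng.standard_normal((sketch_size, n)) / np.sqrt(sketch_size)
        SA = S @ A                        # sketched data, only m x d
        grad = A.T @ (b - A @ x)          # full-data gradient
        # Solve the sketched Newton system (SA)^T (SA) dx = grad
        dx = np.linalg.solve(SA.T @ SA, grad)
        x = x + dx
    return x
```

For constrained regression, the inner solve would be replaced by a constrained quadratic subproblem over the feasible set; the sketch above only illustrates the unconstrained case.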
Original language: English (US)
Title of host publication: 32nd AAAI Conference on Artificial Intelligence, AAAI 2018
Publisher: AAAI Press
Number of pages: 8
ISBN (Print): 9781577358008
State: Published - Jan 1 2018
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2022-09-15
