Differentially private empirical risk minimization revisited: Faster and more general

Di Wang, Minwei Ye, Jinhui Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

163 Scopus citations

Abstract

In this paper, we study the differentially private Empirical Risk Minimization (ERM) problem in several settings. For smooth (strongly) convex loss functions with or without (non-)smooth regularization, we give algorithms that achieve optimal or near-optimal utility bounds with lower gradient complexity than previous work. For ERM with a smooth convex loss function in the high-dimensional (p ≫ n) setting, we give an algorithm that achieves the upper bound with lower gradient complexity than previous ones. Finally, we generalize the expected excess empirical risk bounds from convex loss functions to non-convex ones satisfying the Polyak-Lojasiewicz condition, and give a tighter upper bound on the utility than the one in [34].
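For context, the quantities named in the abstract can be recalled as follows. These are the standard definitions from the DP-ERM literature, not text from this record: here ℓ is the per-example loss, D = {x_1, ..., x_n} the dataset, and w_priv the output of a private algorithm.

```latex
% Standard DP-ERM definitions (assumed background, not quoted from this record).
\[
  L(w; D) = \frac{1}{n}\sum_{i=1}^{n} \ell(w; x_i),
  \qquad
  w^{*} \in \arg\min_{w} L(w; D).
\]
% The utility measure bounded in the paper:
\[
  \text{(expected excess empirical risk)}\qquad
  \mathbb{E}\bigl[L(w_{\mathrm{priv}}; D)\bigr] - L(w^{*}; D).
\]
% The Polyak-Lojasiewicz condition, which replaces convexity in the last result:
\[
  \tfrac{1}{2}\,\bigl\|\nabla L(w; D)\bigr\|_{2}^{2}
  \;\ge\;
  \mu\,\bigl(L(w; D) - L(w^{*}; D)\bigr)
  \quad\text{for some } \mu > 0 \text{ and all } w.
\]
```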
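As a concrete illustration of the gradient-perturbation approach for which such utility and gradient-complexity trade-offs are stated, a minimal DP gradient descent sketch follows. This is not the authors' algorithm: their methods achieve lower gradient complexity (e.g., via variance reduction), and the noise calibration below is a standard, deliberately loose Gaussian-mechanism bound. The function name `dp_gradient_descent` and its parameters are hypothetical.

```python
# Minimal sketch of (eps, delta)-DP ERM via gradient perturbation.
# Illustrative only; not the paper's algorithm or constants.
import numpy as np

def dp_gradient_descent(grad_fn, w0, n, T, lr, eps, delta, L=1.0, seed=0):
    """grad_fn(w) returns the full empirical gradient. Losses are assumed
    L-Lipschitz, so per-example gradients have norm <= L and the averaged
    gradient has L2-sensitivity 2L/n under replace-one neighboring datasets."""
    rng = np.random.default_rng(seed)
    # Gaussian noise scale with naive composition over T gradient releases;
    # tighter accounting (e.g., the moments accountant) improves constants.
    sigma = (2.0 * L / n) * np.sqrt(2.0 * T * np.log(1.25 * T / delta)) / eps
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(T):
        g = grad_fn(w)
        # Perturb the gradient before the update so each release is private.
        w -= lr * (g + rng.normal(0.0, sigma, size=w.shape))
    return w
```

A usage note: with a smooth, strongly convex loss and a suitable step size, T on the order of a few hundred iterations already exhibits the privacy/utility trade-off the abstract refers to, since sigma grows with sqrt(T) while optimization error shrinks in T.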
Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural Information Processing Systems Foundation
Pages: 2723-2732
Number of pages: 10
State: Published - Jan 1 2017
Externally published: Yes

Bibliographical note

Generated from Scopus record by KAUST IRTS on 2022-09-15

