Faster Rates of Private Stochastic Convex Optimization

Jinyan Su, Lijie Hu, Di Wang

Research output: Contribution to conference › Paper › peer-review

4 Scopus citations

Abstract

In this paper, we revisit the problem of Differentially Private Stochastic Convex Optimization (DP-SCO) and provide excess population risk bounds for some special classes of functions that are faster than the previous results for general convex and strongly convex functions. In the first part of the paper, we study the case where the population risk function satisfies the Tsybakov Noise Condition (TNC) with some parameter θ > 1. Specifically, we first show that, under some mild assumptions on the loss functions, there is an algorithm whose output achieves upper bounds of (Equation presented) and (Equation presented) for ϵ-DP and (ϵ, δ)-DP, respectively, when θ ≥ 2, where n is the sample size and d is the dimension of the space. We then address the inefficiency issue, improve the upper bounds by Poly(log n) factors, and extend the results to the case where θ ≥ θ̄ > 1 for some known θ̄. Next, we show that the excess population risk of population functions satisfying TNC with parameter θ ≥ 2 is always lower bounded by (Equation presented) for ϵ-DP and (ϵ, δ)-DP, respectively, which matches our upper bounds. In the second part, we focus on a special case where the population risk function is strongly convex. Unlike previous studies, here we additionally assume that the loss function is non-negative and that the optimal value of the population risk is sufficiently small. With these additional assumptions, we propose a new method whose output achieves upper bounds of (Equation presented) and (Equation presented) for any τ > 1 in the (ϵ, δ)-DP and ϵ-DP models, respectively, if the sample size n is sufficiently large. These results circumvent the corresponding lower bounds in (Feldman et al., 2020) for general strongly convex functions. Finally, we conduct experiments with our new methods on real-world data. The experimental results also provide new insights into the established theory.
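For context, the standard (ϵ, δ)-DP baseline that results like these are measured against is noisy (projected) SGD: clip each stochastic gradient to a Lipschitz bound L and add Gaussian noise calibrated to L, the step count, and the privacy budget. The sketch below is an illustrative, pure-stdlib version of that baseline only; it is not the algorithm proposed in this paper, and the noise calibration shown (an advanced-composition-style scale) is a simplifying assumption rather than the paper's analysis.

```python
import math
import random

def noisy_sgd(grad_fn, n, d, epsilon, delta, L=1.0, T=None, eta=0.1, seed=0):
    """Illustrative (eps, delta)-DP noisy-SGD baseline for stochastic
    convex optimization. NOT the paper's method: gradients are clipped
    to norm <= L and perturbed with Gaussian noise whose scale follows
    a simple advanced-composition-style calibration (an assumption)."""
    rng = random.Random(seed)
    T = T or n  # default: one pass over the n samples
    # Gaussian noise scale for T adaptive steps on n samples (assumed calibration)
    sigma = 2.0 * L * math.sqrt(2.0 * T * math.log(1.25 / delta)) / (n * epsilon)
    w = [0.0] * d
    for t in range(T):
        g = grad_fn(w, t)  # stochastic (sub)gradient at w
        norm = math.sqrt(sum(x * x for x in g))
        scale = 1.0 / max(1.0, norm / L)  # clip so the sensitivity bound holds
        step = eta / math.sqrt(t + 1)     # decaying step size
        w = [wi - step * (scale * gi + rng.gauss(0.0, sigma))
             for wi, gi in zip(w, g)]
    return w

# Toy usage: minimize the 1-d quadratic (w - 2)^2 / 2, gradient w - 2.
w_priv = noisy_sgd(lambda w, t: [w[0] - 2.0],
                   n=200, d=1, epsilon=1.0, delta=1e-5, L=5.0, T=200)
```

The output is a noisy estimate of the minimizer; the faster rates in this paper come from exploiting extra structure (TNC, or small optimal risk) that this generic baseline ignores.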

Original language: English (US)
Pages: 995–1002
Number of pages: 8
State: Published - 2022
Event: 33rd International Conference on Algorithmic Learning Theory, ALT 2022 - Virtual, Online, France
Duration: Mar 29, 2022 – Apr 1, 2022

Conference

Conference: 33rd International Conference on Algorithmic Learning Theory, ALT 2022
Country/Territory: France
City: Virtual, Online
Period: 03/29/22 – 04/01/22

Bibliographical note

Publisher Copyright:
© 2022 J. Su, L. Hu & D. Wang.

Keywords

  • Differential Privacy
  • Stochastic Convex Optimization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
