We study the problem of Differentially Private Stochastic Convex Optimization (DP-SCO) with heavy-tailed data. Specifically, we focus on ℓ1-norm linear regression in the ϵ-DP model. While most of the previous work focuses on the case where the loss function is Lipschitz, here we only need to assume the variates have bounded moments. Firstly, we study the case where the ℓ2 norm of the data has bounded second-order moment. We propose an algorithm based on the exponential mechanism and show that it is possible to achieve an upper bound of Õ(√(d/(nε))) (with high probability). Next, we relax the assumption to a bounded θ-th order moment for some θ ∈ (1,2) and show that it is possible to achieve an upper bound of Õ((d/(nε))^((θ−1)/θ)). Our algorithms can also be extended to more relaxed cases where only each coordinate of the data has bounded moments, and we can get upper bounds of Õ(d/√(nε)) and Õ(d/(nε)^((θ−1)/θ)) in the second and θ-th moment cases, respectively.
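The abstract's core algorithmic tool is the exponential mechanism: select a candidate solution with probability proportional to exp(εu/(2Δu)), where u is a utility score and Δu its sensitivity. The sketch below is a minimal, generic illustration of that mechanism applied to a toy ℓ1-regression selection problem; the candidate grid, the clipping threshold `tau`, and all variable names are illustrative assumptions, not the paper's actual construction (which uses more refined truncation and a cover of the parameter space).

```python
import numpy as np

def exponential_mechanism(candidates, utility, epsilon, sensitivity, rng=None):
    """Generic eps-DP exponential mechanism over a finite candidate set.

    Samples candidate c with probability proportional to
    exp(epsilon * utility(c) / (2 * sensitivity)).
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.array([utility(c) for c in candidates], dtype=float)
    logits = epsilon * scores / (2.0 * sensitivity)
    # Subtract the max logit for numerical stability before exponentiating.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    idx = rng.choice(len(candidates), p=probs)
    return candidates[idx]

# Toy usage (illustrative only): select a parameter vector with small
# clipped empirical l1 loss from a hypothetical finite grid of candidates.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
theta_true = np.array([1.0, -0.5])
y = X @ theta_true + rng.normal(size=200)

grid = [np.array([a, b]) for a in np.linspace(-2, 2, 9)
        for b in np.linspace(-2, 2, 9)]

# Clipping each record's l1 loss at tau bounds the per-record contribution
# to the mean by tau / n, which serves as the utility sensitivity here.
tau = 5.0
def utility(theta):
    return -np.mean(np.minimum(np.abs(y - X @ theta), tau))

theta_hat = exponential_mechanism(grid, utility, epsilon=1.0,
                                  sensitivity=tau / len(y))
```

The clipping step is what makes the mechanism usable under heavy tails: without a Lipschitz loss, the raw utility would have unbounded sensitivity, so the score of each record is truncated before the mechanism is applied.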
|Original language||English (US)|
|Title of host publication||2022 IEEE International Symposium on Information Theory (ISIT)|
|State||Published - Aug 3 2022|
Bibliographical note: KAUST Repository Item: Exported on 2022-09-14
Acknowledged KAUST grant number(s): BAS/1/1689-01-01, REI/1/4811-10-01, URF/1/4663-01-01
Acknowledgements: Di Wang was supported in part by the baseline funding BAS/1/1689-01-01, funding from the CRG grant URF/1/4663-01-01, and funding from the AI Initiative REI/1/4811-10-01 of King Abdullah University of Science and Technology (KAUST). Jinhui Xu was supported in part by the National Science Foundation IIS-1910492 and funding from the CRG grant URF/1/4663-01-01 of King Abdullah University of Science and Technology (KAUST).