Finite Sample Guarantees of Differentially Private Expectation Maximization Algorithm

Di Wang, Jiahao Ding, Lijie Hu, Zejun Xie, Miao Pan, Jinhui Xu

Research output: Contribution to journal · Article · peer-review


(Gradient) Expectation Maximization (EM) is a widely used algorithm for maximum likelihood estimation in mixture models and incomplete-data problems. A major challenge facing this popular technique is how to effectively preserve the privacy of sensitive data. Previous research on this problem has already led to the discovery of some Differentially Private (DP) algorithms for (Gradient) EM. However, unlike in the non-private case, existing techniques are not yet able to provide finite sample statistical guarantees. To address this issue, we propose in this paper the first DP version of the Gradient EM algorithm with such guarantees. Specifically, we first propose a new mechanism for privately estimating the mean of a heavy-tailed distribution, which significantly improves a previous result in [25] and can also be extended to the local DP model, a setting that has not been studied before. Next, we apply our general framework to three canonical models: the Gaussian Mixture Model (GMM), the Mixture of Regressions Model (MRM), and Linear Regression with Missing Covariates (RMC). For GMM in the DP model, our estimation error is near optimal in some cases. For the other two models, we provide the first results on finite sample statistical guarantees. Our theory is supported by thorough numerical experiments on both real-world and synthetic data.
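To illustrate the kind of private mean estimation the abstract refers to, the sketch below shows a *standard* clip-and-noise baseline (Gaussian mechanism), not the paper's improved mechanism for heavy-tailed data: each sample is truncated to a bounded range before averaging, and calibrated Gaussian noise makes the released mean (ε, δ)-DP. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def dp_mean(x, clip, eps, delta, rng=None):
    """Baseline (eps, delta)-DP mean estimate via clipping + Gaussian noise.

    This is the textbook Gaussian mechanism, shown only to illustrate the
    general idea of private mean estimation; it is NOT the heavy-tailed
    mechanism proposed in the paper.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    # Truncate each sample to [-clip, clip] to bound per-record influence.
    clipped = np.clip(x, -clip, clip)
    # Changing one record moves the clipped mean by at most 2*clip/n.
    sensitivity = 2.0 * clip / n
    # Standard noise calibration for the Gaussian mechanism (delta < 1).
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return clipped.mean() + rng.normal(0.0, sigma)
```

Because the noise scale shrinks as O(1/n), the private estimate converges to the clipped mean as the sample size grows; the statistical subtlety the paper addresses is choosing the truncation level for heavy-tailed data, where naive clipping introduces bias.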
Original language: English (US)
Journal: Frontiers in Artificial Intelligence and Applications
State: Published - Sep 28, 2023

Bibliographical note

KAUST Repository Item: Exported on 2023-10-02
Acknowledged KAUST grant number(s): BAS/1/1689-01-01, CRG10, FCC/1/1976-49-01, REI/1/4811-10-01, URF/1/4663-01-01, CRG10-4663.2
Acknowledgements: Di Wang and Lijie Hu were supported in part by the baseline funding BAS/1/1689-01-01, funding from the CRG grant URF/1/4663-01-01, FCC/1/1976-49-01 from CBRC, and funding from the AI Initiative REI/1/4811-10-01 of King Abdullah University of Science and Technology (KAUST). Di Wang was also supported by funding from the SDAIA-KAUST Center of Excellence in Data Science and Artificial Intelligence (SDAIA-KAUST AI). The research of the last author was supported in part by NSF through grants IIS-1910492 and CCF-2200173 and by KAUST through grant CRG10-4663.2.

ASJC Scopus subject areas

  • Artificial Intelligence
