Abstract
Privacy risks of recommender systems have attracted increasing attention. Users' private data is often collected by potentially untrusted recommender systems in order to provide high-quality recommendations. Meanwhile, malicious attackers may use recommendation results to infer other users' private data. Existing approaches focus either on keeping users' private data protected during recommendation computation or on preventing the inference of any single user's data from the recommendation results; none is designed both to hide users' private data and to prevent privacy inference. To achieve both goals, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We show theoretically that the noise added by RP has limited effect on recommendation accuracy, and that the noise added by DP can be well controlled through a sensitivity analysis of the functions computed on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides stronger privacy protection with acceptable loss in recommendation accuracy, and sometimes even achieves better privacy without sacrificing accuracy, validating its feasibility in practice.
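The abstract does not spell out the mechanism, but the RP + DP combination it describes can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the Gaussian masking noise, the Laplace mechanism, the [1, 5] rating bound, the average-rating query, and all function names and parameter values are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch of combining randomized perturbation (RP) with
# differential privacy (DP). All choices below are assumptions, not the
# paper's method: Gaussian masking noise, Laplace mechanism, toy query.

rng = np.random.default_rng(0)

def randomized_perturbation(ratings, sigma=0.5):
    """RP step: a user masks raw ratings with random noise before
    sending them to the (possibly untrusted) recommender."""
    return ratings + rng.normal(0.0, sigma, size=ratings.shape)

def laplace_mechanism(value, sensitivity, epsilon):
    """DP step: add Laplace noise calibrated to the sensitivity of the
    function computed on the perturbed data."""
    return value + rng.laplace(0.0, sensitivity / epsilon)

# Toy example: privately release the average of the perturbed ratings.
ratings = np.array([4.0, 3.5, 5.0, 2.0])  # raw ratings (toy data)
# Clip so perturbed ratings stay in [1, 5], keeping the sensitivity bound valid.
perturbed = np.clip(randomized_perturbation(ratings), 1.0, 5.0)
avg = perturbed.mean()
# Changing one rating in [1, 5] moves the mean by at most (5 - 1) / n,
# which bounds the sensitivity of this query on the perturbed data.
private_avg = laplace_mechanism(avg, sensitivity=4.0 / len(ratings), epsilon=1.0)
print(private_avg)
```

The point of the two layers is visible even in this toy: RP hides each user's raw data from the recommender itself, while the DP noise, calibrated to the sensitivity of the function on the already-perturbed data, limits what attackers can infer from the released result.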
| Original language | English (US) |
| --- | --- |
| Title of host publication | Database Systems for Advanced Applications |
| Publisher | Springer Nature |
| Pages | 576-591 |
| Number of pages | 16 |
| ISBN (Print) | 9783319557526 |
| DOIs | |
| State | Published - Mar 22 2017 |
Bibliographical note
Acknowledgements: This work was done while the first author was a visiting student at King Abdullah University of Science and Technology (KAUST). Research reported in this publication was partially supported by KAUST and the Natural Science Foundation of China (Grant Nos. 61572336, 61572335, 61632016, 61402313).