Goldfish: An Efficient Federated Unlearning Framework

Houzhe Wang, Xiaojie Zhu*, Chi Chen, Paulo Esteves-Verissimo

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

With recent legislation on the right to be forgotten, machine unlearning has emerged as a crucial research area. It enables the removal of a user's data from federated trained machine learning models without retraining from scratch. However, current machine unlearning algorithms face challenges of efficiency and validity. To address these issues, we propose a new framework, named Goldfish, comprising four modules: basic model, loss function, optimization, and extension. To address the low validity of existing machine unlearning algorithms, we propose a novel loss function that accounts for the loss arising from the discrepancy between predictions and actual labels on the remaining dataset, the bias of predicted results on the removed dataset, and the confidence level of predicted results. To enhance efficiency, we adopt a knowledge distillation technique in the basic model and introduce an optimization module that encompasses an early termination mechanism guided by empirical risk and a data partition mechanism. Furthermore, to bolster the robustness of the aggregated model, we propose an extension module that incorporates a mechanism using adaptive distillation temperature to address the heterogeneity of users' local data and a mechanism using adaptive weights to handle variation in the quality of uploaded models. Finally, we conduct comprehensive experiments to demonstrate the effectiveness of the proposed approach.
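The abstract does not give the paper's exact formulation, but a composite loss of the kind it describes can be illustrated with a minimal PyTorch-style sketch. The sketch below combines (1) cross-entropy on the retained data, (2) a term pushing predictions on the removed data toward the uniform distribution to reduce bias toward the forgotten labels, and (3) a penalty on prediction confidence over the removed data. The function name goldfish_style_loss and the weights alpha and beta are hypothetical illustrations, not definitions from the paper.

```python
import math

import torch
import torch.nn.functional as F


def goldfish_style_loss(logits_remain, labels_remain, logits_forget,
                        alpha=1.0, beta=1.0):
    """Hypothetical composite unlearning loss sketched from the abstract.

    The paper's actual loss may differ; this only illustrates the three
    ingredients the abstract names.
    """
    # (1) Discrepancy between predictions and actual labels on the
    # remaining (retained) dataset.
    remain_loss = F.cross_entropy(logits_remain, labels_remain)

    # (2) Bias on the removed dataset: KL divergence between the uniform
    # distribution and the model's predictions, driving the model to be
    # uninformative about the forgotten samples.
    log_probs = F.log_softmax(logits_forget, dim=1)
    num_classes = logits_forget.size(1)
    uniform = torch.full_like(log_probs, 1.0 / num_classes)
    forget_loss = F.kl_div(log_probs, uniform, reduction="batchmean")

    # (3) Confidence of predictions on the removed dataset: penalize
    # high-confidence (low-entropy) predictions more strongly.
    probs = log_probs.exp()
    entropy = -(probs * log_probs).sum(dim=1).mean()
    confidence = 1.0 - entropy / math.log(num_classes)

    return remain_loss + alpha * forget_loss + beta * confidence


# Example usage with random data for a 10-class problem:
logits_r = torch.randn(32, 10, requires_grad=True)
labels_r = torch.randint(0, 10, (32,))
logits_f = torch.randn(16, 10, requires_grad=True)
loss = goldfish_style_loss(logits_r, labels_r, logits_f)
loss.backward()
```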

Original language: English (US)
Title of host publication: Proceedings - 2024 54th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 252-264
Number of pages: 13
ISBN (Electronic): 9798350341058
DOIs
State: Published - 2024
Event: 54th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2024 - Brisbane, Australia
Duration: Jun 24, 2024 - Jun 27, 2024

Publication series

Name: Proceedings - 2024 54th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2024

Conference

Conference: 54th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2024
Country/Territory: Australia
City: Brisbane
Period: 06/24/24 - 06/27/24

Bibliographical note

Publisher Copyright:
© 2024 IEEE.

Keywords

  • distillation model
  • efficient retraining
  • federated unlearning

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Hardware and Architecture
  • Information Systems
  • Safety, Risk, Reliability and Quality
