FedQUIT: On-Device Federated Unlearning via a Quasi-Competent Virtual Teacher
- Publication Year: 2024
Abstract
- Federated Learning (FL) promises better privacy guarantees for individuals' data when machine learning models are collaboratively trained. When an FL participant exercises its right to be forgotten, i.e., detaches from the FL framework in which it has participated and removes its past contributions from the global model, the FL system should perform all the necessary steps without sacrificing the overall performance of the global model, a capability that state-of-the-art solutions do not currently support. In this paper, we propose FedQUIT, a novel algorithm that uses knowledge distillation to scrub the contribution of the forgetting data from an FL global model while preserving its generalization ability. FedQUIT works directly on clients' devices, requires no additional information to be shared compared with a regular FL process, and does not assume the availability of publicly available proxy data. Our solution is efficient, effective, and applicable in both centralized and federated settings. Our experimental results show that, on average, FedQUIT requires fewer than 2.5% additional communication rounds to recover generalization performance after unlearning, producing a sanitized global model whose predictions are comparable to those of a global model that has never seen the data to be forgotten.
- Comment: Submitted to The 39th Annual AAAI Conference on Artificial Intelligence (AAAI-25)
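The abstract does not give implementation details, but the core idea it describes, distilling from a "virtual teacher" derived from the global model whose competence on the forget data has been deliberately degraded, can be sketched as follows. This is a minimal illustration in PyTorch, not the paper's method: the specific teacher-alteration rule (suppressing the true-class probability and renormalizing), the function names, and the KL-divergence objective are all assumptions introduced for illustration.

# Hedged sketch of knowledge-distillation-based unlearning with a
# "quasi-competent" virtual teacher built from the frozen global model.
# The teacher rule below (flattening the true-class probability) is an
# illustrative assumption, not FedQUIT's exact formulation.
import torch
import torch.nn.functional as F

def virtual_teacher_outputs(global_logits: torch.Tensor,
                            targets: torch.Tensor) -> torch.Tensor:
    """Degrade the global model's competence on the forget set by
    suppressing its confidence in the true class (assumed rule)."""
    probs = F.softmax(global_logits, dim=1).clone()
    # Zero out the true-class probability and renormalize the rest.
    probs.scatter_(1, targets.unsqueeze(1), 0.0)
    return probs / probs.sum(dim=1, keepdim=True)

def unlearning_step(student, global_model, batch, optimizer):
    """One on-device distillation step over the client's forget data."""
    x, y = batch
    with torch.no_grad():
        teacher_probs = virtual_teacher_outputs(global_model(x), y)
    student_log_probs = F.log_softmax(student(x), dim=1)
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    import torch.nn as nn
    num_classes, dim = 10, 32
    student = nn.Linear(dim, num_classes)       # model being sanitized
    global_model = nn.Linear(dim, num_classes)  # frozen FL global model
    opt = torch.optim.SGD(student.parameters(), lr=0.1)
    x = torch.randn(16, dim)
    y = torch.randint(0, num_classes, (16,))
    print("unlearning loss:", unlearning_step(student, global_model, (x, y), opt))

In a full FL pipeline, the forgetting client would run such steps locally on its own data, after which the sanitized global model would be fine-tuned for a few additional federated rounds to recover generalization, consistent with the under-2.5% recovery cost reported in the abstract.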
Details
- Database: arXiv
- Publication Type: Report
- Accession number: edsarx.2408.07587
- Document Type: Working Paper