DMAF: data-model anti-forgetting for federated incremental learning.
- Source: Cluster Computing, Feb 2025, Vol. 28, Issue 1, p1-16. 16p.
- Publication Year: 2025
Abstract
- Federated learning has received much attention for its data-privacy benefits, but most existing approaches assume that client classes are fixed. In practice, clients may remove old classes and add new ones, causing catastrophic forgetting in the model. Existing methods have limitations: some require additional client-side storage, and distillation-based methods become less effective as the number of new classes grows. This paper therefore proposes the Data-Model Anti-Forgetting (DMAF) framework. Specifically, the framework introduces an auxiliary client and a group aggregation method to mitigate catastrophic forgetting at the data level; this approach requires no additional client storage for synthetic data and can balance class distributions. At the model level, a multi-teacher integrated knowledge distillation method retains old-class knowledge by distilling multiple teacher models, and a task fusion step further tunes the global model. Finally, extensive experiments on public datasets validate the effectiveness of DMAF. [ABSTRACT FROM AUTHOR]
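The multi-teacher distillation idea mentioned in the abstract can be illustrated with a minimal sketch. This is a generic formulation of multi-teacher knowledge distillation, not the paper's actual DMAF implementation: the function name, temperature tau, and mixing weight alpha are illustrative assumptions.

```python
# A generic multi-teacher knowledge distillation loss (illustrative sketch;
# not the paper's code). The student is trained on current-task labels while
# a KL term pulls it toward the averaged soft targets of several teachers,
# which is how old-class knowledge is typically retained.
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    labels, tau=2.0, alpha=0.5):
    # Hard-label cross-entropy on the current task's data.
    ce = F.cross_entropy(student_logits, labels)
    # Average the teachers' temperature-softened predictions.
    soft_targets = torch.stack(
        [F.softmax(t / tau, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)
    # KL divergence between the student's softened distribution and the
    # averaged teacher distribution, scaled by tau^2 as is standard.
    kd = F.kl_div(F.log_softmax(student_logits / tau, dim=1),
                  soft_targets, reduction="batchmean") * tau * tau
    return alpha * ce + (1.0 - alpha) * kd

# Example usage with random tensors (shapes only; no real models involved):
student_logits = torch.randn(8, 10)
teacher_logits = [torch.randn(8, 10) for _ in range(3)]
labels = torch.randint(0, 10, (8,))
loss = multi_teacher_distillation_loss(student_logits, teacher_logits, labels)
```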
Details
- Language: English
- ISSN: 1386-7857
- Volume: 28
- Issue: 1
- Database: Academic Search Index
- Journal: Cluster Computing
- Publication Type: Academic Journal
- Accession Number: 180437680
- Full Text: https://doi.org/10.1007/s10586-024-04697-9