An Experimental Survey of Incremental Transfer Learning for Multicenter Collaboration
- Authors
Yixing Huang, Christoph Bert, Ahmed Gomaa, Rainer Fietkau, Andreas Maier, and Florian Putz
- Subjects
Continual learning, multicenter collaboration, federated learning, data privacy, deep learning, peer-to-peer
- Abstract
Due to data privacy constraints, data sharing among multiple clinical centers is restricted, which impedes the development of high-performance deep learning models from multicenter collaboration. Naive weight transfer methods share intermediate model weights without raw data and hence can bypass data privacy restrictions. However, performance drops are typically observed when the model is transferred from one center to the next because of catastrophic forgetting. Incremental transfer learning, which combines peer-to-peer federated learning and domain incremental learning, can overcome the data privacy issue while preserving model performance through continual learning techniques. In this work, a conventional domain/task incremental learning framework is adapted for incremental transfer learning, and a survey on the efficacy of prevalent regularization-based continual learning methods for multicenter collaboration is performed. The influences of data heterogeneity, classifier head setting, network optimizer, model initialization, center order, and weight transfer type are investigated thoroughly. Our framework is publicly accessible to the research community for further development.
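As a rough illustration of the setup the abstract describes, the sketch below (assuming PyTorch) trains a model at each center in turn, passing only weights between centers, and adds a simple L2 penalty toward the previous center's weights as a stand-in for the regularization-based continual learning methods the survey compares (e.g., EWC would replace the uniform penalty with Fisher-weighted importances). All names here (`train_one_center`, `incremental_transfer`, `lam`) are illustrative assumptions, not the paper's public framework.

```python
import torch
import torch.nn as nn

def train_one_center(model, loader, epochs, lam, anchor=None):
    """Train on one center's data. If `anchor` (the weights received
    from the previous center) is given, add an L2 penalty pulling the
    weights toward it -- a minimal regularization-based continual
    learning term against forgetting."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    ce = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = ce(model(x), y)
            if anchor is not None:
                # Penalize drift from the incoming weights (uniform
                # importance; EWC-style methods weight by Fisher info).
                for p, a in zip(model.parameters(), anchor):
                    loss = loss + lam * ((p - a) ** 2).sum()
            loss.backward()
            opt.step()
    return model

def incremental_transfer(model, center_loaders, epochs=5, lam=0.01):
    """Peer-to-peer weight transfer: visit each center in order,
    sharing only model weights (never raw patient data)."""
    anchor = None
    for loader in center_loaders:
        model = train_one_center(model, loader, epochs, lam, anchor)
        # Snapshot weights to regularize training at the next center.
        anchor = [p.detach().clone() for p in model.parameters()]
    return model
```

In this hypothetical loop, setting `lam = 0` recovers the naive weight transfer baseline, which is exactly where the abstract notes performance drops from forgetting; the center order and transfer type studied in the survey correspond to the ordering of `center_loaders` and what is passed between iterations.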
- Published
2024