1. Towards optimal learning: Investigating the impact of different model updating strategies in federated learning.
- Authors
- Ilić, Mihailo, Ivanović, Mirjana, Kurbalija, Vladimir, and Valachis, Antonios
- Subjects
- *FEDERATED learning; *LEARNING strategies; *DATA security; *DEEP learning; *ALGORITHMS
- Abstract
- With rising data security concerns, privacy-preserving machine learning (ML) methods have become a key research topic. Federated learning (FL) is one such approach that has recently gained considerable attention, as it offers greater data security in ML tasks. Substantial research has already been done on different aggregation methods, personalized FL algorithms, and related techniques. However, insufficient work has been done to identify the effects that different model update strategies (concurrent FL, incremental FL, etc.) have on federated model performance. This paper presents the results of extensive FL simulations run on multiple datasets under different conditions in order to determine the efficiency of four FL model update strategies: concurrent, semi-concurrent, incremental, and cyclic-incremental. We have found that incremental updating methods offer more reliable FL models in cases where data is distributed both evenly and unevenly between edge nodes, especially when the number of data samples across all edge nodes is small.
- Highlights:
  • Federated learning model performance is influenced by multiple environmental factors.
  • Different update strategies react differently to dataset size and participant number.
  • Update strategies can be categorized into incremental and concurrent.
  • Strategy selection can help inter-institutional collaboration achieve better results.
  • Incremental modes are more robust to a lack of data. [ABSTRACT FROM AUTHOR]
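To make the contrast between the strategy families concrete, here is a minimal, hypothetical sketch of two of the update strategies the abstract names: concurrent updating (all edge nodes train from the same global model in parallel and the server averages their results, in the spirit of federated averaging) and incremental updating (the model is passed from node to node in sequence). The toy scalar model, client data, and learning rate are illustrative assumptions, not the paper's actual experimental setup.

```python
def local_train(w, data, lr=0.1, steps=20):
    """Gradient descent on squared error for a toy scalar (mean-estimation) model."""
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)
        w -= lr * grad
    return w

def concurrent_round(w, clients):
    """Concurrent strategy: every client starts from the same global model,
    trains locally in parallel, and the server averages the resulting models."""
    updates = [local_train(w, d) for d in clients]
    return sum(updates) / len(updates)

def incremental_round(w, clients):
    """Incremental strategy: the model visits each client in turn,
    carrying what it learned at earlier nodes to later ones."""
    for d in clients:
        w = local_train(w, d)
    return w

# Illustrative, unevenly distributed edge-node data (two samples per node).
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
w_conc = concurrent_round(0.0, clients)
w_incr = incremental_round(0.0, clients)
```

Note the design trade-off this sketch exposes: the concurrent round lands near the average of all client optima, while the incremental round is biased toward the data of the most recently visited node, which is why variants such as cyclic-incremental repeat the circuit over multiple rounds.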
- Published
- 2024