
Towards optimal learning: Investigating the impact of different model updating strategies in federated learning.

Authors :
Ilić, Mihailo
Ivanović, Mirjana
Kurbalija, Vladimir
Valachis, Antonios
Source :
Expert Systems with Applications. Sep 2024, Vol. 249, Part A.
Publication Year :
2024

Abstract

With rising data security concerns, privacy-preserving machine learning (ML) methods have become a key research topic. Federated learning (FL) is one such approach that has recently gained considerable attention because it offers greater data security in ML tasks. Substantial research has already been done on different aggregation methods, personalized FL algorithms, and related topics. However, insufficient work has been done to identify the effects that different model update strategies (concurrent FL, incremental FL, etc.) have on federated model performance. This paper presents the results of extensive FL simulations run on multiple datasets under different conditions in order to determine the efficiency of four FL model update strategies: concurrent, semi-concurrent, incremental, and cyclic-incremental. We have found that incremental updating methods offer more reliable FL models in cases where data is distributed both evenly and unevenly between edge nodes, especially when the number of data samples across all edge nodes is small.

• Federated learning model performance is influenced by multiple environmental factors.
• Different update strategies react differently to dataset size and participant number.
• Update strategies can be categorized into incremental and concurrent.
• Strategy selection can help inter-institutional collaboration achieve better results.
• Incremental modes are more robust to lack of data.

[ABSTRACT FROM AUTHOR]
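To make the distinction between the strategy families concrete, the following is a minimal, illustrative sketch (not the authors' code) of how a concurrent round differs from an incremental round of federated updating. The toy least-squares task, the helper names (local_update, concurrent_round, incremental_round), and all parameter choices are assumptions made purely for illustration.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=5):
    """One client's local training: a few gradient steps on a least-squares loss."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def concurrent_round(w_global, clients):
    """Concurrent round: every edge node starts from the same global model,
    trains locally in parallel, and the server averages the resulting models."""
    local_models = [local_update(w_global.copy(), X, y) for X, y in clients]
    return np.mean(local_models, axis=0)

def incremental_round(w_global, clients):
    """Incremental round: the model is passed from node to node,
    each one training on top of the previous node's result."""
    w = w_global.copy()
    for X, y in clients:
        w = local_update(w, X, y)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three edge nodes with small, unevenly sized local datasets.
    clients = []
    for n in (20, 50, 10):
        X = rng.normal(size=(n, 2))
        clients.append((X, X @ true_w + rng.normal(scale=0.1, size=n)))

    w_conc = np.zeros(2)
    w_incr = np.zeros(2)
    for _ in range(10):  # ten communication rounds
        w_conc = concurrent_round(w_conc, clients)
        w_incr = incremental_round(w_incr, clients)
    print("concurrent :", w_conc)
    print("incremental:", w_incr)
```

This sketch only captures the ordering of client updates; the paper's semi-concurrent and cyclic-incremental variants, as well as its evaluation conditions, are not reproduced here.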

Details

Language :
English
ISSN :
09574174
Volume :
249
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
176811283
Full Text :
https://doi.org/10.1016/j.eswa.2024.123553