1. Federated Learning for Predicting Postoperative Remission of Patients with Acromegaly: A Multicentered Study.
- Authors
- Zhang, Wentai; Wu, Xueyang; Wang, He; Wu, Ruopei; Deng, Congcong; Xu, Qian; Liu, Xiaohai; Bai, Xuexue; Yang, Shuangjian; Li, Xiaoxu; Feng, Ming; Yang, Qiang; and Wang, Renzhi
- Subjects
- ARTIFICIAL neural networks; FEDERATED learning; RECEIVER operating characteristic curves; MACHINE learning; SUPPORT vector machines
- Abstract
Decentralized federated learning (DFL) may serve as a useful framework for machine learning (ML) tasks in multicentered studies, maximizing the use of clinical data without data sharing. We propose the first DFL workflow for ML tasks in multicentered studies, which can be as powerful as workflows using centralized data. The DFL workflow comprises 4 steps: registration, local computation, model update, and inspection. A total of 598 participants with acromegaly from Peking Union Medical College Hospital and 120 participants from Xuanwu Hospital were enrolled. The cohort from Peking Union Medical College Hospital was further split into 5 centers. Nine clinical features were incorporated into ML models trained with 4 algorithms: logistic regression (LR), gradient boosted decision tree (GBDT), support vector machine (SVM), and deep neural network (DNN). The area under the receiver operating characteristic curve was used to evaluate model performance. Models trained with the DFL workflow performed better than most LR models (P < 0.05) and all DNN, SVM, and GBDT models (P < 0.05). Models trained with the DFL workflow performed as well as models trained on centralized data for LR, DNN, and SVM (P > 0.05). We demonstrate that the DFL workflow, which requires no data sharing, is a more appropriate method for ML tasks in multicentered studies. The DFL workflow should be further explored in clinical research in other departments, where it can encourage and facilitate multicentered studies. [ABSTRACT FROM AUTHOR]
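The four-step workflow named in the abstract (registration, local computation, model update, inspection) can be sketched in simulation. The following is a minimal illustrative sketch only, not the authors' implementation: the synthetic data, center names, logistic-regression learner, and plain weight averaging are all assumptions standing in for the paper's clinical features and DFL protocol.

```python
# Hypothetical sketch of one DFL training loop over simulated centers.
# No raw data leaves a center; only model weights are exchanged.
import math
import random

random.seed(0)

def make_center(n, shift):
    # Synthetic binary-outcome data for one center (stand-in for clinical features).
    data = []
    for _ in range(n):
        x = [random.gauss(shift, 1.0) for _ in range(3)]
        y = 1 if sum(x) + random.gauss(0, 0.5) > 0 else 0
        data.append((x, y))
    return data

def local_step(w, data, lr=0.1):
    # Local computation: one epoch of logistic-regression gradient descent.
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + math.exp(-z))
        w = [wi - lr * (p - y) * xi for wi, xi in zip(w, x)]
    return w

# Step 1: registration -- each center joins with its private local dataset.
centers = {f"center_{i}": make_center(100, shift=0.2 * i) for i in range(5)}
weights = {name: [0.0, 0.0, 0.0] for name in centers}

for round_id in range(10):
    # Step 2: local computation on private data (only weights are shared).
    weights = {name: local_step(weights[name], data)
               for name, data in centers.items()}
    # Step 3: model update -- decentralized averaging of the peers' weights.
    avg = [sum(w[j] for w in weights.values()) / len(weights) for j in range(3)]
    weights = {name: list(avg) for name in weights}

# Step 4: inspection -- evaluate the consensus model on held-out data.
test = make_center(200, shift=0.4)
correct = sum(1 for x, y in test
              if (sum(wi * xi for wi, xi in zip(avg, x)) > 0) == (y == 1))
accuracy = correct / len(test)
print(f"round-10 consensus accuracy: {accuracy:.2f}")
```

The averaging step here is deliberately simple; a real deployment would also need secure peer communication and the inspection checks the paper describes before a round's update is accepted.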
- Published
- 2025