
Look Back for More: Harnessing Historical Sequential Updates for Personalized Federated Adapter Tuning

Authors:
Peng, Danni
Wang, Yuan
Fu, Huazhu
Jiang, Jinpeng
Liu, Yong
Goh, Rick Siow Mong
Wei, Qingsong
Publication Year:
2025

Abstract

Personalized federated learning (PFL) studies effective model personalization to address the data heterogeneity issue among clients in traditional federated learning (FL). Existing PFL approaches mainly generate personalized models from clients' latest updated models alone, ignoring their previous updates, which may result in suboptimal personalized model learning. To bridge this gap, we propose a novel framework, termed pFedSeq, designed for personalizing adapters to fine-tune a foundation model in FL. In pFedSeq, the server maintains and trains a sequential learner, which processes a sequence of past adapter updates from clients and generates calibrations for personalized adapters. To effectively capture the cross-client and cross-step relations hidden in previous updates and generate high-performing personalized adapters, pFedSeq adopts the powerful selective state space model (SSM) as the architecture of the sequential learner. Through extensive experiments on four public benchmark datasets, we demonstrate the superiority of pFedSeq over state-of-the-art PFL methods.

Comment: Accepted by AAAI 2025
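To make the described mechanism concrete, below is a minimal sketch reconstructed from the abstract alone: the server keeps each client's history of adapter updates, runs a sequential learner over that history, and adds the resulting calibration to the client's latest adapter. The paper uses a selective SSM as the sequential learner; here a GRU stands in so the sketch stays self-contained, and all names, shapes, and the additive-calibration step are assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn


class SequentialCalibrator(nn.Module):
    """Maps a sequence of flattened adapter updates to a calibration vector.

    Stand-in for pFedSeq's server-side sequential learner (the paper uses a
    selective SSM; a GRU is substituted here for brevity -- an assumption).
    """

    def __init__(self, adapter_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.seq = nn.GRU(adapter_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, adapter_dim)

    def forward(self, update_history: torch.Tensor) -> torch.Tensor:
        # update_history: (num_clients, num_rounds, adapter_dim), holding each
        # client's sequence of past adapter updates maintained by the server.
        out, _ = self.seq(update_history)
        # Use the last step's hidden state to produce one calibration per client.
        return self.head(out[:, -1])


# Toy usage: 4 clients, 5 past rounds, 64-dim flattened adapters (all made up).
history = torch.randn(4, 5, 64)
latest_adapters = torch.randn(4, 64)
calibrator = SequentialCalibrator(adapter_dim=64)
personalized = latest_adapters + calibrator(history)  # calibrated adapters
```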

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2501.01653
Document Type:
Working Paper