
ADAPT^2: Adapting Pre-Trained Sensing Models to End-Users via Self-Supervision Replay

Authors:
Yoon, Hyungjun
Kwak, Jaehyun
Tolera, Biniyam Aschalew
Dai, Gaole
Li, Mo
Gong, Taesik
Lee, Kimin
Lee, Sung-Ju
Publication Year:
2024

Abstract

Self-supervised learning has emerged as a method for utilizing massive unlabeled data to pre-train models, providing effective feature extractors for various mobile sensing applications. However, when deployed to end-users, these models encounter significant domain shifts attributed to user diversity. We investigate the performance degradation that occurs when self-supervised models are fine-tuned in heterogeneous domains. To address the issue, we propose ADAPT^2, a few-shot domain adaptation framework for personalizing self-supervised models. ADAPT^2 employs self-supervised meta-learning for initial model pre-training, followed by user-side model adaptation that replays the self-supervision with user-specific data. This allows models to adjust their pre-trained representations to the user with only a few samples. Evaluation on four benchmarks demonstrates that ADAPT^2 outperforms existing baselines by an average of 8.8 percentage points in F1-score. Our on-device computational overhead analysis on a commodity off-the-shelf (COTS) smartphone shows that ADAPT^2 completes adaptation with unobtrusive latency (within three minutes) and only 9.54% memory consumption, demonstrating the computational efficiency of the proposed method.
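
The abstract describes user-side adaptation as replaying a self-supervised objective on a few unlabeled samples from the end-user. The sketch below illustrates that general idea only; it is not the authors' code. The Encoder architecture, the masked-reconstruction pretext task, the ssl_replay_adapt function, and all shapes and hyperparameters are assumptions for illustration, and the paper's meta-learned initialization and actual pretext task are not reproduced here.

```python
# Illustrative sketch only -- not the authors' implementation. The encoder,
# the masked-reconstruction pretext task, and all hyperparameters are assumed.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy 1D-CNN feature extractor for windowed sensor data of shape (B, C, T)."""
    def __init__(self, in_ch=6, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)  # (B, feat_dim, T)

def ssl_replay_adapt(encoder, user_windows, steps=50, mask_ratio=0.5, lr=1e-3):
    """Replay a self-supervised pretext task (here: masked reconstruction)
    on a handful of unlabeled user windows to adapt a pre-trained encoder."""
    in_ch = user_windows.shape[1]
    decoder = nn.Conv1d(64, in_ch, kernel_size=1)  # lightweight reconstruction head
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(steps):
        # Randomly mask time steps and train to reconstruct the masked parts.
        mask = (torch.rand(user_windows.shape[0], 1, user_windows.shape[2]) < mask_ratio).float()
        corrupted = user_windows * (1 - mask)
        recon = decoder(encoder(corrupted))
        loss = ((recon - user_windows) ** 2 * mask).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return encoder

if __name__ == "__main__":
    enc = Encoder()                    # stands in for the pre-trained sensing model
    few_shot = torch.randn(8, 6, 128)  # a few unlabeled windows from the end-user
    ssl_replay_adapt(enc, few_shot)
```

Because only unlabeled user data and a few gradient steps are involved, this style of adaptation is what keeps the on-device cost low, consistent with the latency and memory figures reported in the abstract.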

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2404.15305
Document Type:
Working Paper