ADAPT^2: Adapting Pre-Trained Sensing Models to End-Users via Self-Supervision Replay
arXiv (2024)
Abstract
Self-supervised learning has emerged as a method for utilizing massive
unlabeled data for pre-training models, providing an effective feature
extractor for various mobile sensing applications. However, when deployed to
end-users, these models encounter significant domain shifts attributed to user
diversity. We investigate the performance degradation that occurs when
self-supervised models are fine-tuned in heterogeneous domains. To address the
issue, we propose ADAPT^2, a few-shot domain adaptation framework for
personalizing self-supervised models. ADAPT^2 applies self-supervised
meta-learning during initial model pre-training, followed by user-side model
adaptation that replays the self-supervision objective on user-specific data. This
allows models to adjust their pre-trained representations to the user with only
a few samples. Evaluation with four benchmarks demonstrates that ADAPT^2
outperforms existing baselines by an average F1-score of 8.8%. Our
computational overhead analysis on a commodity off-the-shelf (COTS) smartphone
shows that ADAPT^2 completes adaptation with an unobtrusive latency (within three
minutes) and only a 9.54% overhead, demonstrating the
efficiency of the proposed method.
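The core idea of "self-supervision replay" can be illustrated with a minimal sketch (not the authors' code): after pre-training, the same self-supervised objective is re-run on a few unlabeled windows from the target user to adapt the feature extractor. The linear encoder/decoder, the masked-reconstruction objective, and all sizes and step counts below are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D, H = 16, 8  # sensor-window dimension, feature dimension (assumed)

# Stand-ins for pre-trained encoder/decoder weights (in the real framework
# these would come from self-supervised meta-learning on massive unlabeled data).
W_enc = rng.normal(scale=0.1, size=(D, H))
W_dec = rng.normal(scale=0.1, size=(H, D))

# Fixed keep-mask: hide the second half of each window and reconstruct it.
M = np.ones(D)
M[D // 2:] = 0.0

def ssl_loss_and_grads(X, W_enc, W_dec):
    """Masked-reconstruction self-supervision: encode the masked input,
    reconstruct the full window, and return MSE loss with gradients."""
    Z = (X * M) @ W_enc          # features of the masked input
    X_hat = Z @ W_dec            # reconstruction of the original window
    err = X_hat - X
    loss = float(np.mean(err ** 2))
    g_dec = Z.T @ err * (2.0 / err.size)
    g_enc = (X * M).T @ (err @ W_dec.T) * (2.0 / err.size)
    return loss, g_enc, g_dec

# Few-shot, user-side adaptation: replay the self-supervised objective
# on a handful of the target user's unlabeled sensor windows.
X_user = rng.normal(size=(5, D))
lr = 0.05
loss0, _, _ = ssl_loss_and_grads(X_user, W_enc, W_dec)
for _ in range(200):
    loss, g_enc, g_dec = ssl_loss_and_grads(X_user, W_enc, W_dec)
    W_enc -= lr * g_enc
    W_dec -= lr * g_dec
print(loss < loss0)  # replaying self-supervision lowers the loss on user data
```

The adapted encoder `W_enc` would then feed the downstream classifier; only a few unlabeled user samples are needed, matching the few-shot setting described in the abstract.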