Authors
Jiayuan Zhang, Xuefeng Liu, Yukang Zhang, Guogang Zhu, Jianwei Niu, Shaojie Tang
Identifier
DOI: 10.1145/3637528.3671908
Abstract
Deep learning models often suffer performance degradation when test data diverges from training data. Test-Time Adaptation (TTA) aims to adapt a trained model to the test data distribution using unlabeled test data streams. In many real-world applications, it is quite common for the trained model to be deployed across multiple devices simultaneously. Although each device can execute TTA independently, doing so fails to leverage information from the test data of other devices. To address this problem, we introduce Federated Learning (FL) to TTA to facilitate on-the-fly collaboration among devices during test time. The workflow involves clients (i.e., the devices) executing TTA locally, uploading their updated models to a central server for aggregation, and downloading the aggregated model for inference. However, implementing FL in TTA presents many challenges, especially in establishing inter-client collaboration in dynamic environments, where the test data distribution on different clients changes over time in different ways. To tackle these challenges, we propose a server-side Temporal-Spatial Aggregation (TSA) method. TSA utilizes a temporal-spatial attention module to capture intra-client temporal correlations and inter-client spatial correlations. To further improve robustness against temporal-spatial heterogeneity, we propose a heterogeneity-aware augmentation method and optimize the module using a self-supervised approach. More importantly, TSA can be implemented as a plug-in to TTA methods in distributed environments. Experiments on multiple datasets demonstrate that TSA outperforms existing methods and exhibits robustness across various levels of heterogeneity. The code is available at https://github.com/ZhangJiayuan-BUAA/FedTSA.
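The abstract describes a round-based workflow: each client adapts its copy of the model on its own unlabeled test batch, uploads the adapted parameters, and the server aggregates them before sending the result back. The following is a minimal sketch of that round structure under simplified assumptions, not the authors' implementation (which is at https://github.com/ZhangJiayuan-BUAA/FedTSA): the local step uses a generic entropy-minimization TTA objective, and the server-side step substitutes a cosine-similarity weighting for the paper's learned temporal-spatial attention module, which additionally models intra-client temporal correlations and is trained self-supervised. All names (federated_tta_round, attention_aggregate, etc.) are hypothetical.

```python
# Hypothetical sketch of one federated test-time adaptation round with
# attention-style server aggregation. Illustrative only; see the official
# repository (https://github.com/ZhangJiayuan-BUAA/FedTSA) for the real TSA.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


def entropy_minimization_step(model, x, lr=1e-3):
    """One local TTA step: adapt the model by minimizing prediction entropy
    on an unlabeled test batch (a common TTA objective)."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    probs = model(x).softmax(dim=-1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()
    return model


def flatten_params(model):
    """Concatenate all parameters into one vector for server-side handling."""
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])


def load_flat_params(model, flat):
    """Write a flat parameter vector back into the model in place."""
    offset = 0
    with torch.no_grad():
        for p in model.parameters():
            n = p.numel()
            p.copy_(flat[offset:offset + n].view_as(p))
            offset += n


def attention_aggregate(client_vectors):
    """Toy inter-client (spatial) aggregation: weight clients by average
    cosine similarity to the others; a stand-in for the paper's learned
    temporal-spatial attention module."""
    stacked = torch.stack(client_vectors)            # [num_clients, dim]
    normed = F.normalize(stacked, dim=-1)
    affinity = normed @ normed.t()                   # inter-client affinity
    weights = affinity.mean(dim=-1).softmax(dim=0)   # one weight per client
    return (weights.unsqueeze(-1) * stacked).sum(dim=0)


def federated_tta_round(global_model, client_batches):
    """One round: clients adapt locally on their own test batches, the server
    aggregates the uploaded parameters, and the result replaces the model."""
    client_vectors = []
    for x in client_batches:
        local_model = copy.deepcopy(global_model)    # per-client copy
        entropy_minimization_step(local_model, x)    # local TTA
        client_vectors.append(flatten_params(local_model))  # "upload"
    aggregated = attention_aggregate(client_vectors)         # server step
    load_flat_params(global_model, aggregated)                # "download"
    return global_model


if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    batches = [torch.randn(8, 16) for _ in range(3)]  # 3 simulated clients
    federated_tta_round(model, batches)
```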