Computer science
Adaptability
Domain adaptation
Adaptation (eye)
Domain (mathematical analysis)
Segmentation
Generalization
Artificial intelligence
Source code
Coding (set theory)
Test data
Machine learning
Algorithm
Pattern recognition (psychology)
Mathematics
Software engineering
Ecology
Mathematical analysis
Physics
Set (abstract data type)
Classifier (UML)
Optics
Biology
Programming language
Operating system
Authors
Jiayi Zhu, Bart Bolsterlee, Brian V. Y. Chow, Yang Song, Erik Meijering
Identifier
DOI:10.1007/978-3-031-43898-1_63
Abstract
Continual test-time adaptation (CTTA) aims to continuously adapt a source-trained model to a target domain with minimal performance loss while assuming no access to the source data. Typically, source models are trained with empirical risk minimization (ERM) and assumed to perform reasonably on the target domain to allow for further adaptation. However, ERM-trained models often fail to perform adequately on a severely drifted target domain, resulting in unsatisfactory adaptation results. To tackle this issue, we propose a generalizable CTTA framework. First, we incorporate domain-invariant shape modeling into the model and train it using domain-generalization (DG) techniques, promoting target-domain adaptability regardless of the severity of the domain shift. Then, an uncertainty and shape-aware mean teacher network performs adaptation with uncertainty-weighted pseudo-labels and shape information. Lastly, small portions of the model's weights are stochastically reset to the initial domain-generalized state at each adaptation step, preventing the model from 'diving too deep' into any specific test samples. The proposed method demonstrates strong continual adaptability and outperforms its peers on three cross-domain segmentation tasks. Code is available at https://github.com/ThisGame42/CTTA.
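Two of the mechanisms the abstract describes, the mean-teacher update and the stochastic weight reset, can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: network parameters are stood in for by dicts of floats, and the function names, the EMA coefficient `alpha`, and the restore probability `p` are illustrative assumptions.

```python
import math
import random

def ema_update(teacher, student, alpha=0.999):
    # Mean-teacher update: the teacher's weights track an exponential
    # moving average of the student's weights after each adaptation step.
    return {k: alpha * teacher[k] + (1 - alpha) * student[k] for k in teacher}

def uncertainty_weight(entropy):
    # Down-weight a pseudo-label as its predictive entropy grows,
    # so uncertain teacher predictions contribute less to the loss.
    return math.exp(-entropy)

def stochastic_restore(weights, initial, p=0.01, rng=None):
    # With probability p per weight, reset to the initial domain-generalized
    # value, keeping the model from drifting too far toward any test sample.
    rng = rng or random.Random(0)
    return {k: (initial[k] if rng.random() < p else w) for k, w in weights.items()}
```

For example, `ema_update({'w': 1.0}, {'w': 0.0}, alpha=0.9)` yields `{'w': 0.9}`, and with `p=1.0` the restore function returns the initial weights unchanged, while `p=0.0` leaves the adapted weights intact.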