Normalization (linguistics)
Computer science
Segmentation
Consistency (knowledge bases)
Artificial intelligence
Machine learning
Distillation
Feature (linguistics)
Pattern recognition (psychology)
Linguistics
Chemistry
Philosophy
Organic chemistry
Authors
Jianlong Yuan, Jinchao Ge, Zhibin Wang, Yifan Liu
Identifiers
DOI:10.1145/3581783.3611906
Abstract
Consistency regularization has been widely studied in recent semi-supervised semantic segmentation methods, and promising performance has been achieved. In this work, we propose a new consistency regularization framework, termed mutual knowledge distillation (MKD), combined with data and feature augmentation. We introduce two auxiliary mean-teacher models based on consistency regularization. More specifically, we use the pseudo-labels generated by a mean teacher to supervise the student network to achieve a mutual knowledge distillation between the two branches. In addition to using image-level strong and weak augmentation, we also discuss feature augmentation. This involves considering various sources of knowledge to distill the student network. Thus, we can significantly increase the diversity of the training samples. Experiments on public benchmarks show that our framework outperforms previous state-of-the-art (SOTA) methods under various semi-supervised settings. Code is available at https://github.com/jianlong-yuan/semi-mmseg.
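The abstract names two core ingredients: a mean teacher (an exponential moving average of the student's weights) and mutual distillation, where each branch's teacher generates pseudo-labels that supervise the *other* branch's student. The minimal sketch below illustrates these two mechanisms in plain Python; all function names and the per-pixel formulation are illustrative assumptions, not the authors' released code (see the linked repository for that).

```python
import math

def ema_update(teacher_w, student_w, alpha=0.99):
    # Mean-teacher update: each teacher weight is an exponential
    # moving average (EMA) of the corresponding student weight.
    return [alpha * t + (1.0 - alpha) * s for t, s in zip(teacher_w, student_w)]

def pseudo_label(teacher_logits):
    # Hard pseudo-label for one pixel: argmax over the teacher's class logits.
    return max(range(len(teacher_logits)), key=lambda c: teacher_logits[c])

def cross_entropy(student_logits, label):
    # Cross-entropy of the student's prediction against a (pseudo-)label,
    # computed via a numerically stable log-sum-exp.
    m = max(student_logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in student_logits))
    return log_z - student_logits[label]

def mutual_distillation_loss(student_a, student_b, teacher_a, teacher_b):
    # Cross supervision between the two branches: branch A's teacher
    # labels branch B's student, and vice versa.
    loss_b = cross_entropy(student_b, pseudo_label(teacher_a))
    loss_a = cross_entropy(student_a, pseudo_label(teacher_b))
    return loss_a + loss_b
```

In the full framework each branch would also see differently augmented views of the unlabeled image (weak views for the teachers, strong views for the students), so the pseudo-labels carry knowledge across both branches and augmentation strengths.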