Authors
Xinlei Huang, Ning Jiang, Jialiang Tang, Wenqing Wu
Source
Journal: Communications in Computer and Information Science
Date: 2023-11-30
Pages: 402-413
Identifiers
DOI: 10.1007/978-981-99-8178-6_31
Abstract
Feature-based knowledge distillation utilizes features from superior and complex teacher networks as knowledge to help portable student networks improve their generalization capability. Recent feature distillation algorithms focus on various feature processing and transmission methods while ignoring the flexibility of feature selection, resulting in limited distillation effects for students. In this paper, we propose Dynamic Feature Distillation to increase the flexibility of feature distillation by dynamically managing feature transfer sites. Our method leverages Online Feature Estimation to monitor the learning status of the student network in the feature dimension. Adaptive Position Selection then dynamically updates valuable feature transmission locations for efficient feature transmission. Notably, our approach can be easily integrated as a strategy for feature management into other feature-based knowledge transfer methods to improve their performance. We conduct extensive experiments on the CIFAR-100 and Tiny-ImageNet datasets to validate the effectiveness of Dynamic Feature Distillation.
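The abstract describes two components: monitoring the student's learning status per feature position (Online Feature Estimation) and dynamically choosing which positions to transfer (Adaptive Position Selection). The sketch below illustrates the general idea only, not the paper's actual algorithm: it scores each candidate transfer position by the student–teacher feature discrepancy and keeps the top-k positions for the distillation loss. The function name, the MSE score, and the top-k rule are all assumptions for illustration.

```python
import numpy as np

def dynamic_feature_distillation_loss(student_feats, teacher_feats, top_k=2):
    """Illustrative sketch (not the paper's method): score each feature
    position by student-teacher MSE, then distill only at the top_k
    positions where the student deviates most from the teacher."""
    # Per-position mean-squared error between shape-aligned feature maps
    # (assumes any projection to matching shapes has already happened).
    per_pos = np.array([np.mean((s - t) ** 2)
                        for s, t in zip(student_feats, teacher_feats)])
    # Adaptive selection (assumption): transfer at the k worst positions.
    k = min(top_k, per_pos.size)
    idx = np.argsort(per_pos)[-k:]
    return per_pos[idx].sum(), sorted(idx.tolist())

# Toy usage: three candidate positions; positions 1 and 2 lag the most.
student = [np.zeros((2, 2)), np.ones((2, 2)), np.zeros((2, 2))]
teacher = [np.zeros((2, 2)), np.zeros((2, 2)), np.full((2, 2), 2.0)]
loss, selected = dynamic_feature_distillation_loss(student, teacher, top_k=2)
```

In a real training loop this selection would be re-evaluated periodically as the student improves, which is what makes the transfer sites "dynamic" rather than fixed at design time.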