Computer science
Discriminant
Process (computing)
Code (set theory)
Scheme (mathematics)
Machine learning
Feature (linguistics)
Artificial intelligence
Mathematical analysis
Linguistics
Philosophy
Mathematics
Set (abstract data type)
Programming language
Operating system
Authors
Tongtong Su, Jinsong Zhang, Zou Yu, Gang Wang, Xiaoguang Liu
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-12-01
Volume/Issue: 34 (12): 10051-10064
Cited by: 8
Identifier
DOI:10.1109/tnnls.2022.3164264
Abstract
Knowledge distillation (KD) transfers discriminative knowledge from a large and complex model (known as the teacher) to a smaller and faster one (known as the student). Existing advanced KD methods, limited to fixed feature-extraction paradigms that capture the teacher's structural knowledge to guide the training of the student, often fail to transfer comprehensive knowledge to the student. Toward this end, in this article, we propose a new approach, synchronous teaching knowledge distillation (STKD), which integrates online teaching and offline teaching to transfer rich and comprehensive knowledge to the student. In the online learning stage, a blockwise unit is designed to distill intermediate-level and high-level knowledge, enabling bidirectional guidance between the teacher and student networks. Intermediate-level information interaction provides more supervisory signals to the student network and helps enhance the quality of the final predictions. In the offline learning stage, the STKD approach applies a pretrained teacher to further improve performance and accelerate training by providing prior knowledge. Trained simultaneously, the student learns multilevel and comprehensive knowledge by incorporating online and offline teaching, so STKD combines the advantages of different KD strategies. Experimental results on the SVHN, CIFAR-10, CIFAR-100, and ImageNet ILSVRC 2012 real-world datasets show that the proposed method achieves significant performance improvements over state-of-the-art methods, with a particularly satisfying trade-off between accuracy and model size. Code for STKD is provided at https://github.com/nanxiaotong/STKD.
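To make the two ingredients of the abstract concrete, the sketch below shows (a) the classic offline KD loss (cross-entropy on labels plus KL divergence between temperature-softened teacher and student logits, as in Hinton et al.) and (b) a simple intermediate-level feature-matching term of the kind used for blockwise supervision. This is a generic, minimal PyTorch sketch of the standard building blocks, not the authors' STKD implementation; the function names, `T`, and `alpha` are illustrative assumptions — see the linked repository for the actual method.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Offline KD: weighted sum of soft-target KL and hard-label CE.

    The KL term is scaled by T^2 so its gradient magnitude stays
    comparable across temperatures (standard practice).
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def feature_hint_loss(student_feat, teacher_feat):
    """Intermediate-level supervision: L2 distance between feature maps.

    Assumes matching shapes; in practice a 1x1 conv adapter is added
    when student and teacher channel counts differ.
    """
    return F.mse_loss(student_feat, teacher_feat)
```

In a synchronous-teaching setup along the lines the abstract describes, a student update would combine the hard-label loss, the soft loss against the pretrained (offline) teacher, and blockwise feature losses against the online-trained teacher, with weights chosen per dataset.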