STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression

Authors
Tongtong Su, Jinsong Zhang, Zou Yu, Gang Wang, Xiaoguang Liu
Published in
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34 (12): 10051-10064 | Cited by: 8
Identifier
DOI: 10.1109/tnnls.2022.3164264
Abstract

Knowledge distillation (KD) transfers discriminative knowledge from a large and complex model (the teacher) to a smaller and faster one (the student). Existing advanced KD methods are limited to fixed feature extraction paradigms that capture the teacher's structural knowledge to guide the training of the student, and thus often fail to transfer comprehensive knowledge to the student. To this end, in this article we propose a new approach, synchronous teaching knowledge distillation (STKD), which integrates online teaching and offline teaching to transfer rich and comprehensive knowledge to the student. In the online learning stage, a blockwise unit is designed to distill intermediate-level and high-level knowledge, enabling bidirectional guidance between the teacher and student networks. Intermediate-level information interaction provides additional supervisory signals to the student network and helps improve the quality of the final predictions. In the offline learning stage, STKD applies a pretrained teacher to further improve performance and accelerate training by providing prior knowledge. Trained simultaneously, the student acquires multilevel and comprehensive knowledge by incorporating online and offline teaching, which combines the advantages of different KD strategies. Experimental results on the SVHN, CIFAR-10, CIFAR-100, and ImageNet ILSVRC 2012 real-world datasets show that the proposed method achieves significant performance improvements over state-of-the-art methods, with a favorable balance of accuracy and model size. Code for STKD is provided at https://github.com/nanxiaotong/STKD.
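To make the combination of offline and online teaching described above concrete, the sketch below shows, in PyTorch, how a student objective might combine a standard task loss, a distillation term against a frozen pretrained teacher (offline teaching), and a mutual-learning term against a jointly trained teacher (online teaching). This is a minimal illustration rather than the authors' implementation (see the linked repository for that): the loss names, the weights `alpha`, `beta`, the temperature `T`, and the omission of the blockwise intermediate-feature terms are all simplifying assumptions.

```python
# Minimal sketch of a synchronous-teaching-style objective, assuming
# temperature-scaled KL divergence as the distillation term.
import torch
import torch.nn.functional as F

def kd_kl(student_logits, teacher_logits, T=4.0):
    """Temperature-scaled KL divergence between softened predictions."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def stkd_like_losses(student, online_teacher, frozen_teacher, x, y,
                     alpha=0.5, beta=0.5, T=4.0):
    s_logits = student(x)
    t_logits = online_teacher(x)          # trained jointly with the student
    with torch.no_grad():
        p_logits = frozen_teacher(x)      # pretrained teacher: prior knowledge

    # Student objective: task loss + offline KD + online KD.
    loss_student = (F.cross_entropy(s_logits, y)
                    + alpha * kd_kl(s_logits, p_logits, T)
                    + beta * kd_kl(s_logits, t_logits.detach(), T))

    # Online teacher objective: task loss + guidance from the student,
    # giving the bidirectional exchange of online teaching.
    loss_teacher = (F.cross_entropy(t_logits, y)
                    + beta * kd_kl(t_logits, s_logits.detach(), T))
    return loss_student, loss_teacher
```

In a training loop, `loss_student` and `loss_teacher` would be backpropagated through their respective networks each iteration, so both models update simultaneously while the frozen teacher only supplies targets.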