Improving Knowledge Distillation via Head and Tail Categories

Authors
Liuchi Xu, Jin Ren, Zhenhua Huang, Wei-Shi Zheng, Yunwen Chen
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34(5): 3465-3480 · Citations: 6
Identifier
DOI: 10.1109/TCSVT.2023.3325814
Abstract

Knowledge distillation (KD) is a technique that transfers "dark knowledge" from a deep teacher network (teacher) to a shallow student network (student). Despite significant advances in KD, existing work has not adequately mined two crucial types of knowledge: 1) the knowledge of head categories, which represents the relationship between the target category and its similar categories; our findings reveal that this highly similar (complex) knowledge is essential for improving the student's performance; and 2) the effectively utilized knowledge of tail categories; existing studies often treat the non-target categories collectively without sufficiently considering the effectiveness of knowledge from tail categories. To tackle these challenges, we reformulate classical KD (ReKD) into two components: Top-K Inter-class Similar Distillation (TISD) and Non-Top-K Inter-class Discriminability (NTID). First, TISD captures the knowledge of head categories and imparts it to the student; our experimental results verify that TISD is particularly effective at transferring head-category knowledge, even in fine-grained classification. Second, we theoretically show that the weighting coefficient of NTID increases with the Top-K probability, leading to stronger suppression of knowledge transfer for tail categories; this observation explains why difficult samples are more informative than simple ones. To better utilize both types of knowledge, we optimize TISD and NTID with different weighting coefficients, thereby enhancing the student's ability to learn this valuable knowledge from both head and tail categories. Furthermore, extensive experimental results demonstrate that ReKD achieves state-of-the-art performance on various image classification datasets, including CIFAR-100, Tiny-ImageNet, and ImageNet-1K, as well as on object detection and instance segmentation with MS-COCO.
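The decomposition behind TISD and NTID can be made concrete with a DKD-style regrouping of the classical KD objective. The identity below is one plausible reading inferred from the abstract, not the paper's verified derivation: let H be the teacher's Top-K classes, let m^t be the teacher's probability mass on H (m^s analogously), let b = (m, 1 - m) be the binary head/tail split, and let the hatted and tilded distributions be the teacher/student probabilities renormalized within the head and tail groups. Then

```latex
\mathrm{KL}\left(p^{t}\,\|\,p^{s}\right)
  = \mathrm{KL}\left(b^{t}\,\|\,b^{s}\right)
  + m^{t}\,\mathrm{KL}\left(\hat{p}^{t}\,\|\,\hat{p}^{s}\right)
  + \left(1 - m^{t}\right)\mathrm{KL}\left(\tilde{p}^{t}\,\|\,\tilde{p}^{s}\right),
  \quad m^{t} = \sum_{i \in H} p_{i}^{t}.
```

Under this reading, the tail (NTID-like) term is coupled to the Top-K mass through its coefficient, so the more probability the teacher concentrates on its Top-K classes, the more the transfer of tail knowledge is implicitly suppressed; hard samples, where that mass is small, therefore carry more tail information, consistent with the abstract's claim, and giving the two terms independent weights removes this built-in suppression. A minimal PyTorch sketch of such a decomposed loss follows; the function name rekd_sketch and the defaults k, T, alpha, beta are hypothetical (beta > alpha borrows DKD's convention) and are not values from the paper.

```python
import torch
import torch.nn.functional as F

def rekd_sketch(logits_s, logits_t, k=5, T=4.0, alpha=1.0, beta=8.0):
    """Hypothetical Top-K / non-Top-K decomposition of the KD loss.

    alpha weights the head (TISD-like) terms, beta the tail (NTID-like)
    term; the grouping and all defaults are assumptions, not the paper's.
    """
    p_t = F.softmax(logits_t / T, dim=1)     # teacher probabilities
    p_s = F.softmax(logits_s / T, dim=1)     # student probabilities

    # Head mask: the teacher's Top-K classes per sample
    # (the target plus its most similar categories).
    idx = p_t.topk(k, dim=1).indices
    head = torch.zeros_like(p_t, dtype=torch.bool).scatter_(1, idx, True)

    eps = 1e-8
    m_t = (p_t * head).sum(1, keepdim=True)  # teacher Top-K mass
    m_s = (p_s * head).sum(1, keepdim=True)  # student Top-K mass

    # Binary KL between the (head, tail) mass splits.
    kl_bin = (m_t * (m_t.clamp_min(eps).log() - m_s.clamp_min(eps).log())
              + (1 - m_t) * ((1 - m_t).clamp_min(eps).log()
                             - (1 - m_s).clamp_min(eps).log())).mean()

    def within(mask, mass_t, mass_s):
        # KL between teacher/student distributions renormalized in the group.
        q_t = (p_t * mask) / mass_t.clamp_min(eps)
        q_s = (p_s * mask) / mass_s.clamp_min(eps)
        return F.kl_div(q_s.clamp_min(eps).log(), q_t, reduction="batchmean")

    tisd = kl_bin + within(head, m_t, m_s)   # head-category knowledge
    ntid = within(~head, 1 - m_t, 1 - m_s)   # tail-category knowledge
    return (alpha * tisd + beta * ntid) * (T ** 2)
```

In training, this term would typically be added to the ordinary cross-entropy loss, e.g. loss = F.cross_entropy(logits_s, labels) + rekd_sketch(logits_s, logits_t.detach()); detaching the teacher logits keeps gradients flowing only into the student.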