Diversified branch fusion for self-knowledge distillation

Authors
Zuxiang Long, Fuyan Ma, Bin Sun, Mingkui Tan, Shutao Li
Source
Journal: Information Fusion [Elsevier BV]
Volume/Issue: 90, pp. 12-22 · Citations: 2
Identifier
DOI: 10.1016/j.inffus.2022.09.007
Abstract

Knowledge distillation improves the performance of a compact student network by adding supervision from a pre-trained, cumbersome teacher network during training. To avoid the resource cost of acquiring an extra teacher network, self-knowledge distillation designs a multi-branch network architecture with shared layers for the teacher and student models, which are trained collaboratively in a one-stage manner. However, this approach ignores the knowledge of shallow branches and rarely provides diverse knowledge for effective collaboration among different branches. To address these two shortcomings, this paper proposes a novel Diversified Branch Fusion approach for Self-Knowledge Distillation (DBFSKD). First, we design lightweight networks attached to the middle layers of the backbone, which capture discriminative information through global-local attention. We then introduce a diversity loss between different branches to encourage them to explore diverse knowledge. Moreover, the diverse knowledge is further integrated into two knowledge sources by a Selective Feature Fusion (SFF) and a Dynamic Logits Fusion (DLF). In this way, the significant knowledge of shallow branches is efficiently utilized, and all branches learn from each other through the fused knowledge sources. Extensive experiments with various backbone structures on four public datasets (CIFAR100, Tiny-ImageNet200, ImageNet, and RAF-DB) show the superior performance of the proposed method over other methods. More importantly, DBFSKD achieves even better performance with lower resource consumption than the baseline.

Highlights
• A diversified branch fusion approach is proposed for self-knowledge distillation.
• Shallow branches provide complementary information for the deep ones.
• Feature- and logits-level fusion provides a richer knowledge source for distillation.
• A diversity loss encourages the branches to explore diverse knowledge.
• DBFSKD obtains state-of-the-art results in the facial expression recognition application.
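The abstract describes a multi-branch self-distillation scheme: auxiliary classifiers on intermediate layers, a diversity loss that pushes branches apart, and fused logits acting as a shared teacher. Below is a minimal PyTorch sketch of such a training loss, written under my own assumptions; the names (AuxBranch, fuse_logits, diversity_loss, self_kd_step) and the specific loss forms are illustrative, not the authors' implementation, and the SFF and global-local attention components are omitted.

```python
# Minimal sketch of multi-branch self-knowledge distillation with
# logits-level fusion and a pairwise diversity loss (assumed forms).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AuxBranch(nn.Module):
    """Lightweight classifier attached to an intermediate feature map."""
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(in_channels, num_classes),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        return self.head(feat)


def diversity_loss(logits_list, tau: float = 1.0) -> torch.Tensor:
    """Penalize pairwise similarity of softened branch predictions,
    so different branches are encouraged to capture diverse knowledge."""
    loss, n = 0.0, len(logits_list)
    for i in range(n):
        for j in range(i + 1, n):
            p_i = F.softmax(logits_list[i] / tau, dim=1)
            p_j = F.softmax(logits_list[j] / tau, dim=1)
            loss = loss + (p_i * p_j).sum(dim=1).mean()  # large when branches agree
    return loss / max(n * (n - 1) / 2, 1)


def fuse_logits(logits_list, weights: torch.Tensor) -> torch.Tensor:
    """Logits fusion: softmax-normalized weighted average over branch logits."""
    weights = F.softmax(weights, dim=0)
    return sum(w * z for w, z in zip(weights, logits_list))


def self_kd_step(branch_logits, labels, fusion_weights,
                 tau: float = 3.0, alpha: float = 1.0, beta: float = 0.1):
    """Per-step loss: CE per branch + KD from the fused teacher + diversity."""
    ce = sum(F.cross_entropy(z, labels) for z in branch_logits)
    fused = fuse_logits(branch_logits, fusion_weights).detach()  # fused teacher signal
    kd = sum(
        F.kl_div(F.log_softmax(z / tau, dim=1),
                 F.softmax(fused / tau, dim=1),
                 reduction="batchmean") * tau * tau
        for z in branch_logits
    )
    return ce + alpha * kd + beta * diversity_loss(branch_logits)


if __name__ == "__main__":
    # Toy check with random logits from three hypothetical branches.
    torch.manual_seed(0)
    logits = [torch.randn(8, 100) for _ in range(3)]
    labels = torch.randint(0, 100, (8,))
    w = nn.Parameter(torch.zeros(3))
    print(float(self_kd_step(logits, labels, w)))
```

In this sketch the fused logits are detached so they act as a fixed teacher within each step; whether the paper back-propagates through the fusion weights is not stated in the abstract.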