Graph-based Knowledge Distillation: A survey and experimental evaluation

Keywords: computer science, distillation, graphs, machine learning, artificial intelligence, artificial neural networks, data mining, theoretical computer science, chemistry, organic chemistry
Authors
Jing Liu, Tongya Zheng, Guanzheng Zhang, Qinfen Hao
Source
Journal: Cornell University - arXiv · Cited by: 4
Identifier
DOI: 10.48550/arxiv.2302.14643
Abstract

Graphs, such as citation networks, social networks, and transportation networks, are prevalent in the real world. Graph Neural Networks (GNNs) have gained widespread attention for their robust expressiveness and exceptional performance in various graph applications. However, the efficacy of GNNs relies heavily on sufficient labeled data and complex network models, with the former being hard to obtain and the latter costly to compute. To address the scarcity of labeled data and the high complexity of GNNs, Knowledge Distillation (KD) has been introduced to enhance existing GNNs. This technique transfers the soft-label supervision of a large teacher model to a small student model while maintaining prediction performance. This survey offers a comprehensive overview of Graph-based Knowledge Distillation methods, systematically categorizing and summarizing them while discussing their limitations and future directions. The paper first introduces the background of graphs and KD. It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods, namely Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation based Graph-based Knowledge Distillation (SKD). Each type is further divided into knowledge distillation methods based on the output layer, the middle layer, and the constructed graph. Subsequently, the ideas behind the various algorithms are analyzed and compared, and the advantages and disadvantages of each algorithm are discussed with supporting experimental results. In addition, applications of graph-based knowledge distillation in CV, NLP, RS, and other fields are listed. Finally, graph-based knowledge distillation is summarized and future directions are discussed. Related resources are released at https://github.com/liujing1023/Graph-based-Knowledge-Distillation.
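The soft-label supervision the abstract refers to is commonly implemented as a temperature-scaled KL divergence between the teacher's and student's output distributions (Hinton et al., 2015). The sketch below is a minimal, generic illustration of that loss, not the specific formulation used by any method surveyed in the paper; the function names and logit values are hypothetical.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, T=2.0):
    """Soft-label distillation loss: KL(teacher || student) at temperature T,
    scaled by T^2 so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, T)  # teacher's soft labels
    q = softmax(student_logits, T)  # student's soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

When the student matches the teacher exactly the loss is zero; any divergence in the predicted distributions yields a positive penalty, which is what drives the small student toward the large teacher's behavior.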