
Graph-based Knowledge Distillation: A survey and experimental evaluation

Authors
Jing Liu,Tongya Zheng,Guanzheng Zhang,Qinfen Hao
Source
Journal: Cornell University - arXiv · Cited by: 4
Identifier
DOI: 10.48550/arxiv.2302.14643
Abstract

Graphs, such as citation networks, social networks, and transportation networks, are prevalent in the real world. Graph Neural Networks (GNNs) have gained widespread attention for their strong expressiveness and exceptional performance in various graph applications. However, the efficacy of GNNs relies heavily on sufficient data labels and complex network models, with the former being hard to obtain and the latter being costly to compute. To address the scarcity of labeled data and the high complexity of GNNs, Knowledge Distillation (KD) has been introduced to enhance existing GNNs. This technique transfers the soft-label supervision of a large teacher model to a small student model while maintaining prediction performance. This survey offers a comprehensive overview of Graph-based Knowledge Distillation methods, systematically categorizing and summarizing them while discussing their limitations and future directions. The paper first introduces the background of graphs and KD. It then provides a comprehensive summary of three types of Graph-based Knowledge Distillation methods: Graph-based Knowledge Distillation for deep neural networks (DKD), Graph-based Knowledge Distillation for GNNs (GKD), and Self-Knowledge Distillation based Graph-based Knowledge Distillation (SKD). Each type is further divided into knowledge distillation methods based on the output layer, the middle layer, and the constructed graph. The ideas behind the various algorithms are then analyzed and compared, and the advantages and disadvantages of each algorithm are summarized with support from experimental results. In addition, applications of graph-based knowledge distillation in CV, NLP, RS, and other fields are listed. Finally, graph-based knowledge distillation is summarized and future directions are discussed. We have also released related resources at https://github.com/liujing1023/Graph-based-Knowledge-Distillation.
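To make the soft-label transfer described in the abstract concrete, below is a minimal sketch of output-layer knowledge distillation in PyTorch. It is illustrative only, not the paper's method: the function name, temperature T, and mixing weight alpha are assumptions, and the teacher/student logits are presumed to come from models the reader already has.

```python
# Minimal sketch of output-layer (soft-label) knowledge distillation.
# Hypothetical helper for illustration; T and alpha are assumed values,
# not settings from the surveyed paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL distillation.

    student_logits, teacher_logits: [num_nodes, num_classes]
    labels: [num_nodes] ground-truth class indices
    T: temperature that softens the teacher's distribution
    alpha: weight between the hard-label and soft-label terms
    """
    # Hard-label supervision on the (possibly scarce) labeled nodes.
    hard_loss = F.cross_entropy(student_logits, labels)
    # Soft-label supervision: match the student's tempered distribution
    # to the teacher's. The T*T factor keeps gradient magnitudes
    # comparable across temperatures (Hinton et al., 2015).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard_loss + (1 - alpha) * soft_loss
```

In the GKD setting surveyed here, the teacher would be a large GNN and the student a smaller GNN (or even an MLP) evaluated on the same nodes; the middle-layer and constructed-graph variants categorized in the paper instead add losses on hidden representations or on graph structure.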
