Computer science
Distillation
Knowledge transfer
Graph
Knowledge graph
Artificial intelligence
Knowledge management
Theoretical computer science
Chemistry
Chromatography
Authors
Qian Ye,Xiaoyan Wang,Fuhui Sun,Li Pan
Identifier
DOI: 10.1109/tnnls.2025.3525699
Abstract
With the widespread application of temporal knowledge graph reasoning (TKGR) models, there is an increasing demand to reduce memory consumption and improve reasoning efficiency. Knowledge distillation (KD) is a classical approach to model compression and acceleration, and it has gradually been introduced into the TKGR domain. Through KD, the expertise of a high-capacity teacher TKGR model can be transferred to a lightweight student TKGR model. Effective transfer of reasoning knowledge faces two major challenges. The first is how to extract high-quality, high-value knowledge from the teacher so that the student achieves better learning outcomes. The second is how to encourage the teacher to adapt its teaching pattern so that the knowledge is more easily assimilated by the student. Motivated by these challenges, this article first designs a soft-label evaluation mechanism that mitigates anomaly diffusion and knowledge transfer redundancy by measuring the confidence and entropy changes of soft labels, and then proposes a mutual learning-empowered KD (MLEMKD) framework for compressing TKGR models. MLEMKD refines the distribution of knowledge by analyzing the cognitive differences between the teacher and student models on training samples, which makes the transferred knowledge easier for the student to absorb. Extensive experiments on four benchmark datasets demonstrate that MLEMKD significantly outperforms existing KD methods.
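As a minimal sketch of the general idea behind soft-label distillation with an entropy-based quality check (this is an illustrative assumption, not the actual MLEMKD mechanism; the function name `soft_label_kd_loss`, the `entropy_threshold` parameter, and the simple thresholding rule are hypothetical), a temperature-scaled KD loss over teacher soft labels might look as follows:

```python
import torch
import torch.nn.functional as F

def soft_label_kd_loss(student_logits, teacher_logits, temperature=2.0, entropy_threshold=None):
    """Generic temperature-scaled KD loss over soft labels.

    Optionally down-weights samples whose teacher soft labels have high
    entropy (i.e., low confidence), as one illustrative way to filter
    low-value knowledge. Hypothetical sketch, not the MLEMKD method.
    """
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    s_log_probs = F.log_softmax(student_logits / temperature, dim=-1)

    # Per-sample KL divergence between teacher and student soft labels.
    per_sample_kl = (t_probs * (t_probs.clamp_min(1e-12).log() - s_log_probs)).sum(dim=-1)

    if entropy_threshold is not None:
        # Teacher soft-label entropy: high entropy suggests low-confidence knowledge.
        entropy = -(t_probs * t_probs.clamp_min(1e-12).log()).sum(dim=-1)
        per_sample_kl = per_sample_kl * (entropy <= entropy_threshold).float()

    # Standard temperature^2 scaling keeps gradient magnitudes comparable.
    return (temperature ** 2) * per_sample_kl.mean()


# Usage: random logits standing in for teacher/student scores over candidate entities.
teacher_logits = torch.randn(8, 100)
student_logits = torch.randn(8, 100, requires_grad=True)
loss = soft_label_kd_loss(student_logits, teacher_logits, temperature=2.0, entropy_threshold=3.0)
loss.backward()
```

The paper's actual mechanism additionally tracks how confidence and entropy of soft labels change and exploits teacher-student cognitive differences via mutual learning; the sketch above only shows the baseline soft-label transfer that such a mechanism would refine.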