Adjacency matrix
Computer science
Inference
Graph
Matrix representation
Theoretical computer science
Representation (politics)
Knowledge representation and reasoning
Data modeling
Artificial intelligence
Machine learning
Data mining
Organic chemistry
Chemistry
Group (periodic table)
Political science
Database
Politics
Law
Identifier
DOI: 10.1109/iccwamtip60502.2023.10387109
Abstract
Dynamic graphs are prevalent data structures in the real world and provide powerful modeling capabilities. However, popular methods for dynamic graph representation learning rely heavily on graph data: models typically take the entire graph's adjacency matrix as input and aggregate information over node neighborhoods, which complicates model deployment and slows inference. This paper combines knowledge distillation with dynamic graph representation learning, transferring the graph-related knowledge learned by a teacher model to a student model. Link prediction experiments demonstrate that introducing knowledge distillation yields a well-performing, readily deployable student model comparable to the teacher. Moreover, the student model no longer depends on graph data, eliminating the aggregation of node neighborhood information through the input adjacency matrix and thereby improving inference speed.
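The core idea in the abstract can be illustrated with a minimal sketch: a graph-free student is trained to mimic a teacher's soft link scores, so that at inference time no adjacency matrix or neighborhood aggregation is needed. Everything below is a hypothetical stand-in, not the paper's actual models: the "teacher" is simulated by a fixed projection producing soft link probabilities, and the student is a simple logistic scorer over node-pair features fitted with gradient descent on a soft-target (distillation) loss.

```python
import numpy as np

# Hypothetical sketch of response-based distillation for link prediction.
# The real teacher would be a dynamic graph model that aggregates over
# the adjacency matrix; here a fixed projection stands in for it.
rng = np.random.default_rng(0)

n_pairs, dim = 200, 8
X = rng.normal(size=(n_pairs, dim))        # node-pair features (assumed input)

# Stand-in teacher: soft link probabilities from a fixed weight vector.
w_teacher = rng.normal(size=dim)
t = 1.0 / (1.0 + np.exp(-X @ w_teacher))   # teacher's soft targets

# Graph-free student: logistic scorer trained to match the teacher's
# soft targets (MSE distillation loss), no adjacency matrix required.
w = np.zeros(dim)
lr = 0.5
for _ in range(500):
    s = 1.0 / (1.0 + np.exp(-X @ w))                    # student scores
    grad = X.T @ ((s - t) * s * (1.0 - s)) / n_pairs    # d(MSE)/dw
    w -= lr * grad

mse = float(np.mean((1.0 / (1.0 + np.exp(-X @ w)) - t) ** 2))
print(f"distillation MSE: {mse:.4f}")
```

Once trained, the student scores a candidate link with a single matrix-vector product over pair features, which is why the abstract's student model can skip the adjacency-matrix input entirely and infer faster than the teacher.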