Distillation
Computer science
Graph
Chemistry
Theoretical computer science
Chromatography
Authors
Meng Jian, Tuo Wang, Z.D. Xia, Ge Shi, Richang Hong, Lifang Wu
Abstract
Prevalent recommendation techniques exploit the structure of the interaction graph to alleviate interaction sparsity when inferring users' interests. These graph models focus on extracting local structural signals to model users' interests, introducing grid-like distortion and ignoring the hierarchical, tree-like structure of the interaction graph. The learned interests therefore lack significant hierarchical signals, resulting in suboptimal recommendation performance. In this paper, we investigate geometric-augmented graph learning with hyperbolic and Euclidean geometries to capture both local structural and hierarchical knowledge from the interaction graph. A self-teaching network called geometric-augmented self-distillation (GASD) is proposed to transfer hierarchical knowledge from hyperbolic to Euclidean space. This transfer enables shrinking the network into a primary student that performs effective and efficient inference in Euclidean space, avoiding the computational burden of hyperbolic operations. Experiments on publicly available datasets demonstrate that the proposed GASD outperforms state-of-the-art models, verifying that self-distillation transfers knowledge effectively and efficiently and aggregates it adaptively for personalized recommendation.
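To make the distillation idea concrete, the following PyTorch sketch shows one plausible way to transfer ranking knowledge from a hyperbolic teacher to a Euclidean student: the teacher scores user-item pairs by negative Poincaré-ball distance, the student by Euclidean inner product, and the student is trained to match the teacher's softened score distribution with a KL divergence. This is a minimal illustration under assumed choices (Poincaré ball with curvature -1, temperature tau, the names poincare_distance and distill_loss), not the authors' GASD implementation.

import torch
import torch.nn.functional as F

def poincare_distance(u, v, eps=1e-5):
    # Geodesic distance on the Poincare ball (curvature -1); inputs must lie inside the unit ball.
    sq_u = u.pow(2).sum(dim=-1).clamp(max=1 - eps)
    sq_v = v.pow(2).sum(dim=-1).clamp(max=1 - eps)
    sq_diff = (u - v).pow(2).sum(dim=-1)
    x = 1 + 2 * sq_diff / ((1 - sq_u) * (1 - sq_v))
    return torch.acosh(x.clamp(min=1 + eps))

def distill_loss(user_e, item_e, user_h, item_h, tau=1.0):
    # Student scores: Euclidean inner products over all candidate items, shape (batch, n_items).
    s_student = user_e @ item_e.t()
    # Teacher scores: negative hyperbolic distance (closer on the ball = higher score).
    s_teacher = -poincare_distance(user_h.unsqueeze(1), item_h.unsqueeze(0))
    # Match the student's softened score distribution to the teacher's via KL divergence.
    p_teacher = F.softmax(s_teacher / tau, dim=-1)
    log_p_student = F.log_softmax(s_student / tau, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

# Example usage with random embeddings (hypothetical sizes; hyperbolic vectors scaled into the ball):
# user_e, item_e = torch.randn(4, 64), torch.randn(100, 64)
# user_h, item_h = 0.05 * torch.randn(4, 64), 0.05 * torch.randn(100, 64)
# loss = distill_loss(user_e, item_e, user_h, item_h, tau=2.0)

At inference time only the Euclidean student would be evaluated, which is consistent with the abstract's point that distillation lets the model avoid hyperbolic computation while retaining hierarchical signals.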