Computer science
Construct (python library)
Domain knowledge
Knowledge base
Information retrieval
Learning to rank
Noise (video)
Set (abstract data type)
Knowledge extraction
Domain (mathematical analysis)
Data mining
Tree (set theory)
Code (set theory)
Artificial intelligence
Machine learning
Ranking (information retrieval)
Image (mathematics)
Mathematical analysis
Programming language
Mathematics
Authors
Jianyang Zhai,Xiawu Zheng,Chang‐Dong Wang,Hui Li,Yonghong Tian
Identifier
DOI:10.1145/3581783.3612252
Abstract
Pre-trained language models (PLMs), which are used to extract general knowledge, have demonstrated strong performance in sequential recommendation (SR). However, existing methods still lack domain knowledge and struggle to capture users' fine-grained preferences. Meanwhile, many traditional SR methods address this issue by integrating side information, but they suffer from information loss. In summary, we believe that a good recommendation system should exploit general and domain knowledge simultaneously. Therefore, we introduce an external knowledge base and propose Knowledge Prompt-tuning for Sequential Recommendation (KP4SR). Specifically, we construct a set of relation templates and transform a structured knowledge graph (KG) into knowledge prompts to bridge the semantic gap. However, knowledge prompts disrupt the original data structure and introduce a significant amount of noise. We therefore construct a knowledge tree and propose a knowledge tree mask, which restores the data structure in the form of a mask matrix, thus mitigating the noise problem. We evaluate KP4SR on three real-world datasets, and experimental results show that our approach outperforms state-of-the-art methods on multiple evaluation metrics. Compared with PLM-based methods, our method improves NDCG@5 and HR@5 by 40.65% and 36.42% on the books dataset, 11.17% and 11.47% on the music dataset, and 22.17% and 19.14% on the movies dataset, respectively. Our code is publicly available at https://github.com/zhaijianyang/KP4SR.
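The following is a minimal Python sketch, not the official KP4SR implementation, of the two ideas named in the abstract: verbalizing KG triples into knowledge prompts through relation templates, and building a knowledge-tree mask that only lets tokens from the same branch of the knowledge tree attend to each other. The template wording, the triple format, and the path-based attention rule are illustrative assumptions; consult the linked repository for the authors' actual code.

# Hypothetical sketch of knowledge prompts and a knowledge-tree mask.
import numpy as np

# (1) Relation templates map structured triples (head, relation, tail)
#     to natural-language prompt fragments. Template text is assumed.
RELATION_TEMPLATES = {
    "written_by": "{head} is written by {tail}.",
    "belongs_to_genre": "{head} belongs to the genre {tail}.",
}

def triples_to_prompt(triples):
    """Verbalize KG triples into a textual knowledge prompt."""
    return " ".join(
        RELATION_TEMPLATES[rel].format(head=h, tail=t) for h, rel, t in triples
    )

# (2) Knowledge-tree mask: each prompt token is labeled with the path of
#     tree nodes it came from; two tokens may attend to each other only
#     if one path is a prefix of the other (i.e., same branch).
def tree_mask(token_paths):
    n = len(token_paths)
    mask = np.zeros((n, n), dtype=bool)
    for i, p in enumerate(token_paths):
        for j, q in enumerate(token_paths):
            k = min(len(p), len(q))
            mask[i, j] = p[:k] == q[:k]  # share an ancestor path
    return mask

if __name__ == "__main__":
    prompt = triples_to_prompt([
        ("Dune", "written_by", "Frank Herbert"),
        ("Dune", "belongs_to_genre", "science fiction"),
    ])
    print(prompt)
    # Paths: item node "Dune", then one branch per triple.
    paths = [("Dune",), ("Dune", "written_by"), ("Dune", "belongs_to_genre")]
    print(tree_mask(paths).astype(int))  # cross-branch entries are 0

In this toy example, tokens belonging to the "written_by" branch are masked out for tokens of the "belongs_to_genre" branch, which is the kind of structure restoration in mask-matrix form that the abstract attributes to the knowledge tree mask.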