Computer science
Training
Graph
Knowledge graph
Artificial intelligence
Theoretical computer science
Physics
Meteorology
Authors
LU Yi-hong, Chang-Dong Wang, Pei-Yuan Lai, Jianhuang Lai
Identifier
DOI:10.1109/icdm58522.2023.00054
Abstract
With the rapid growth of online platforms and the abundance of available information, personalized recommender systems have become essential for helping users discover relevant and interesting content. Among the various methods, knowledge-aware recommendation models have achieved notable success by leveraging the rich semantic information encoded in knowledge graphs. However, they overlook the fact that users' historical click sequences can better reflect their preferences within a period of time, which limits recommendation performance. On the other hand, pre-trained language models have shown increasingly significant potential in recommender systems, as they can capture sequential patterns and dependencies within users' historical click sequences and effectively model contextual information in user-item interactions. To this end, we propose a hybrid recommendation model that leverages Pre-training in the collaborative Knowledge graph Attention neTwork (PKAT) to extract both the high-order connectivity information in collaborative knowledge graphs and the contextual information in users' historical click sequences captured by Bidirectional Encoder Representations from Transformers (BERT). The collaborative knowledge graph attention network enables the model to effectively capture the intricate relationships among users, items, and knowledge entities, thereby enhancing representation learning. Furthermore, what sets PKAT apart from other state-of-the-art knowledge-aware recommendation methods is its incorporation of the BERT language model, which allows PKAT to capture the contextual sequence information of user behavior and generate more accurate, personalized recommendations. Extensive experiments on multiple benchmark datasets demonstrate that PKAT outperforms several state-of-the-art baselines.
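The two ingredients the abstract describes can be illustrated with a minimal sketch: attention-weighted aggregation of an item's knowledge-graph neighbors (a simplification of KGAT-style attentive propagation), fused with a pooled representation of the user's click sequence standing in for the BERT encoder. All function names and the pooling choice here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def kg_attention(item_vec, neighbor_vecs):
    """Aggregate KG neighbor-entity embeddings with dot-product
    attention (a simplification of the paper's collaborative
    knowledge graph attention network)."""
    scores = neighbor_vecs @ item_vec          # one score per neighbor
    weights = softmax(scores)                  # attention distribution
    return item_vec + weights @ neighbor_vecs  # residual aggregation

def sequence_repr(click_vecs):
    """Stand-in for the BERT sequence encoder: mean-pool the
    embeddings of the user's historical clicks."""
    return np.mean(click_vecs, axis=0)

def score(click_vecs, item_vec, neighbor_vecs):
    """Fuse the sequence-side user representation with the
    KG-enriched item representation via an inner product."""
    u = sequence_repr(click_vecs)
    i = kg_attention(item_vec, neighbor_vecs)
    return float(u @ i)
```

In the full model, `sequence_repr` would be a contextual BERT encoding and `kg_attention` would propagate over multiple hops with learned relation-aware weights; this sketch only shows how the two signals combine into a single recommendation score.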