Computer science
Inference
Transformer
Knowledge graph
Generative grammar
Graph
Artificial intelligence
Language model
Natural language processing
Decoding methods
Machine learning
Theoretical computer science
Algorithm
Quantum mechanics
Physics
Voltage
Authors
Xin Xie, Ningyu Zhang, Zhoubo Li, Shumin Deng, Hui Chen, Feiyu Xiong, Mosha Chen, Huajun Chen
Identifier
DOI:10.1145/3487553.3524238
Abstract
Knowledge graph completion aims to address the problem of extending a KG with missing triples. In this paper, we present GenKGC, an approach that converts knowledge graph completion into a sequence-to-sequence generation task with a pre-trained language model. We further introduce relation-guided demonstration and entity-aware hierarchical decoding for better representation learning and faster inference. Experimental results on three datasets show that our approach obtains performance better than or comparable to baselines, with faster inference than previous methods based on pre-trained language models. We also release AliopenKG500, a new large-scale Chinese knowledge graph dataset, for research purposes. Code and datasets are available at https://github.com/zjunlp/PromptKG/tree/main/GenKGC.
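The core idea of the abstract, casting an incomplete triple (head, relation, ?) as a text-generation problem, can be sketched as follows. The serialization format and function names here are illustrative assumptions, not the exact prompt format used by GenKGC:

```python
# Sketch: turning a knowledge-graph completion query into a
# sequence-to-sequence text pair, as in generative KG completion.
# The textual format below is an assumption for illustration only.

def serialize_query(head: str, relation: str) -> str:
    """Serialize an incomplete triple (head, relation, ?) into the
    source sequence fed to a seq2seq language model."""
    return f"predict tail: {head} | {relation}"

def serialize_target(tail: str) -> str:
    """The target sequence the model is trained to generate."""
    return tail

# Example: completing (Hangzhou, located_in, ?) becomes generation.
src = serialize_query("Hangzhou", "located_in")
tgt = serialize_target("Zhejiang")
```

A pre-trained encoder-decoder model would then be fine-tuned on such (source, target) pairs, so that tail-entity prediction reduces to ordinary text generation and decoding.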