Computer science
Graphics
Coding
Knowledge graph
Artificial intelligence
Theoretical computer science
Projectile
Biochemistry
Gene
Organic chemistry
Chemistry
Authors
Yi Liang, Shuai Zhao, Bo Cheng, Hao Yang
Identifier
DOI:10.1007/978-3-031-40283-8_20
Abstract
Recent years have witnessed a growing number of studies on few-shot knowledge graph completion (FSKGC), which aims to infer new facts for a relation given its few observed samples. Despite current research’s great success on static knowledge graphs, few-shot temporal knowledge graph completion (FSTKGC) has not been well explored yet. Existing FSTKGC solutions mainly face two challenges. First, these models fail to distinguish the contributions of different neighbors and to model the difference between recurring and ever-changing facts. Second, they ignore the latent evolution patterns in observed temporal samples when learning relation representations. In this paper, we propose a novel framework named TwinGAT-VEDA, with twin graph attention and an evolution pattern learner, to address the above issues. First, our model devises two graph attention networks (the twins) to aggregate the most relevant signals from recurring and dynamic neighbors separately, and automatically fuses these features based on the interaction between the subject and object. Second, we inject time differences into the encoding of entity pairs and learn evolution patterns from few-shot reference sequences to represent few-shot relations. Comprehensive experiments on two benchmark datasets, ICEWS-few-intp and GDELT-few-intp, demonstrate that TwinGAT-VEDA achieves state-of-the-art results.
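The twin-attention idea in the abstract (aggregate recurring and dynamic neighbors separately, then fuse the two summaries via the subject-object interaction) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the attention is a plain dot-product softmax, and the fusion gate is a sigmoid of the subject-object dot product, both hypothetical stand-ins for the learned components in TwinGAT-VEDA.

```python
import math

def attention_aggregate(query, neighbors):
    """Softmax dot-product attention pooling of neighbor vectors
    with respect to a query vector (stand-in for one GAT 'twin')."""
    scores = [sum(q * n for q, n in zip(query, nbr)) for nbr in neighbors]
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    total = sum(weights)
    weights = [w / total for w in weights]
    dim = len(query)
    return [sum(w * nbr[i] for w, nbr in zip(weights, neighbors))
            for i in range(dim)]

def twin_fuse(subject, obj, recurring_nbrs, dynamic_nbrs):
    """Aggregate recurring and dynamic neighbors separately (the twins),
    then fuse the two summaries with a gate driven by the subject-object
    interaction (sigmoid of their dot product; hypothetical choice)."""
    rec = attention_aggregate(subject, recurring_nbrs)
    dyn = attention_aggregate(subject, dynamic_nbrs)
    dot = sum(s * o for s, o in zip(subject, obj))
    gate = 1.0 / (1.0 + math.exp(-dot))  # scalar in (0, 1)
    return [gate * r + (1.0 - gate) * d for r, d in zip(rec, dyn)]
```

The gate lets the model lean on the recurring-neighbor summary when the subject-object interaction is strong, and on the dynamic summary otherwise; the paper's actual fusion is learned end-to-end.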