Authors
Pengfei Luo, Xi Zhu, Tong Xu, Yi Zheng, Enhong Chen
Abstract
The prosperity of knowledge graphs and their downstream applications has raised an urgent need for knowledge graph completion techniques that fully support knowledge graph reasoning tasks, especially when training data are scarce. Although considerable effort has been devoted to this challenge via few-shot learning tools, prior work mainly aggregates entity neighbors to represent few-shot references, largely ignoring the enhancement available from latent semantic correlation within those neighbors. To that end, in this article we propose a novel few-shot learning solution named SIM, a Semantic Interaction Matching network that applies a Transformer framework to enhance entity representations by capturing semantic interaction between entity neighbors. Specifically, we first design an entity-relation fusion module that adaptively encodes neighbors by incorporating relation representations. Transformer layers are then integrated to capture latent correlation within neighbors, as well as the semantic diversification of the support set. Finally, a similarity score is estimated with an attention mechanism. Extensive experiments on two public benchmark datasets demonstrate that our model outperforms a variety of state-of-the-art methods by a significant margin.
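The abstract describes a three-stage pipeline: entity-relation fusion over neighbors, Transformer-style self-attention to capture correlation within the neighbor set, and an attention-based similarity score against the few-shot support set. Below is a minimal NumPy sketch of that pipeline, not the authors' implementation: all weight matrices, function names, dimensions, and the mean-pooling step are illustrative assumptions, and a single attention head stands in for full Transformer layers.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding dimension (assumed)
n = 5  # number of neighbors per entity (assumed)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# --- 1. Entity-relation fusion: encode each (relation, neighbor) pair ---
W_fuse = rng.standard_normal((2 * d, d)) / np.sqrt(2 * d)

def fuse(rel_embs, nbr_embs):
    """Encode neighbors together with their relation embeddings (sketch)."""
    return np.tanh(np.concatenate([rel_embs, nbr_embs], axis=-1) @ W_fuse)

# --- 2. Self-attention over the neighbor set (one Transformer-style head) ---
W_q, W_k, W_v = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))

def self_attention(H):
    """Capture latent correlation within neighbors via scaled dot-product."""
    Q, K, V = H @ W_q, H @ W_k, H @ W_v
    A = softmax(Q @ K.T / np.sqrt(d))  # (n, n) attention weights
    return A @ V

def encode_entity(rel_embs, nbr_embs):
    H = fuse(rel_embs, nbr_embs)   # (n, d) fused neighbor encodings
    H = self_attention(H)          # contextualized neighbors
    return H.mean(axis=0)          # pooled entity representation (assumed)

# --- 3. Attentive similarity between a query and the support set ---
def match_score(query_vec, support_vecs):
    """Weight each support reference by its attention to the query."""
    weights = softmax(support_vecs @ query_vec / np.sqrt(d))
    prototype = weights @ support_vecs  # attention-pooled support prototype
    return float(query_vec @ prototype)

# Toy usage: a 3-shot support set and one query entity
support = np.stack([encode_entity(rng.standard_normal((n, d)),
                                  rng.standard_normal((n, d)))
                    for _ in range(3)])
query = encode_entity(rng.standard_normal((n, d)),
                      rng.standard_normal((n, d)))
print(match_score(query, support))
```

In a trained model the weight matrices would be learned and the pooling and scoring would follow the paper's exact formulation; the sketch only shows how the three stages compose.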