Authors
Yongxu Long,Zihan Qiu,Dongyang Zheng,Zheng-Yang Wu,Jianguo Li,Yong Tang
Source
Journal: Communications in Computer and Information Science
Date: 2022-01-01
Pages: 162-172
Identifier
DOI:10.1007/978-981-19-4549-6_13
Abstract
Link prediction on knowledge graphs (KGs) is an effective way to address their incompleteness. ConvE and InteractE introduced CNNs to this task and achieved excellent performance, but each of these models uses only a single 2D convolutional layer. We argue instead that the network should go deeper. We therefore propose the ResConvE model, which draws on the application of residual networks in computer vision: it deepens the neural network and applies skip connections to alleviate the gradient explosion and gradient vanishing caused by the additional layers. We also introduce the SKG-course dataset from Scholat for experiments. Through extensive experiments, we find that ResConvE performs well on some datasets, showing that this approach outperforms the baselines. Moreover, we design controlled experiments with different depths of ResConvE on FB15k and SKG-course to demonstrate that deepening the network within a certain range does improve performance on different datasets.

Keywords: Knowledge graph embedding; Residual network; Knowledge graph; SCHOLAT; Link prediction
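The core idea the abstract describes — a ConvE-style 2D convolution wrapped in a residual skip connection so the network can be made deeper — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the single-channel convolution, kernel shapes, and `residual_block` helper are assumptions for clarity; a real model would operate on reshaped entity/relation embedding grids with learned multi-channel filters.

```python
import numpy as np

def conv2d(x, kernel):
    """'Same'-padded single-channel 2D convolution, written out directly.
    (Illustrative stand-in for a learned convolutional layer.)"""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def residual_block(x, k1, k2):
    """Two conv layers with ReLU plus an identity skip connection:
    y = ReLU(conv(ReLU(conv(x, k1)), k2) + x).
    The '+ x' shortcut is what lets gradients bypass the stacked
    convolutions when such blocks are chained to deepen the network."""
    h = np.maximum(conv2d(x, k1), 0.0)   # first conv + ReLU
    h = conv2d(h, k2)                    # second conv
    return np.maximum(h + x, 0.0)        # add skip connection, then ReLU
```

In a ConvE-style pipeline, the reshaped 2D grid of the concatenated entity and relation embeddings would be passed through several such blocks before the final projection and scoring step; stacking blocks deepens the network without blocking the gradient path, which is the property the paper exploits.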