Computer science
Scalability
Attention network
Graph
Artificial neural network
Theoretical computer science
Relation (database)
Transferability
Artificial intelligence
Limitation
Homogeneous
Machine learning
Data mining
Mathematics
Database
Mechanical engineering
Combinatorics
Engineering
Reuter
Authors
Roshni G. Iyer, Wei Wang, Yizhou Sun
Identifier
DOI: 10.1109/icdm51629.2021.00133
Abstract
Recent graph neural networks (GNNs) with the attention mechanism have historically been limited to small-scale homogeneous graphs (HoGs). However, GNNs that handle heterogeneous graphs (HeGs), which contain several entity and relation types, all have shortcomings in handling attention. Most GNNs that learn graph attention for HeGs learn either node-level or relation-level attention, but not both, limiting their ability to predict both important entities and relations in the HeG. Even the best existing method that learns both levels of attention assumes graph relations are independent, so its learned attention disregards the dependencies among relations. To effectively model both multi-relational and multi-entity large-scale HeGs, we present Bi-Level Attention Graph Neural Networks (BA-GNN), scalable neural networks (NNs) that use a novel bi-level graph attention mechanism. BA-GNN models both node-node and relation-relation interactions in a personalized way, by hierarchically attending to both types of information from local neighborhood contexts instead of the global graph context. Rigorous experiments on seven real-world HeGs show that BA-GNN consistently outperforms all baselines, and demonstrate the quality and transferability of its learned relation-level attention for improving the performance of other GNNs.
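The abstract describes a hierarchical mechanism: node-level attention aggregates each node's neighbors within every relation type, and relation-level attention then weights the per-relation messages before combining them. Below is a minimal PyTorch sketch of such a bi-level attention layer, written from this description alone; the class name BiLevelAttentionLayer, the scoring functions, and the residual combination are illustrative assumptions, not the authors' BA-GNN implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class BiLevelAttentionLayer(nn.Module):
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        # One linear transform per relation type, as in relational GNNs.
        self.rel_weights = nn.Parameter(torch.randn(num_relations, in_dim, out_dim) * 0.01)
        # Node-level attention: scores a (target node, transformed neighbor) pair.
        self.node_att = nn.Linear(2 * out_dim, 1)
        # Relation-level attention: scores each relation's aggregated message.
        self.rel_att = nn.Linear(out_dim, 1)
        self.proj = nn.Linear(in_dim, out_dim)

    def forward(self, x, edges_by_relation):
        # x: (num_nodes, in_dim) node features.
        # edges_by_relation: list of (src, dst) index-tensor pairs, one per relation.
        h = self.proj(x)                  # target-node representations
        num_nodes, out_dim = h.shape
        rel_messages = []
        for r, (src, dst) in enumerate(edges_by_relation):
            m = x[src] @ self.rel_weights[r]   # relation-specific neighbor transform
            # Node-level attention within this relation's local neighborhood.
            score = F.leaky_relu(self.node_att(torch.cat([h[dst], m], dim=-1))).squeeze(-1)
            # Softmax over each destination node's incoming edges.
            w = torch.exp(score - score.max())
            denom = torch.zeros(num_nodes).index_add_(0, dst, w) + 1e-16
            alpha = w / denom[dst]
            agg = torch.zeros(num_nodes, out_dim).index_add_(0, dst, alpha.unsqueeze(-1) * m)
            rel_messages.append(agg)
        rel_stack = torch.stack(rel_messages, dim=1)          # (N, R, out_dim)
        # Relation-level attention: per-node weights over relation messages.
        beta = torch.softmax(self.rel_att(rel_stack), dim=1)  # (N, R, 1)
        return h + (beta * rel_stack).sum(dim=1)

# Toy usage: 5 nodes, 2 relation types.
layer = BiLevelAttentionLayer(in_dim=8, out_dim=16, num_relations=2)
x = torch.randn(5, 8)
edges = [
    (torch.tensor([0, 1]), torch.tensor([2, 2])),  # relation 0: 0->2, 1->2
    (torch.tensor([3]), torch.tensor([4])),        # relation 1: 3->4
]
out = layer(x, edges)  # shape: (5, 16)

Note that both attention computations here use only a node's local incoming edges, matching the abstract's point that BA-GNN attends over local neighborhood contexts rather than the global graph.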