Keywords
Computer science, Overfitting, Attention network, Node, Graph, Artificial neural network, Data mining, Theoretical computer science, Artificial intelligence, Engineering
Authors
Shenghang Fan, Guanjun Liu, Jian Li
Identifier
DOI:10.1109/tcss.2023.3239034
Abstract
Heterogeneous information networks (HINs) have been applied to a wide variety of graph analysis tasks. A current trend in heterogeneous graph neural networks (HGNNs) is to cast meta-paths aside, since doing so avoids the structural information loss caused by artificially designed meta-paths. However, existing meta-path-free HGNNs fail to account for the fact that most node types in many HINs have no attributes, so they cannot make full use of sparse node attributes when applied to HINs with missing attributes. Furthermore, their computation of attention coefficients exploits correlations between node attributes while almost entirely ignoring structural correlations, which may limit the expressive ability of the model and cause overfitting during training. To alleviate these issues, we propose an HGNN with attribute enhancement and structure-aware attention (HGNN-AESA). First, we design an attribute enhancement module (AEM) that connects more useful attributed nodes to the target nodes. Specifically, AEM introduces a random walk with restart (RWR) strategy to obtain structural relevance scores for each node within its specific subgraph. These structural relevance scores are used to capture potentially influential attributed nodes in the high-order neighborhood of each target node. Second, we propose heterogeneous structure-aware attention layers (HSALs) to learn node representations. HSALs follow a hierarchical attention framework comprising node-level and type-level attention. The node-level attention aggregates feature (attribute) embeddings of same-type neighbors, with attention coefficients that depend on both node attributes and heterogeneous structural interactions. The type-level attention fuses all type-specific vector representations and generates the final node embedding. Finally, extensive experiments on three real-world HIN datasets demonstrate that our model outperforms state-of-the-art methods.
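The RWR step underlying AEM's structural relevance scores can be sketched as follows. This is a minimal illustration of a standard random walk with restart on a plain adjacency matrix, not the authors' implementation; the function name, parameter values, and the power-iteration solver are assumptions for the example.

```python
import numpy as np

def rwr_scores(adj, target, restart_prob=0.15, n_iter=100, tol=1e-8):
    """Random walk with restart from `target`: returns a relevance score
    for every node, where higher means structurally closer to `target`."""
    # Row-normalize the adjacency matrix into a transition matrix;
    # rows with zero degree are left as all-zero.
    deg = adj.sum(axis=1, keepdims=True)
    P = np.divide(adj, deg, out=np.zeros_like(adj, dtype=float), where=deg > 0)

    n = adj.shape[0]
    e = np.zeros(n)          # restart distribution: all mass on the target
    e[target] = 1.0

    # Power iteration for r = (1 - c) * P^T r + c * e.
    r = e.copy()
    for _ in range(n_iter):
        r_new = (1 - restart_prob) * (P.T @ r) + restart_prob * e
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new
    return r
```

In an AEM-style use, the top-scoring attributed nodes under `rwr_scores` would be linked to the target node, so that attribute information from the high-order neighborhood becomes reachable in one aggregation hop.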