Computer science
Hypergraph
Pairwise comparison
Exploit
Graph
Theoretical computer science
Feature learning
Data mining
Node (physics)
Artificial neural network
Artificial intelligence
Transformer
Machine learning
Information retrieval
Mathematics
Engineering
Physics
Discrete mathematics
Structural engineering
Voltage
Quantum mechanics
Computer security
Authors
Mengran Li, Yong Zhang, Xiaoyong Li, Yuchen Zhang, Baocai Yin
Source
Journal: ACM Transactions on Knowledge Discovery from Data
[Association for Computing Machinery]
Date: 2023-04-07
Volume/Issue: 17 (5): 1-22
Cited by: 6
Abstract
Graph neural networks (GNNs) have been widely used for graph structure learning and have achieved excellent performance in tasks such as node classification and link prediction. Real-world graph networks carry complex and varied semantic information and are often referred to as heterogeneous information networks (HINs). Previous GNNs have laboriously modeled heterogeneous graph networks with pairwise relations, in which the semantic information available for learning is incomplete, severely hindering node embedding learning. The conventional graph structure therefore cannot satisfy the demand for information discovery in HINs. In this article, we propose an end-to-end hypergraph transformer neural network (HGTN) that exploits the communication between different types of nodes and hyperedges to learn higher-order relations and discover semantic information. Specifically, attention mechanisms weigh the importance of the semantic information hidden in the original HINs to generate useful meta-paths. Meanwhile, our method develops a multi-scale attention module to aggregate node embeddings over higher-order neighborhoods. We evaluate the proposed model on node classification tasks across six datasets: DBLP, ACM, IMDB, Reuters, STUD-BJUT, and Citeseer. Experiments on these benchmarks demonstrate the advantages of HGTN.
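The two ideas the abstract highlights — attention-weighted generation of meta-paths from typed relations, and multi-scale aggregation over higher-order neighborhoods — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the length-2 meta-path composition, and the fixed three-hop scale set are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_select(adjs, logits):
    """Soft relation selection: a convex combination of typed
    adjacency matrices, weighted by softmax attention scores."""
    w = softmax(logits)
    return sum(wi * A for wi, A in zip(w, adjs))

def meta_path_adjacency(adjs, logits1, logits2):
    """Compose two soft selections into a length-2 meta-path
    adjacency (e.g. author->paper->author); longer meta-paths
    would chain further products."""
    return soft_select(adjs, logits1) @ soft_select(adjs, logits2)

def multi_scale_aggregate(A, X, scale_logits):
    """Multi-scale aggregation: mix node features over neighborhoods
    of increasing order (X, A X, A^2 X) with attention weights."""
    w = softmax(scale_logits)
    hops = [X, A @ X, A @ (A @ X)]
    return sum(wi * H for wi, H in zip(w, hops))

# Toy example: two relation types over 4 nodes, 3-dim features.
rng = np.random.default_rng(0)
A1 = (rng.random((4, 4)) > 0.5).astype(float)
A2 = (rng.random((4, 4)) > 0.5).astype(float)
X = rng.random((4, 3))

# Uniform logits -> each relation / scale weighted equally.
A_meta = meta_path_adjacency([A1, A2], np.zeros(2), np.zeros(2))
H = multi_scale_aggregate(A_meta, X, np.zeros(3))
print(H.shape)  # (4, 3): one aggregated embedding per node
```

In a trained model the logits would be learnable parameters, so gradient descent discovers which relation compositions (meta-paths) and which neighborhood orders matter for the downstream node classification task.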