Keywords
Computer Science, Transformer, Theoretical Computer Science, Graph, Algorithm, Artificial Intelligence, Topology (circuits), Mathematics, Engineering, Combinatorics, Voltage, Electrical Engineering
Authors
Yunhua Lu,Kangli Zeng,Qingwei Zhang,Jun’an Zhang,Lin Cai,Jiangling Tian
Identifier
DOI: 10.1016/j.ces.2023.119057
Abstract
The Graph Transformer architecture has recently gained traction in molecular property prediction due to its ability to represent complex interactions between all nodes. However, the self-attention mechanism in the Transformer encoder treats the graph as fully connected during representation learning, discarding the original structural information of the graph. In this work, a Local Transformer is proposed to preserve the original graph structure and aggregate local node information. In the model, a simple graph convolution is designed to replace the self-attention module, and it reaches state-of-the-art performance on the ZINC dataset. Further, to address the problem that graph neural networks (GNNs) cannot capture long-range interactions between atoms, a novel end-to-end framework combining a GNN with Local and Global Transformers is proposed, which achieves good results on the QM9 dataset.
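To make the core idea concrete, the sketch below shows a Transformer-style encoder block in which the self-attention sublayer is swapped for a simple neighborhood aggregation, so each node only mixes with its real graph neighbors rather than with every node. This is a minimal illustration under assumptions: the module name LocalTransformerBlock, the row-normalized adjacency, and the single-linear-layer convolution are hypothetical choices for clarity, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class LocalTransformerBlock(nn.Module):
    """Transformer-style block whose attention sublayer is replaced by a
    simple graph convolution over the original edges (illustrative sketch,
    not the paper's exact architecture)."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)  # per-node transform before aggregation
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(          # standard Transformer feed-forward sublayer
            nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (num_nodes, dim) node features
        # adj: (num_nodes, num_nodes) adjacency with self-loops, row-normalized
        h = adj @ self.linear(x)           # aggregate over real neighbors only,
                                           # instead of attending to all nodes
        x = self.norm1(x + h)              # residual + norm, as in a Transformer block
        x = self.norm2(x + self.ffn(x))    # feed-forward sublayer with residual
        return x

# Usage: 5 nodes arranged in a chain graph
n, dim = 5, 16
edges = torch.eye(n)                       # self-loops
for i in range(n - 1):
    edges[i, i + 1] = edges[i + 1, i] = 1.0
adj = edges / edges.sum(dim=1, keepdim=True)  # row-normalize
block = LocalTransformerBlock(dim, hidden=32)
out = block(torch.randn(n, dim), adj)      # -> shape (5, 16)
```

The design point the sketch highlights is that full self-attention is equivalent to aggregating over a complete graph; restricting the aggregation matrix to the true adjacency is what lets the block keep the molecule's original connectivity.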