Computer Science
Transformer
Graph
Data Mining
Theoretical Computer Science
Artificial Intelligence
Engineering
Authors
Jian-Kuen Wu,Weipeng Lu,J. Wu,Bing Zhang,Yanchuan Guo
Identifier
DOI:10.1021/acs.jpca.5c03006
Abstract
The prediction of the projected density of states (PDOS) in materials has traditionally relied on deep learning models based on graph convolutional networks (GCN) and Graph Attention Networks (GAT). In this study, utilizing PDOS data from the Materials Project, we demonstrate that the Graph Transformer (GT), which employs multi-head scaled dot-product attention over each node and its local neighborhood, consistently outperforms both GAT and GCN models in prediction accuracy under identical energy-level-corrected conditions. Furthermore, by incorporating the valence electron counts of the s, p, d, and f orbitals for the entire structure as an additional feature into a GT-based model, we further enhance the model's predictive performance. The novel framework and physics-informed feature enhancements offer valuable insights into improving the prediction of PDOS and potentially other electronic properties.
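The attention mechanism the abstract describes — multi-head scaled dot-product attention restricted to each node and its local neighborhood — can be sketched as follows. This is a minimal NumPy illustration under assumed shapes, not the authors' implementation; the function name, the path-graph adjacency, and all weight matrices are hypothetical.

```python
import numpy as np

def graph_transformer_attention(X, A, W_q, W_k, W_v, n_heads):
    """Multi-head scaled dot-product attention in which each node
    attends only to itself and its adjacent nodes in the graph."""
    N, d = X.shape
    d_h = d // n_heads  # per-head feature dimension
    # Neighborhood mask: self-loops plus edges from the adjacency matrix.
    mask = (A + np.eye(N)) > 0
    out = np.zeros((N, d))
    for h in range(n_heads):
        s = slice(h * d_h, (h + 1) * d_h)
        Q, K, V = X @ W_q[:, s], X @ W_k[:, s], X @ W_v[:, s]
        scores = Q @ K.T / np.sqrt(d_h)          # scaled dot products
        scores = np.where(mask, scores, -np.inf) # mask non-neighbors
        # Numerically stable row-wise softmax over each neighborhood.
        e = np.exp(scores - scores.max(axis=1, keepdims=True))
        attn = e / e.sum(axis=1, keepdims=True)
        out[:, s] = attn @ V
    return out

rng = np.random.default_rng(0)
N, d, H = 5, 8, 2
X = rng.normal(size=(N, d))          # node features (e.g. atom embeddings)
A = np.zeros((N, N))                 # toy 5-node path graph
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
out = graph_transformer_attention(X, A, W_q, W_k, W_v, H)
print(out.shape)  # (5, 8)
```

Masking the attention scores to `-inf` outside the neighborhood is what makes this a *graph* transformer rather than a fully connected one: after the softmax, non-neighbors receive exactly zero weight, mirroring the locality that GCN and GAT layers also enforce.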