Computer science
Artificial neural network
Training set
Interatomic potential
Training (meteorology)
Artificial intelligence
Stability (learning theory)
Machine learning
Task (project management)
Quantum
Graph
Molecular dynamics
Theoretical computer science
Physics
Chemistry
Computational chemistry
Meteorology
Management
Economics
Quantum mechanics
Authors
John L. A. Gardner, Kathryn T. Baker, Volker L. Deringer
Identifier
DOI:10.1088/2632-2153/ad1626
Abstract
Machine learning (ML) based interatomic potentials have transformed the field of atomistic materials modelling. However, ML potentials depend critically on the quality and quantity of quantum-mechanical reference data with which they are trained, and therefore developing datasets and training pipelines is becoming an increasingly central challenge. Leveraging the idea of ‘synthetic’ (artificial) data that is common in other areas of ML research, we here show that synthetic atomistic data, themselves obtained at scale with an existing ML potential, constitute a useful pre-training task for neural-network (NN) interatomic potential models. Once pre-trained with a large synthetic dataset, these models can be fine-tuned on a much smaller, quantum-mechanical one, improving numerical accuracy and stability in computational practice. We demonstrate feasibility for a series of equivariant graph-NN potentials for carbon, and we carry out initial experiments to test the limits of the approach.
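The two-stage workflow the abstract describes (pre-train on abundant synthetic labels from an existing ML potential, then fine-tune on scarce quantum-mechanical reference data) can be sketched in miniature. The code below is purely illustrative and is not the authors' method: it swaps the equivariant graph-NN potential for a toy random-feature regressor on 1-D inputs, and `teacher_potential` and `qm_reference` are hypothetical stand-ins for the synthetic-labelling ML potential and the quantum-mechanical reference data, respectively.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random Fourier features: a hypothetical stand-in for a learned
# atomistic descriptor (the real work uses equivariant graph NNs).
N_FEAT = 64
freqs = rng.normal(0.0, 3.0, N_FEAT)
phases = rng.uniform(0.0, 2.0 * np.pi, N_FEAT)

def features(x):
    return np.cos(np.outer(x, freqs) + phases)

def teacher_potential(x):
    """'Existing ML potential' that labels synthetic data cheaply at scale."""
    return np.sin(3.0 * x) + 0.5 * x

def qm_reference(x):
    """Small 'quantum-mechanical' dataset: teacher plus a correction term."""
    return teacher_potential(x) + 0.2 * np.cos(5.0 * x)

def train(w, x, y, lr=0.02, steps=3000):
    """Full-batch gradient descent on mean-squared error, starting from w."""
    phi = features(x)
    for _ in range(steps):
        w = w - lr * phi.T @ (phi @ w - y) / len(y)
    return w

def rmse(w, x, y):
    return float(np.sqrt(np.mean((features(x) @ w - y) ** 2)))

# 1) Pre-train on a large synthetic dataset labelled by the teacher.
x_syn = rng.uniform(-2.0, 2.0, 5000)
w_pre = train(np.zeros(N_FEAT), x_syn, teacher_potential(x_syn))

# 2) Fine-tune on a much smaller quantum-mechanical dataset,
#    warm-starting from the pre-trained weights.
x_qm = rng.uniform(-2.0, 2.0, 50)
y_qm = qm_reference(x_qm)
w_scratch = train(np.zeros(N_FEAT), x_qm, y_qm)  # baseline: no pre-training
w_fine = train(w_pre.copy(), x_qm, y_qm)         # pre-trained, then fine-tuned

x_test = np.linspace(-2.0, 2.0, 200)
y_test = qm_reference(x_test)
print(f"from scratch : test RMSE = {rmse(w_scratch, x_test, y_test):.3f}")
print(f"pre-trained  : test RMSE = {rmse(w_fine, x_test, y_test):.3f}")
```

The point of the sketch is the warm start: fine-tuning begins from weights that already encode the teacher's approximate physics, so the small quantum-mechanical dataset only needs to supply a correction rather than the whole potential-energy surface.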