Keywords
Computer science
Workflow
Overhead (engineering)
Artificial neural network
Reduction (mathematics)
Point (geometry)
Quantum
Machine learning
Ab initio
Algorithm
Artificial intelligence
Physics
Quantum mechanics
Mathematics
Geometry
Database
Operating system
Authors
Jan Finkbeiner, Samuel Tovey, Christian Holm
Identifier
DOI: 10.1103/PhysRevLett.132.167301
Abstract
This Letter presents a novel approach for identifying uncorrelated atomic configurations from extensive datasets with a nonstandard neural network workflow known as random network distillation (RND) for training machine-learned interatomic potentials (MLPs). This method is coupled with a DFT workflow wherein initial data are generated with cheaper classical methods before only the minimal subset is passed to a more computationally expensive ab initio calculation. This benefits training not only by reducing the number of expensive DFT calculations required but also by providing a pathway to the use of more accurate quantum mechanical calculations. The method's efficacy is demonstrated by constructing machine-learned interatomic potentials for the molten salts KCl and NaCl. Our RND method allows accurate models to be fit on minimal datasets, as small as 32 configurations, reducing the required structures by at least 1 order of magnitude compared to alternative methods. This reduction in dataset sizes not only substantially reduces computational overhead for training data generation but also provides a more comprehensive starting point for active-learning procedures.
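The core idea behind random network distillation (RND) as a data-selection tool can be sketched as follows: a fixed, untrained "target" network maps each configuration's descriptor to an embedding, a second "predictor" network is trained to mimic the target on already-selected data, and the predictor's error on a candidate measures its novelty. The sketch below is illustrative only and is not the authors' implementation; the network sizes, descriptor dimension, learning rate, and greedy selection loop are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_net(in_dim, hidden, out_dim, rng):
    # A simple two-layer MLP with random weights (illustrative sizes).
    W1 = rng.normal(size=(in_dim, hidden)) / np.sqrt(in_dim)
    W2 = rng.normal(size=(hidden, out_dim)) / np.sqrt(hidden)
    return W1, W2

def forward(params, x):
    W1, W2 = params
    return np.tanh(x @ W1) @ W2

# Target: a fixed random network that is never trained.
target = make_net(16, 64, 8, rng)
# Predictor: trained to mimic the target on the selected data so far.
predictor = make_net(16, 64, 8, rng)

def novelty(x):
    # Distillation error: how poorly the predictor mimics the target.
    # A high error means the configuration is unlike anything selected yet.
    return np.sum((forward(target, x) - forward(predictor, x)) ** 2, axis=-1)

def train_predictor(predictor, X, lr=0.05, steps=200):
    # Plain gradient descent on the squared distillation loss.
    W1, W2 = predictor
    for _ in range(steps):
        h = np.tanh(X @ W1)
        err = h @ W2 - forward(target, X)
        gW2 = h.T @ err / len(X)
        gW1 = X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)
        W1 = W1 - lr * gW1
        W2 = W2 - lr * gW2
    return W1, W2

# Greedy selection: repeatedly pick the most novel candidate, then retrain
# the predictor so similar candidates stop looking novel.
pool = rng.normal(size=(500, 16))   # hypothetical descriptors of candidates
selected = []
for _ in range(32):                 # 32 configurations, the dataset size quoted above
    scores = novelty(pool)
    idx = int(np.argmax(scores))
    selected.append(idx)
    predictor = train_predictor(predictor, pool[selected])
```

In the workflow described by the abstract, only the configurations chosen by such a loop would be passed on to the expensive ab initio (DFT) calculation; the pool itself can be generated cheaply with classical methods.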