Generalization
Adaptation
Computer science
Distribution (mathematics)
Statistical physics
Biology
Physics
Mathematics
Neuroscience
Mathematical analysis
Authors
Taoyong Cui,Chenyu Tang,Dongzhan Zhou,Yuqiang Li,Xin-Gao Gong,Wanli Ouyang,Mao Su,Shufei Zhang
Identifier
DOI:10.1038/s41467-025-57101-4
Abstract
Machine learning interatomic potentials (MLIPs) enable more efficient molecular dynamics (MD) simulations with ab initio accuracy, and have been used in various domains of physical science. However, distribution shift between training and test data degrades the test performance of MLIPs and can even lead to collapse of MD simulations. In this work, we propose an online Test-time Adaptation Interatomic Potential (TAIP) framework to improve generalization on test data. Specifically, we design a dual-level self-supervised learning approach that leverages global structure and atomic local environment information to align the model with the test data. Extensive experiments demonstrate TAIP's capability to bridge the domain gap between training and test datasets without additional data. TAIP enhances test performance on various benchmarks, from small molecule datasets to complex periodic molecular systems with various types of elements. TAIP also enables stable MD simulations where the corresponding baseline models collapse.

Molecular dynamics simulations using machine learning interatomic potentials often face stability issues due to distribution shifts. Here, the authors develop an online test-time adaptation framework to improve generalization, allowing for more stable simulations without the need for additional training data.
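To make the test-time adaptation idea concrete, below is a minimal, hypothetical sketch of the general workflow the abstract describes: before predicting on an unlabeled test batch, the model takes a few gradient steps on a label-free auxiliary objective. The `ToyPotential` model, the denoising reconstruction loss, and all hyperparameters are illustrative assumptions; the paper's actual dual-level global/local self-supervised losses and MLIP architecture are not specified in this abstract and are not reproduced here.

```python
import torch
import torch.nn as nn

# Toy stand-in for an MLIP: maps atomic coordinates to a per-structure energy.
# Everything here is a generic illustration of online test-time adaptation,
# not the TAIP implementation from the paper.
class ToyPotential(nn.Module):
    def __init__(self, n_atoms: int, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_atoms * 3, hidden), nn.SiLU(), nn.Linear(hidden, hidden)
        )
        self.energy_head = nn.Linear(hidden, 1)            # supervised target at training time
        self.recon_head = nn.Linear(hidden, n_atoms * 3)   # self-supervised auxiliary head

    def forward(self, coords: torch.Tensor):
        h = self.encoder(coords.flatten(start_dim=1))
        return self.energy_head(h).squeeze(-1), self.recon_head(h)


def adapt_then_predict(model, coords, steps: int = 5, lr: float = 1e-4, noise: float = 0.01):
    """One online adaptation episode on an unlabeled test batch:
    minimize a self-supervised denoising loss, then predict energies."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    target = coords.flatten(start_dim=1)
    for _ in range(steps):
        noisy = coords + noise * torch.randn_like(coords)   # label-free perturbation
        _, recon = model(noisy)
        loss = nn.functional.mse_loss(recon, target)        # no energies/forces needed
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        energy, _ = model(coords)
    return energy


if __name__ == "__main__":
    batch = torch.randn(8, 16, 3)        # 8 test structures, 16 atoms each
    model = ToyPotential(n_atoms=16)
    print(adapt_then_predict(model, batch))
```

In an actual MD setting, such an adaptation step would be interleaved with simulation: each new batch of test-time configurations is used to update the model via the self-supervised objective before its energies and forces drive the next integration steps.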