Parameterized complexity
Artificial neural network
Computer science
Testbed
Discretization
Context
Artificial intelligence
Partial differential equations
Machine learning
Convergence
Algorithm
Mathematics
Mathematical analysis
Computer network
Paleontology
Economics
Biology
Economic growth
Authors
Michael Penwarden,Shandian Zhe,Akil Narayan,Robert M. Kirby
Identifier
DOI:10.1016/j.jcp.2023.111912
Abstract
Physics-informed neural networks (PINNs) as a means of discretizing partial differential equations (PDEs) are garnering much attention in the Computational Science and Engineering (CS&E) world. At least two challenges exist for PINNs at present: an understanding of accuracy and convergence characteristics with respect to tunable parameters, and identification of optimization strategies that make PINNs as efficient as other computational science tools. The cost of PINN training remains a major challenge of Physics-informed Machine Learning (PiML) and, in fact, of machine learning (ML) in general. This paper is meant to move towards addressing the latter through the study of PINNs on new tasks, for which parameterized PDEs provide a good testbed application, as tasks can be easily defined in this context. Following the ML world, we introduce metalearning of PINNs with application to parameterized PDEs. By introducing metalearning and transfer learning concepts, we can greatly accelerate the PINN optimization process. We present a survey of model-agnostic metalearning, and then discuss our model-aware metalearning applied to PINNs, as well as implementation considerations and algorithmic complexity. We then test our approach on various canonical forward parameterized PDEs that have been presented in the emerging PINNs literature.
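To make the setting concrete, the following is a minimal sketch, in JAX, of the kind of task the abstract describes: a small PINN for a parameterized 1D Poisson-type problem, trained by minimizing the PDE residual at collocation points plus a boundary penalty, followed by the simplest transfer-learning baseline, in which the network for a new PDE parameter is warm-started from weights trained at a previous parameter instead of from a random initialization. This is an illustrative assumption, not the authors' implementation: the test problem, network size, optimizer, and hyperparameters are placeholders, and the paper's model-aware metalearned initializations go beyond this plain warm start.

import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(1, 32, 32, 1)):
    # Randomly initialize a small fully connected surrogate u_theta(x).
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (m, n)) * jnp.sqrt(2.0 / m)
        params.append((w, jnp.zeros(n)))
    return params

def u_scalar(params, x):
    # Evaluate the network at a single point x (scalar in, scalar out).
    h = jnp.atleast_1d(x)
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

# Spatial derivatives of the surrogate via automatic differentiation.
u_x = jax.grad(u_scalar, argnums=1)
u_xx = jax.grad(u_x, argnums=1)

def pinn_loss(params, mu, xs):
    # Toy parameterized PDE (placeholder): -u''(x) = mu * sin(pi x) on (0, 1),
    # with u(0) = u(1) = 0; each value of mu defines one "task".
    residual = jax.vmap(lambda x: u_xx(params, x) + mu * jnp.sin(jnp.pi * x))(xs)
    boundary = u_scalar(params, 0.0) ** 2 + u_scalar(params, 1.0) ** 2
    return jnp.mean(residual ** 2) + boundary

@jax.jit
def sgd_step(params, mu, xs, lr=1e-3):
    # Plain gradient descent; a practical setup would use Adam or L-BFGS.
    grads = jax.grad(pinn_loss)(params, mu, xs)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

def train(params, mu, xs, n_steps=2000):
    for _ in range(n_steps):
        params = sgd_step(params, mu, xs)
    return params

key = jax.random.PRNGKey(0)
xs = jnp.linspace(0.0, 1.0, 64)   # collocation points

# Task 1: train from a random initialization at mu = 1.0.
params_task1 = train(init_mlp(key), mu=1.0, xs=xs)

# Task 2: a new PDE parameter, warm-started from the task-1 weights
# rather than re-initialized (the plain transfer-learning baseline).
params_task2 = train(params_task1, mu=2.0, xs=xs)

The intent of the warm start is that the optimization for the second parameter value begins near a useful solution; this is the acceleration mechanism that the metalearned initializations studied in the paper aim to improve upon in a more principled, model-aware way.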