Keywords
Partial differential equation, Context (archaeology), Differentiable function, Artificial neural network, Nonlinear system, Partial derivative, Physics, Computer science, Class (philosophy), Stochastic partial differential equation, Physical law, Function (biology), Applied mathematics, Artificial intelligence, Mathematics, Mathematical analysis, Biology, Evolutionary biology, Paleontology, Quantum mechanics
Authors
Maziar Raissi,Paris Perdikaris,George Em Karniadakis
Source
Venue: Cornell University - arXiv
Date: 2017-01-01
Citations: 808
Identifier
DOI: 10.48550/arxiv.1711.10561
Abstract
We introduce physics-informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. In this two-part treatise, we present our developments in the context of solving two main classes of problems: data-driven solution and data-driven discovery of partial differential equations. Depending on the nature and arrangement of the available data, we devise two distinct classes of algorithms, namely continuous time and discrete time models. The resulting neural networks form a new class of data-efficient universal function approximators that naturally encode any underlying physical laws as prior information. In this first part, we demonstrate how these networks can be used to infer solutions to partial differential equations, and obtain physics-informed surrogate models that are fully differentiable with respect to all input coordinates and free parameters.
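The abstract describes the continuous-time idea at a high level: a neural network surrogate u(t, x) is trained by minimizing a data-misfit term together with a PDE-residual term, where the residual is obtained by differentiating the network with respect to its input coordinates. The sketch below is only an illustration of that idea, not the authors' implementation; the use of JAX, the network sizes, the function names (u_net, pde_residual, init_params, loss), the collocation-point setup, and the choice of the viscous Burgers equation u_t + u u_x - (0.01/π) u_xx = 0 are all assumptions made here for concreteness.

```python
# Minimal sketch of a continuous-time physics-informed loss (assumptions noted above).
import jax
import jax.numpy as jnp

def init_params(key, layers=(2, 20, 20, 1)):
    # Small fully connected network mapping (t, x) -> u(t, x).
    params = []
    for n_in, n_out in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (n_in, n_out)) * jnp.sqrt(2.0 / n_in)
        params.append((w, jnp.zeros(n_out)))
    return params

def u_net(params, t, x):
    # Neural-network surrogate u(t, x); scalar inputs and output, so it is
    # fully differentiable with respect to both input coordinates.
    h = jnp.stack([t, x])
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def pde_residual(params, t, x, nu=0.01 / jnp.pi):
    # Residual of the (assumed) viscous Burgers equation
    #   f = u_t + u * u_x - nu * u_xx,
    # with every derivative taken by automatic differentiation of u_net.
    u = u_net(params, t, x)
    u_t = jax.grad(u_net, argnums=1)(params, t, x)
    u_x = jax.grad(u_net, argnums=2)(params, t, x)
    u_xx = jax.grad(jax.grad(u_net, argnums=2), argnums=2)(params, t, x)
    return u_t + u * u_x - nu * u_xx

def loss(params, data, colloc):
    # Composite objective: misfit at measured points plus the mean squared
    # PDE residual at collocation points, which encodes the physics prior.
    t_d, x_d, u_d = data
    u_pred = jax.vmap(u_net, in_axes=(None, 0, 0))(params, t_d, x_d)
    mse_data = jnp.mean((u_pred - u_d) ** 2)
    t_c, x_c = colloc
    res = jax.vmap(pde_residual, in_axes=(None, 0, 0))(params, t_c, x_c)
    return mse_data + jnp.mean(res ** 2)

# Example usage with synthetic placeholder points (shapes only, no real data).
key = jax.random.PRNGKey(0)
params = init_params(key)
data = (jnp.zeros(8), jnp.linspace(-1.0, 1.0, 8), jnp.zeros(8))
colloc = (jnp.full(64, 0.5), jnp.linspace(-1.0, 1.0, 64))
print(loss(params, data, colloc))
```

Training would minimize this loss with a gradient-based optimizer (for instance Adam); the point the abstract makes is that one differentiable surrogate supplies both the data fit and the physics residual.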