Keywords
Artificial neural network, Operator (biology), Black box, Partial differential equation, Operator theory, Inference, Nonlinear system, Computer science, Fourier integral operator, Applied mathematics, Differential operator, Artificial intelligence, Algorithm, Physics, Mathematics, Mathematical analysis, Quantum mechanics, Biochemistry, Chemistry, Repressor, Transcription factor, Gene
Authors
Somdatta Goswami,Aniruddha Bora,Yue Yu,George Em Karniadakis
Source
Journal: Computational methods in engineering & the sciences
Date: 2023-01-01
Pages: 219-254
Identifier
DOI:10.1007/978-3-031-36644-4_6
Abstract
Standard neural networks can approximate general nonlinear operators, represented either explicitly by a combination of mathematical operators, e.g., in an advection–diffusion–reaction partial differential equation, or simply as a black box, e.g., a system-of-systems. The first neural operator was the Deep Operator Network (DeepONet), proposed in 2019 based on rigorous approximation theory. Since then, a few other less general operators have been published, e.g., based on graph neural networks or Fourier transforms. For black-box systems, training of neural operators is purely data-driven, but if the governing equations are known, they can be incorporated into the loss function during training to develop physics-informed neural operators. Neural operators can be used as surrogates in design problems, uncertainty quantification, autonomous systems, and almost any application requiring real-time inference. Moreover, independently pre-trained DeepONets can be used as components of a complex multi-physics system by coupling them together with relatively light training. Here, we present a review of DeepONet, the Fourier neural operator, and the graph neural operator, as well as appropriate extensions with feature expansions, and highlight their usefulness in diverse applications in computational mechanics, including porous media, fluid mechanics, and solid mechanics.
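The DeepONet architecture mentioned in the abstract combines a branch network, which encodes the input function sampled at a fixed set of sensor locations, with a trunk network, which encodes the query coordinate; the operator output is the dot product of the two embeddings. A minimal untrained sketch of that forward pass, with illustrative layer sizes and random weights (all names and dimensions here are assumptions for demonstration, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(sizes, rng):
    # Random (untrained) weights for a small MLP, for illustration only.
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # Forward pass with tanh activations on hidden layers, linear output.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# Branch net encodes the input function u sampled at m fixed sensors;
# trunk net encodes the query coordinate y. Sizes are illustrative.
m, p = 50, 32
branch = mlp_params([m, 64, p], rng)
trunk = mlp_params([1, 64, p], rng)

def deeponet(u_sensors, y):
    # G(u)(y) ~ <branch(u), trunk(y)>: dot product of p-dim embeddings.
    b = mlp(branch, u_sensors)        # shape (p,)
    t = mlp(trunk, np.atleast_1d(y))  # shape (p,)
    return float(b @ t)

u = np.sin(np.linspace(0, np.pi, m))  # an example input function
prediction = deeponet(u, 0.5)         # scalar prediction G(u)(0.5)
```

In a data-driven setting, the weights would be trained on input-function/output pairs; in the physics-informed variant described in the abstract, the residual of the governing equation evaluated at `deeponet` outputs would be added to the training loss.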