Keywords
Langevin dynamics, Ergodicity, Langevin equation, Mathematics, Convergence (economics), Mean-field theory, Rate of convergence, Gradient descent, Statistical physics, Applied mathematics, Hamiltonian (control theory), Mathematical optimization, Computer science, Artificial neural network, Physics, Artificial intelligence, Statistics, Channel (broadcasting), Economics, Quantum mechanics, Computer network, Economic growth
Authors
Anna Kazeykina, Zhenjie Ren, Xiaolu Tan, Junjian Yang
Abstract
We study the long-time behavior of an underdamped mean-field Langevin (MFL) equation, and provide a general convergence result as well as an exponential convergence rate under different conditions. The results on the MFL equation can be applied to study the convergence of the Hamiltonian gradient descent algorithm for overparametrized optimization. We then provide some numerical examples of the algorithm applied to training a generative adversarial network (GAN).
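For readers unfamiliar with the underdamped (kinetic) Langevin dynamics mentioned in the abstract, the sketch below simulates a standard, non-mean-field underdamped Langevin SDE with an Euler-Maruyama scheme on a toy quadratic objective. This is only a rough illustration of the type of dynamics studied; the objective, step size, friction and temperature are assumptions, and this is not the paper's MFL algorithm or its GAN experiments.

```python
# Minimal sketch (assumed toy setup) of an underdamped Langevin simulation:
#   dX_t = V_t dt,
#   dV_t = -(gamma * V_t + grad f(X_t)) dt + sqrt(2 * gamma * T) dW_t,
# whose invariant law is proportional to exp(-(f(x) + |v|^2 / 2) / T).
import numpy as np

def grad_f(x):
    # Toy quadratic objective f(x) = 0.5 * ||x||^2, so grad f(x) = x.
    return x

def underdamped_langevin(x0, v0, h=0.01, gamma=1.0, temperature=1.0,
                         n_steps=10_000, seed=0):
    """Euler-Maruyama discretization; returns the trajectory of positions."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    v = np.array(v0, dtype=float)
    xs = np.empty((n_steps, x.size))
    noise_scale = np.sqrt(2.0 * gamma * temperature * h)
    for k in range(n_steps):
        # Velocity update: friction, gradient force, and Gaussian noise.
        v = v - h * (gamma * v + grad_f(x)) + noise_scale * rng.standard_normal(x.size)
        # Position update driven by the current velocity.
        x = x + h * v
        xs[k] = x
    return xs

if __name__ == "__main__":
    traj = underdamped_langevin(x0=[3.0, -2.0], v0=[0.0, 0.0])
    # With the quadratic f and temperature 1, the position marginal should be
    # close to a standard Gaussian after the burn-in period.
    print("empirical mean:", traj[5000:].mean(axis=0))
    print("empirical var: ", traj[5000:].var(axis=0))
```

In the mean-field setting of the paper, the drift would additionally depend on the law of the process itself, which this generic sketch does not model.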