Keywords
Conjugate gradient method, nonlinear conjugate gradient method, gradient descent, stochastic optimization, mathematical optimization, rate of convergence, stochastic differential equation, Lyapunov function, acceleration, robustness, nonlinear systems, artificial neural networks
Authors
Yulan Yuan, Danny H. K. Tsang, Vincent K. N. Lau
Identifier
DOI: 10.1109/jiot.2024.3376821
Abstract
Due to the influence of stochastic gradients, existing algorithms suffer from slow convergence, noise explosion, and even failure to converge in practice, which motivates us to propose an accelerated algorithm to tackle these issues. Recognizing the potential of the gradient, momentum, and conjugate gradient as promising search directions, we propose a 3-D acceleration algorithm that uses a weighted combination of these three basis directions. Specifically, in order to analyze the dynamics of the discrete-time algorithm during the update process, we provide a general framework for approximating the discrete-time algorithm in the weak sense by a continuous-time stochastic differential equation. We exploit the continuous-time formulation together with Lyapunov drift optimization to derive novel adaptive step sizes, which effectively improve the algorithm's ability to stabilize noise and accelerate convergence. Extensive numerical experiments demonstrate the proposed algorithm's superiority in convergence rate, computational complexity, and noise robustness compared to state-of-the-art baselines.
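To illustrate the search-direction combination described in the abstract, the sketch below applies a hypothetical three-direction update (gradient, momentum, and a Fletcher-Reeves-style conjugate direction) to a noisy quadratic. The weights, momentum decay, beta cap, and fixed step size are illustrative assumptions only; they are not the paper's adaptive, Lyapunov-drift-derived step sizes or its SDE-based analysis.

```python
import numpy as np

def noisy_quadratic_grad(x, A, b, rng, noise_std=0.1):
    """Stochastic gradient of f(x) = 0.5 x^T A x - b^T x (toy test problem)."""
    return A @ x - b + noise_std * rng.standard_normal(x.shape)

def three_direction_step(x, grad, prev_grad, momentum, conj_dir, step, weights):
    """One update along a weighted combination of three search directions:
    the negative gradient, a momentum accumulator, and a conjugate-gradient-style
    direction with a Fletcher-Reeves coefficient. All constants are illustrative."""
    w_g, w_m, w_c = weights
    momentum = 0.9 * momentum + grad                         # heavy-ball style accumulator
    beta = (grad @ grad) / (prev_grad @ prev_grad + 1e-12)   # Fletcher-Reeves beta
    beta = min(beta, 1.0)                                    # cap to keep the direction bounded (safeguard, not from the paper)
    conj_dir = -grad + beta * conj_dir                       # conjugate-style direction
    direction = -w_g * grad - w_m * momentum + w_c * conj_dir
    return x + step * direction, momentum, conj_dir

def run(dim=10, iters=300, step=0.02, weights=(0.4, 0.3, 0.3), seed=0):
    rng = np.random.default_rng(seed)
    A = np.diag(np.linspace(1.0, 10.0, dim))   # simple convex quadratic
    b = rng.standard_normal(dim)
    x = np.zeros(dim)
    momentum = np.zeros(dim)
    conj_dir = np.zeros(dim)
    prev_grad = np.ones(dim)
    for _ in range(iters):
        grad = noisy_quadratic_grad(x, A, b, rng)
        x, momentum, conj_dir = three_direction_step(
            x, grad, prev_grad, momentum, conj_dir, step, weights)
        prev_grad = grad
    return 0.5 * x @ A @ x - b @ x              # final objective value

if __name__ == "__main__":
    print("final objective:", run())
```

In this sketch the three directions are combined with fixed weights; the paper's contribution, by contrast, is to choose the step sizes adaptively from a Lyapunov drift analysis of the continuous-time SDE approximation.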