Authors
Y. M. Liu, Wenxiao Zhao, George Yin
Identifiers
DOI:10.1109/tac.2023.3340120
Abstract
This paper develops a class of novel algorithms for online convex optimization. The key construct is a forgetting-factor regret: it assigns a weight to the objective function at each time instant $t$ and lets the weights of past objective functions decay to zero. We establish forgetting-factor regret bounds for classical algorithms, including online gradient descent, online gradient-free algorithms, and online Frank-Wolfe algorithms. In addition, the paper introduces an online gradient descent algorithm with a forgetting factor and analyzes its performance under the new regret. Sufficient conditions are obtained under which the forgetting-factor regret bounds of the above algorithms are of order $o(1)$, which guarantees tracking of the minimizers of time-varying objective functions. Finally, our results are demonstrated through numerical examples.
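To make the forgetting-factor regret concrete, the following is a minimal sketch: plain online gradient descent is run on a slowly drifting quadratic objective, and the regret is accumulated with geometrically decaying weights $\lambda^{T-s}$ so that old losses are forgotten. The specific objective, step size, and weighting scheme here are illustrative assumptions for exposition, not the paper's exact construction.

```python
import numpy as np

# Sketch (not the paper's exact setup): online gradient descent on a
# time-varying quadratic f_t(x) = 0.5 * (x - theta_t)^2, evaluated under a
# forgetting-factor ("discounted") regret with factor lam in (0, 1).

T = 500       # horizon (assumed)
lam = 0.95    # forgetting factor: loss at time s gets weight lam^(T-s)
eta = 0.1     # constant step size (assumed)

x = 0.0
inst_losses = []
for t in range(T):
    theta = np.sin(0.01 * t)            # slowly drifting minimizer of f_t
    inst_losses.append(0.5 * (x - theta) ** 2)   # f_t(x_t); min_x f_t(x) = 0
    grad = x - theta                    # gradient of f_t at the current iterate
    x = x - eta * grad                  # online gradient descent update

# Forgetting-factor regret at horizon T:
#   R_T = sum_{s=0}^{T-1} lam^(T-1-s) * [ f_s(x_s) - min_x f_s(x) ]
weights = lam ** np.arange(T - 1, -1, -1)
ff_regret = float(np.dot(weights, np.array(inst_losses)))
print(ff_regret)
```

Because the weights decay geometrically, only the most recent tracking errors matter: if the iterate keeps following the drifting minimizer, the discounted regret stays small, which is the tracking behavior the $o(1)$ bounds in the abstract formalize.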