Acceleration
Convergence (economics)
Momentum (technical analysis)
Mathematical proof
Mathematical optimization
Chebyshev filter
Range (aeronautics)
Quadratic equation
Key (lock)
Set (abstract data type)
Computer science
Cover (algebra)
Quadratic programming
Mathematics
Applied mathematics
Physics
Engineering
Mechanical engineering
Geometry
Computer security
Finance
Classical mechanics
Aerospace engineering
Economics
Computer vision
Programming language
Economic growth
Authors
Alexandre d’Aspremont,Damien Scieur,Adrien Taylor
Source
Journal: Foundations and Trends® in Optimization
[Now Publishers]
Date: 2021-12-15
Volume/Issue: 5 (1-2): 1-245
Cited by: 37
Abstract
This monograph covers some recent advances in a range of acceleration techniques frequently used in convex optimization. We first use quadratic optimization problems to introduce two key families of methods, namely momentum and nested optimization schemes. They coincide in the quadratic case to form the Chebyshev method. We discuss momentum methods in detail, starting with the seminal work of Nesterov [1], and structure convergence proofs using a few master templates, such as that for optimized gradient methods, which provide the key benefit of showing how momentum methods optimize convergence guarantees. We further cover proximal acceleration, at the heart of the Catalyst and Accelerated Hybrid Proximal Extragradient frameworks, using similar algorithmic patterns. Common acceleration techniques rely directly on the knowledge of some of the regularity parameters in the problem at hand. We conclude by discussing restart schemes, a set of simple techniques for reaching nearly optimal convergence rates while adapting to unobserved regularity parameters.
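To make the momentum and restart ideas in the abstract concrete, here is a minimal NumPy sketch of Nesterov-style accelerated gradient descent with an optional gradient-based adaptive restart (in the style of O'Donoghue and Candès). The function name nesterov_agd, the quadratic test problem, and all parameter choices are illustrative assumptions for this sketch, not code or notation from the monograph.

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters=500, restart=False):
    """Accelerated gradient descent for an L-smooth convex objective.

    Textbook Nesterov momentum; with restart=True, applies a
    gradient-based adaptive restart heuristic (O'Donoghue & Candès).
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iters):
        g = grad(y)
        x_next = y - g / L                      # gradient step at the extrapolated point
        if restart and g @ (x_next - x) > 0:
            # Momentum points against the descent direction:
            # reset it instead of letting oscillations slow progress.
            t, y = 1.0, x_next.copy()
        else:
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
            t = t_next
        x = x_next
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x (assumed here).
rng = np.random.default_rng(0)
M = rng.standard_normal((80, 30))
A = M.T @ M / 80.0                              # positive definite Hessian
b = rng.standard_normal(30)
L = np.linalg.eigvalsh(A)[-1]                   # smoothness constant: largest eigenvalue
x_star = np.linalg.solve(A, b)
for flag in (False, True):
    x_hat = nesterov_agd(lambda v: A @ v - b, np.zeros(30), L, restart=flag)
    print(f"restart={flag}: error = {np.linalg.norm(x_hat - x_star):.2e}")
```

The restart branch drops the momentum whenever it forms an obtuse angle with the current gradient step, which is one simple way a restart scheme can adapt to an unobserved regularity parameter (here, the strong convexity constant) without estimating it explicitly.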