Topics
Mathematics, Convexity, Equivalence (formal languages), Convergence (economics), Applied mathematics, Quadratic equation, Convex function, Upper and lower bounds, Regular polygon, Function (biology), Mathematical analysis, Discrete mathematics, Geometry, Financial economics, Biology, Evolutionary biology, Economics, Economic growth
Authors
Dmitriy Drusvyatskiy, Adrian S. Lewis
Identifiers
DOI: 10.1287/moor.2017.0889
Abstract
The proximal gradient algorithm for minimizing the sum of a smooth and nonsmooth convex function often converges linearly even without strong convexity. One common reason is that a multiple of the step length at each iteration may linearly bound the “error”—the distance to the solution set. We explain the observed linear convergence intuitively by proving the equivalence of such an error bound to a natural quadratic growth condition. Our approach generalizes to linear and quadratic convergence analysis for proximal methods (of Gauss-Newton type) for minimizing compositions of nonsmooth functions with smooth mappings. We observe incidentally that short step-lengths in the algorithm indicate near-stationarity, suggesting a reliable termination criterion.
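To make the abstract's setting concrete, here is a minimal one-dimensional sketch of the proximal gradient algorithm for minimizing g(x) + λ|x|, with the short-step termination criterion the abstract suggests. This is an illustration only, not the paper's analysis or its Gauss-Newton-type generalization; the function names (`soft_threshold`, `proximal_gradient`) and the specific test problem g(x) = 0.5(x − 3)² are my own assumptions.

```python
def soft_threshold(z, tau):
    """Proximal operator of tau*|x| in one dimension (shrinks z toward 0)."""
    if z > tau:
        return z - tau
    if z < -tau:
        return z + tau
    return 0.0

def proximal_gradient(grad_g, lam, x0, step=0.5, tol=1e-10, max_iter=1000):
    """Proximal gradient method for g(x) + lam*|x| in one dimension.

    Terminates when the step length |x_new - x| falls below tol: per the
    abstract, a short step indicates near-stationarity, so this is a
    reasonable stopping rule.
    """
    x = x0
    for k in range(max_iter):
        # Gradient step on the smooth part, then prox step on the nonsmooth part.
        x_new = soft_threshold(x - step * grad_g(x), step * lam)
        if abs(x_new - x) <= tol:
            return x_new, k
        x = x_new
    return x, max_iter

# Example (assumed for illustration): g(x) = 0.5*(x - 3)^2, lam = 1.
# The minimizer of g(x) + |x| is x = 2, and the iterates converge linearly.
x_star, iters = proximal_gradient(lambda x: x - 3.0, lam=1.0, x0=0.0)
```

On this strongly convex example the step lengths shrink geometrically, so the error is bounded by a multiple of the step length, which is exactly the kind of error bound the paper shows is equivalent to quadratic growth even without strong convexity.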