Mathematics
Variance reduction
Stochastic gradient descent
Descent (aeronautics)
Reduction (mathematics)
Applied mathematics
Proximal gradient method
Variance (accounting)
Mathematical optimization
Gradient descent
Regular polygon
Convex optimization
Geometry
Statistics
Computer science
Artificial intelligence
Monte Carlo method
Artificial neural network
Accounting
Engineering
Business
Aerospace engineering
Authors
Zehui Jia, Wenxing Zhang, Xingju Cai, Deren Han
Abstract
Blocky optimization has gained significant attention in far-reaching practical applications. Following the recent work (M. Nikolova and P. Tan [SIAM J. Optim. 29 (2019), pp. 2053–2078]) on solving a class of nonconvex nonsmooth optimization problems, we develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for solving blocky optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) from stochastic optimization, the s-ASAP method is applicable to nonconvex optimization whose objective is the sum of a nonsmooth data-fitting term and a finite number of differentiable functions. The sublinear convergence rate of s-ASAP is built upon the proximal point algorithmic framework, whilst the linear convergence rate of s-ASAP is achieved under the error bound condition. Furthermore, the convergence of the sequence produced by s-ASAP is established under the Kurdyka-Łojasiewicz property. Preliminary numerical simulations on some image processing applications demonstrate the compelling performance of the proposed method.
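To make the two ingredients concrete, the following is a minimal sketch of a variance-reduced proximal gradient iteration (a prox-SVRG-style loop) applied to a lasso problem. This is a generic illustration of combining a variance-reduced gradient estimator with a proximal step, not an implementation of the paper's s-ASAP method; the problem, step size rule, and function names here are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam=0.01, step=None, n_epochs=30, seed=0):
    """Illustrative prox-SVRG sketch for the lasso:
        min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1.
    NOT the s-ASAP method of the paper; a generic
    variance-reduced proximal gradient loop."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    if step is None:
        # conservative step based on the largest per-sample smoothness
        step = 1.0 / (3.0 * np.max(np.sum(A * A, axis=1)))
    x = np.zeros(d)
    for _ in range(n_epochs):
        x_ref = x.copy()
        # snapshot full gradient at the reference point
        full_grad = A.T @ (A @ x_ref - b) / n
        for _ in range(n):
            i = rng.integers(n)
            ai = A[i]
            # variance-reduced stochastic gradient estimator
            g = ai * (ai @ x - b[i]) - ai * (ai @ x_ref - b[i]) + full_grad
            # proximal (soft-thresholding) step
            x = soft_threshold(x - step * g, step * lam)
    return x
```

The control variate `- ai * (ai @ x_ref - b[i]) + full_grad` shrinks the variance of the stochastic gradient as the iterate approaches the reference point, which is what allows a constant step size instead of the decaying schedule plain SGD requires.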