Keywords
Subgradient method
Mathematics
Stationary point
Lipschitz continuity
Mathematical optimization
Convex function
Bounded function
Differentiable function
Uniqueness
Optimization problem
Algorithm
Applied mathematics
Regular polygon
Mathematical analysis
Geometry
Authors
Radu Ioan Boţ,Minh N. Dao,Guoyin Li
Identifiers
DOI: 10.1287/moor.2021.1214
Abstract
In this paper, we consider a broad class of nonsmooth and nonconvex fractional programs in which the numerator can be written as the sum of a continuously differentiable convex function whose gradient is Lipschitz continuous and a proper lower semicontinuous (possibly nonconvex) function, and the denominator is weakly convex over the constraint set. This model includes the composite optimization problems studied extensively in recent years and encompasses many important modern fractional optimization problems arising from diverse areas, such as the recently proposed scale-invariant sparse signal reconstruction problem in signal processing. We propose a proximal subgradient algorithm with extrapolation for solving this model and show that the sequence of iterates generated by the algorithm is bounded and that each of its limit points is a stationary point of the model problem. The choice of our extrapolation parameter is flexible and includes the popular extrapolation parameter adopted in the restarted Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). By providing a unified analysis framework for descent methods, we establish convergence of the full sequence under the assumption that a suitable merit function satisfies the Kurdyka-Łojasiewicz (KL) property. In particular, our algorithm exhibits linear convergence for the scale-invariant sparse signal reconstruction problem and for the Rayleigh quotient problem over a spherical constraint. In the case where the denominator is the maximum of finitely many continuously differentiable weakly convex functions, we also propose an enhanced extrapolated proximal subgradient algorithm with guaranteed convergence to a stronger notion of stationary points of the model problem. Finally, we illustrate the proposed methods with both analytical and simulated numerical examples.
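To make the kind of iteration described above concrete, here is a minimal sketch of an extrapolated proximal subgradient step on a toy instance that fits the stated model class: the numerator is the sum of a smooth convex least-squares term (Lipschitz-continuous gradient) and an ℓ1 term (proper lower semicontinuous), and the ℓ2-norm denominator is convex, hence weakly convex. This is a generic illustration, not the authors' exact scheme from the paper; the objective F, the fixed extrapolation weight, and the step size are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def extrapolated_prox_subgrad(A, b, lam, x0, beta=0.3, iters=500, eps=1e-12):
    # Minimize F(x) = (0.5*||Ax - b||^2 + lam*||x||_1) / ||x||_2 by an
    # extrapolated proximal subgradient iteration. The fixed extrapolation
    # weight `beta` and the step size 1/L are illustrative assumptions; the
    # paper allows more flexible (e.g., restarted FISTA-type) choices.
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth gradient
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        gx = np.linalg.norm(x) + eps    # denominator g(x) = ||x||_2 (guarded at 0)
        fx = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
        theta = fx / gx                 # current objective ratio F(x_k)
        u = x / gx                      # subgradient of ||.||_2 at x_k
        y = x + beta * (x - x_prev)     # extrapolation step
        grad = A.T @ (A @ y - b)        # gradient of the smooth numerator part at y_k
        # Proximal step on the linearized numerator, shifted by theta * u so
        # that fixed points balance the numerator against theta times the
        # denominator's subgradient.
        x_prev, x = x, soft_threshold(y - (grad - theta * u) / L, lam / L)
    return x

# Tiny demo on a random sparse-recovery instance (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50); x_true[:3] = 3.0
b = A @ x_true
x_hat = extrapolated_prox_subgrad(A, b, lam=0.1, x0=rng.standard_normal(50))
```

Fixed points of this update satisfy the first-order condition 0 ∈ ∇h(x) + ∂P(x) − θ∂g(x) with θ = F(x), the usual notion of stationarity for fractional programs that the abstract refers to; the paper's full analysis (boundedness of the iterates, KL-based convergence of the whole sequence, linear rates) rests on conditions this fixed-beta sketch does not verify.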