Keywords: Mathematics; Lipschitz continuity; Proximal gradient method; Smoothness; Convex function; Metric; Applied mathematics; Algorithm; Mathematical optimization; Backtracking; Convex optimization; Mathematical analysis
Abstract
In this paper, we consider a composite difference-of-convex (DC) program whose objective function is the sum of a smooth convex function with Lipschitz continuous gradient, a proper closed convex function, and a continuous concave function. This problem has many applications in machine learning and data science. It can be solved by the proximal DCA (pDCA), a special case of the classical difference-of-convex algorithm (DCA), as well as by two Nesterov-type extrapolated variants of DCA, namely ADCA (Phan et al. IJCAI:1369–1375, 2018) and pDCAe (Wen et al. Comput. Optim. Appl. 69:297–324, 2018). The stepsizes of pDCA, pDCAe, and ADCA are fixed and determined by an a priori estimate of the smoothness parameter of the loss function; however, such an estimate may be hard to obtain, or may be poor, in some real-world applications. Motivated by this difficulty, we propose a variable-metric, Nesterov-extrapolated proximal DCA with backtracking (SPDCAe), which combines a (not necessarily monotone) backtracking line search with Nesterov's extrapolation for potential acceleration; moreover, a variable metric is incorporated for a better local approximation. Numerical simulations on sparse binary logistic regression and compressed sensing with Poisson noise demonstrate the effectiveness of the proposed method.
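To make the setting concrete, the composite DC model described above can be written, in our own notation (the paper's exact formulation may differ), as

\[
\min_{x \in \mathbb{R}^n} \; F(x) = f(x) + g(x) - h(x),
\]

where $f$ is convex with an $L$-Lipschitz continuous gradient, $g$ is proper, closed, and convex, and $h$ is continuous and convex (so that $-h$ is the concave part). With a fixed stepsize $1/L$, a pDCAe-type iteration takes the schematic form

\[
y^k = x^k + \beta_k (x^k - x^{k-1}), \qquad
x^{k+1} = \operatorname{prox}_{g/L}\!\Big( y^k - \tfrac{1}{L}\big( \nabla f(y^k) - \xi^k \big) \Big), \qquad \xi^k \in \partial h(x^k),
\]

with extrapolation parameters $\beta_k \in [0,1)$. In this reading, SPDCAe replaces the fixed $1/L$ by a stepsize found through a (possibly nonmonotone) backtracking line search and carries out the proximal step in a variable metric rather than the Euclidean one.

A minimal Euclidean sketch of such an extrapolated proximal DC iteration with backtracking is given below, assuming an l1 regularizer for $g$ (so its proximal map is soft-thresholding) and FISTA-style extrapolation weights. The names spdcae_sketch, grad_f, h_subgrad, lam, and shrink are hypothetical, and the sketch does not reproduce the paper's variable-metric or nonmonotone rules.

import numpy as np

def soft_threshold(v, tau):
    # Proximal map of tau * ||.||_1 (here we assume g is an l1 regularizer).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def spdcae_sketch(f, grad_f, h_subgrad, x0, lam, t0=1.0, shrink=0.5,
                  max_iter=200, tol=1e-8):
    # Illustrative extrapolated proximal DC iteration with backtracking for
    #     min f(x) + lam * ||x||_1 - h(x),
    # with f smooth convex and h convex (so -h is the concave part).
    # The stepsize t is shrunk until the quadratic upper bound of f holds at
    # the trial point, instead of being fixed to 1/L a priori.
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    theta_prev = theta = 1.0
    t = t0
    for _ in range(max_iter):
        beta = (theta_prev - 1.0) / theta          # FISTA-style extrapolation weight
        y = x + beta * (x - x_prev)
        xi = h_subgrad(x)                          # subgradient of the convex part h
        gy = grad_f(y)
        while True:                                # backtracking line search on t
            x_new = soft_threshold(y - t * (gy - xi), t * lam)
            d = x_new - y
            if f(x_new) <= f(y) + gy @ d + (d @ d) / (2.0 * t) + 1e-12:
                break
            t *= shrink
        x_prev, x = x, x_new
        theta_prev, theta = theta, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta * theta))
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x

# Toy usage with f(x) = 0.5 * ||Ax - b||^2 and h = 0 (a plain l1 problem, purely illustrative).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
x_hat = spdcae_sketch(lambda x: 0.5 * np.sum((A @ x - b) ** 2),
                      lambda x: A.T @ (A @ x - b),
                      lambda x: np.zeros_like(x),
                      np.zeros(100), lam=0.1)

The backtracking test in the inner loop is the standard sufficient-decrease condition from the descent lemma; it is guaranteed to hold once t falls below 1/L, so the loop terminates without the smoothness parameter being known in advance.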