Keywords
line search, conjugate gradient method, Lipschitz continuity, mathematics, convergence (economics), nonlinear conjugate gradient method, gradient descent, mathematical optimization, line (geometry), applied mathematics, bounded function, set (abstract data type), computer science, mathematical analysis, geometry, artificial intelligence, computer security, economics, artificial neural network, radius, programming language, economic growth
Authors
Ahmad Alhawarat, Zabidin Salleh, Mustafa Mamat, Mohd Rivaie
Identifiers
DOI: 10.1080/10556788.2016.1266354
Abstract
The conjugate gradient (CG) method is one of the most popular methods for solving large-scale unconstrained optimization problems. In this paper, a new modified version of the CG formula that was introduced by Polak, Ribière, and Polyak is proposed for problems that are bounded below and have a Lipschitz-continuous gradient. The new parameter provides global convergence properties when the strong Wolfe-Powell (SWP) line search or the weak Wolfe-Powell (WWP) line search is employed. A proof of a sufficient descent condition is provided for the SWP line search. Numerical comparisons between the proposed parameter and other recent CG modifications are made on a set of standard unconstrained optimization problems. The numerical results demonstrate the efficiency of the proposed CG parameter compared with the other CG parameters.
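For background only, the sketch below shows the kind of nonlinear CG loop the abstract describes: a search direction updated with the classical Polak–Ribière–Polyak coefficient β_k = g_{k+1}ᵀ(g_{k+1} − g_k)/‖g_k‖², combined with a strong Wolfe line search. The function name prp_cg, the PRP+ safeguard, and the use of SciPy's line_search are illustrative assumptions; the paper's modified parameter is not reproduced here.

```python
# Minimal sketch of a PRP-type nonlinear conjugate gradient method with a
# strong Wolfe line search. Background illustration only; the paper's
# modified CG coefficient would replace the classical beta below.
import numpy as np
from scipy.optimize import line_search

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Strong Wolfe line search (SciPy's default c1, c2 constants)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            # Line search failed: restart along the steepest-descent direction
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP coefficient: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new.dot(g_new - g) / g.dot(g)
        beta = max(beta, 0.0)           # PRP+ safeguard, common in practice
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example usage: minimize the Rosenbrock function
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(prp_cg(rosen, rosen_der, np.zeros(5)))
```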