Keywords
Conjugate gradient method, gradient descent, descent direction, convergence, line search, mathematical optimization, convexity, sufficient descent property, nonlinear conjugate gradient method, algorithm, gradient method, Pareto optimality, applied mathematics
Authors
Jamilu Yahaya, Poom Kumam, Sani Salisu, Kanokwan Sitthithakerngkiet
Source
Journal: PLOS ONE (Public Library of Science)
Date: 2024-05-15
Volume/Issue: 19(5): e0302441
Citations: 3
Identifiers
DOI:10.1371/journal.pone.0302441
Abstract
Several conjugate gradient (CG) parameters have led to promising methods for optimization problems. However, some of these parameters, for example PRP, HS, and DL, do not guarantee sufficient descent of the search direction. In this work, we introduce new spectral-like CG methods that achieve the sufficient descent property independently of any line search (LSE) and for arbitrary nonnegative CG parameters. We establish the global convergence of these methods for four different parameters under the Wolfe LSE. Our algorithm achieves this without regular restarts or convexity assumptions on the objective functions. The sequences generated by the algorithm identify points that satisfy the first-order necessary condition for Pareto optimality. We conduct computational experiments to showcase the implementation and effectiveness of the proposed methods. The proposed spectral-like methods, namely nonnegative SPRP, SHZ, SDL, and SHS, exhibit superior performance in that order, outperforming the HZ and SP methods in terms of the number of iterations, function evaluations, and gradient evaluations.
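To illustrate the kind of iteration the abstract describes, the Python sketch below implements a generic spectral-like nonlinear CG loop: the direction is d_{k+1} = -theta_{k+1} g_{k+1} + beta_{k+1} d_k with a nonnegative PRP-type parameter and a Wolfe line search (via scipy.optimize.line_search). The function name spectral_cg and the particular scaling theta are illustrative assumptions, chosen only so that the sufficient descent condition g_{k+1}ᵀ d_{k+1} ≤ -||g_{k+1}||² holds for any nonnegative beta regardless of the line search; they are not the paper's SPRP/SHZ/SDL/SHS formulas.

import numpy as np
from scipy.optimize import line_search

def spectral_cg(f, grad, x0, max_iter=500, tol=1e-6):
    """Generic spectral-like nonlinear CG sketch with a nonnegative ('PRP+')
    parameter and a Wolfe line search.  The spectral scaling theta below is
    illustrative, not the formula proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                   # Wolfe search failed; take a small safeguarded step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) <= tol:
            x, g = x_new, g_new
            break
        # Nonnegative PRP parameter: beta_k = max(0, g_{k+1}ᵀ(g_{k+1} - g_k) / ||g_k||²)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        # Illustrative spectral scaling: theta_k = 1 + beta_k |g_{k+1}ᵀ d_k| / ||g_{k+1}||²,
        # which forces g_{k+1}ᵀ d_{k+1} <= -||g_{k+1}||² for any nonnegative beta_k,
        # i.e. sufficient descent independent of the line search.
        theta = 1.0 + beta * abs(g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta * d       # spectral-like CG direction
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Small smoke test on the Rosenbrock function; the iterate should approach [1, 1].
    from scipy.optimize import rosen, rosen_der
    print(spectral_cg(rosen, rosen_der, np.array([-1.2, 1.0])))

The key design point the sketch mirrors is that descent is enforced by the direction formula itself (through the scaling applied to the gradient term), so it does not depend on how accurately the Wolfe step is computed.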