Hessian matrix
Mathematics
Hessian equation
Newton's method
Quasi-Newton method
Rate of convergence
Applied mathematics
Matrix (chemical analysis)
Mathematical analysis
Nonlinear system
Electrical engineering
Materials science
Composite material
Engineering
Channel (broadcasting)
Physics
Partial differential equation
First-order partial differential equation
Quantum mechanics
Abstract
In this paper, we investigate how the Gauss–Newton Hessian matrix affects the basin of convergence in Newton-type methods. Although the Newton algorithm is theoretically superior to the Gauss–Newton algorithm and the Levenberg–Marquardt (LM) method in terms of asymptotic convergence rate, the LM method is often preferred in practice for nonlinear least squares problems. This paper presents a theoretical analysis of the advantage of the Gauss–Newton Hessian matrix. It is proved that the Gauss–Newton approximation is the only nonnegative convex quadratic approximation that retains a critical property of the original objective function: attaining its minimum value of zero on an $(n-1)$-dimensional manifold (or affine subspace). Because of this property, the Gauss–Newton approximation preserves the zero-on-$(n-1)$-dimensional “structure” of the original problem, which explains why the Gauss–Newton Hessian matrix is preferred for nonlinear least squares problems, especially when the initial point is far from the solution.
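For context, a standard scalar-residual formulation (assumed here to match the abstract's $(n-1)$-dimensional zero set; the paper's exact setting may differ) makes the claimed property concrete:

$$f(x) = \tfrac{1}{2}\,r(x)^2, \qquad \nabla^2 f(x) = \nabla r(x)\,\nabla r(x)^{\top} + r(x)\,\nabla^2 r(x),$$

$$q_k(x) = \tfrac{1}{2}\bigl(r(x_k) + \nabla r(x_k)^{\top}(x - x_k)\bigr)^2 .$$

The Gauss–Newton Hessian $\nabla r(x_k)\,\nabla r(x_k)^{\top}$ keeps only the first term of $\nabla^2 f$, and the model $q_k$ is a nonnegative convex quadratic that vanishes exactly on the hyperplane $\{x : r(x_k) + \nabla r(x_k)^{\top}(x - x_k) = 0\}$, an $(n-1)$-dimensional affine subspace, mirroring how $f$ itself vanishes on the (generically) $(n-1)$-dimensional manifold $\{x : r(x) = 0\}$.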