Lanczos resampling
Conjugate gradient method
Mathematics
Applied mathematics
Residual
Least squares function approximation
Stability (learning theory)
Algorithm
Eigenvector
Mathematical optimization
Statistics
Computer science
Quantum mechanics
Machine learning
Physics
Estimator
Authors
Åke Björck, Tommy Elfving, Zdeněk Strakoš
Identifier
DOI: 10.1137/s089547989631202x
Abstract
The conjugate gradient method applied to the normal equations A^T A x = A^T b (CGLS) is often used for solving large sparse linear least squares problems. The mathematically equivalent algorithm LSQR, based on the Lanczos bidiagonalization process, is an often recommended alternative. In this paper, the achievable accuracy of different conjugate gradient and Lanczos methods in finite precision is studied. It is shown that an implementation of algorithm CGLS in which the residual s_k = A^T(b - Ax_k) of the normal equations is recurred will not in general achieve accurate solutions. The same conclusion holds for the method based on Lanczos bidiagonalization with starting vector A^T b. For the preferred implementation of CGLS we bound the error ||r - r_k|| of the computed residual r_k. Numerical tests are given that confirm a conjecture of backward stability. The achievable accuracy of LSQR is shown to be similar. The analysis essentially also covers the preconditioned case.