Collinearity
Estimator
Ordinary least squares
Mean squared error
Mathematics
Statistics
Bias of an estimator
James–Stein estimator
Regression
Regression analysis
Linear regression
Applied mathematics
Minimum-variance unbiased estimator
Identifier
DOI:10.1081/sta-120019959
Abstract
The linear regression model and the least squares method are widely used in many fields of the natural and social sciences. In the presence of collinearity, the least squares estimator is unstable and often gives misleading information. Ridge regression is the most common method for overcoming this problem. We find that when severe collinearity exists, the shrinkage parameter selected by existing methods for ridge regression may not fully address the ill-conditioning problem. To solve this problem, we propose a new two-parameter estimator. Using both theoretical results and simulation, we show that our new estimator has two advantages over ridge regression. First, our estimator has smaller mean squared error (MSE). Second, our estimator fully addresses the ill-conditioning problem. A numerical example from the literature is used to illustrate the results.
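To illustrate the problem the abstract describes, the following is a minimal sketch (not the paper's proposed two-parameter estimator) comparing ordinary least squares with standard ridge regression on nearly collinear predictors. The data, the shrinkage parameter `k`, and all variable names are illustrative assumptions; the point is only that an ill-conditioned cross-product matrix destabilizes OLS, while adding `k·I` stabilizes the solution.

```python
import numpy as np

# Two nearly collinear predictors: x2 is x1 plus tiny noise.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 1e-4 * rng.normal(size=n)
X = np.column_stack([x1, x2])

# True coefficients and response with moderate noise.
beta_true = np.array([1.0, 1.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# OLS: solve (X'X) b = X'y; unstable because X'X is ill conditioned.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: solve (X'X + kI) b = X'y with an illustrative shrinkage k > 0.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)

print("condition number of X'X:", np.linalg.cond(X.T @ X))
print("OLS coefficients:  ", beta_ols)
print("ridge coefficients:", beta_ridge)
```

Here the OLS coefficients can deviate wildly in the direction where `X'X` is nearly singular, while the ridge coefficients stay close to the true values; the paper's contribution concerns choosing the shrinkage so that such ill conditioning is fully removed.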