Keywords
Frequentist inference, Mathematics, Estimator, Model selection, Generalized linear model, Applied mathematics, Linear model, Statistics, Dimension (graph theory), Mathematical optimization, Bayesian inference, Bayesian probability, Pure mathematics
Authors
Chaoxia Yuan, Fang Fang, Jialiang Li
Source
Journal: Econometric Reviews
Date: 2023-11-22
Volume/Issue: 43 (1): 71-96
Identifier
DOI:10.1080/07474938.2023.2280825
Abstract
Although many frequentist model averaging methods have been proposed, existing weight selection criteria for generalized linear models (GLM) are usually based on a model-size-penalized Kullback-Leibler (KL) loss or simply on cross-validation. In this article, when the data are generated from an exponential distribution, we propose a novel model averaging approach for GLM motivated by an asymptotically unbiased estimator of the KL loss penalized by an "effective model size" that incorporates the model misspecification. When all the candidate models are misspecified, the proposed method achieves asymptotic optimality while allowing both the number of candidate models and the dimension of the covariates to diverge. Furthermore, when correct models are included in the candidate model set, we prove that the weights of the wrong candidate models converge to zero, and hence the weighted regression coefficient estimator is consistent. Simulation studies and two real-data examples demonstrate the advantage of our new method over existing frequentist model averaging methods.
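The abstract describes choosing model averaging weights by minimizing a penalized loss criterion over candidate models. As a hedged illustration of the general frequentist model averaging idea (not the authors' GLM criterion), the sketch below fits nested linear candidate models and selects weights on the simplex by minimizing a Mallows-type criterion, i.e. residual sum of squares plus a model-size penalty; all variable names and the simulated data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data: only the first three covariates matter (illustrative setup).
rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.8, 0.5, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(size=n)

# Nested candidate models using the first k covariates, k = 1..p.
fits, sizes = [], []
for k in range(1, p + 1):
    Xk = X[:, :k]
    coef, *_ = np.linalg.lstsq(Xk, y, rcond=None)
    fits.append(Xk @ coef)
    sizes.append(k)
F = np.column_stack(fits)          # n x M matrix of candidate fitted values
sizes = np.array(sizes, dtype=float)
M = F.shape[1]

# Error variance estimated from the largest candidate model.
sigma2 = np.sum((y - F[:, -1]) ** 2) / (n - p)

def mallows(w):
    # RSS of the weighted fit plus a penalty on the weighted model size.
    resid = y - F @ w
    return resid @ resid + 2.0 * sigma2 * (sizes @ w)

# Minimize over the probability simplex: w >= 0, sum(w) = 1.
cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
res = minimize(mallows, np.full(M, 1.0 / M), bounds=[(0.0, 1.0)] * M,
               constraints=cons, method="SLSQP")
weights = res.x
print(np.round(weights, 3))
```

In this toy setup, underfitted candidates (k < 3) incur a large residual term while overfitted ones pay only the size penalty, so the optimizer concentrates weight near the correctly sized models, mirroring the paper's claim that weights on wrong models vanish when correct models are in the candidate set.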