Keywords
Overfitting; Hyperparameter optimization; Particle swarm optimization; Gradient boosting; Computer science; Support vector machine; Boosting (machine learning); Hyperparameter; Artificial intelligence; Radio frequency; Mathematical optimization; Machine learning; Artificial neural network; Mathematics; Telecommunications; Random forest
Authors
Jiayi Wang, Shaohua Zhou
Abstract
XGBoost is an optimized form of gradient boosting with among the best overall performance of the machine learning algorithms considered here. By introducing a regularization term into the loss function of gradient boosting, XGBoost can effectively limit model complexity, improve generalization ability, and mitigate overfitting. In this paper, XGBoost is first introduced into the modeling of radio-frequency (RF) power amplifiers (PAs) under different temperatures. The modeling performance of XGBoost depends strongly on its hyperparameters, and traditional grid search is time-consuming and labor-intensive, so this paper combines particle swarm optimization (PSO) with XGBoost to search for hyperparameters. The experimental results show that XGBoost effectively suppresses the overfitting observed in gradient boosting when modeling RF PAs at different ambient temperatures. In addition, compared with classic machine learning algorithms, including support vector regression (SVR), gradient boosting, and XGBoost itself, the proposed PSO-XGBoost improves modeling accuracy by one order of magnitude or more while also increasing modeling speed by more than one order of magnitude. The PSO-XGBoost model proposed in this paper can be applied to the modeling of other microwave/RF devices and circuits to improve accuracy and reduce modeling time.
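The abstract describes replacing grid search with PSO to tune XGBoost's hyperparameters. As a minimal sketch of the PSO search loop, the snippet below optimizes over a two-dimensional hyperparameter box; the `objective` function here is a hypothetical smooth stand-in for the model's validation error (in practice it would train an XGBoost model with the candidate hyperparameters and return its validation loss). All names, bounds, and PSO coefficients are illustrative assumptions, not the authors' actual settings.

```python
import numpy as np

def objective(x):
    # Hypothetical error surface standing in for XGBoost validation loss;
    # its minimum is placed at (0.1, 1.0) for illustration.
    return (x[0] - 0.1) ** 2 + (x[1] - 1.0) ** 2

def pso(obj, bounds, n_particles=20, n_iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic particle swarm optimization over a box-bounded search space."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))  # initial positions
    vel = np.zeros_like(pos)                            # initial velocities
    pbest = pos.copy()                                  # per-particle bests
    pbest_val = np.array([obj(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()                # global best
    g_val = pbest_val.min()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if vals.min() < g_val:
            g, g_val = pos[vals.argmin()].copy(), vals.min()
    return g, g_val

# Illustrative search ranges, e.g. learning rate and a regularization term.
bounds = np.array([[0.01, 0.3], [0.1, 10.0]])
best, best_val = pso(objective, bounds)
```

In a real PSO-XGBoost setup, each particle position would encode hyperparameters such as the learning rate, tree depth, and regularization weight, and `objective` would return the cross-validated error of the resulting model.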