Metamodeling
Robustness (evolution)
Computer science
Ensemble learning
Benchmark (surveying)
Leverage (statistics)
Weighting
Ensemble forecasting
Artificial intelligence
Robust optimization
Surrogate model
Global optimization
Machine learning
Mathematical optimization
Algorithm
Mathematics
Programming language
Chemistry
Geography
Radiology
Gene
Medicine
Biochemistry
Geodesy
Authors
Ziliang Miao, Buwei He, Hubocheng Tang, Chen Ji-xiang, Zhenkun Wang
Identifier
DOI: 10.1109/ccis57298.2022.10016330
Abstract
This paper proposes a novel expensive global optimization method, Stacked Ensemble of Metamodels for Expensive Global Optimization (SEMGO), which aims to improve the accuracy and robustness of the surrogate. Because existing metamodel ensemble methods rely on fixed linear weighting strategies, they are prone to bias across different problems. SEMGO instead employs a learning-based second-layer model that adaptively combines the predictions of the first-layer metamodels. SEMGO is compared with three state-of-the-art metamodel ensemble methods on seventeen widely used benchmark problems, and the experimental results show that it performs the best. In addition, the proposed method is applied to a practical chip packaging problem, improving the previous optimization result by a large margin.
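The abstract gives no implementation details, so the following is only a minimal illustrative sketch of the general idea of a stacked surrogate ensemble: several first-layer metamodels whose predictions are combined by a learned second-layer model rather than fixed linear weights. The choice of scikit-learn regressors, the Branin test function, and all parameter values are assumptions for illustration, not the authors' SEMGO implementation.

```python
# Illustrative sketch of a stacked ensemble of metamodels (NOT the authors' code).
# First-layer surrogates are combined by a learned second-layer model instead of
# fixed linear weights; all model choices below are assumptions.
import numpy as np
from sklearn.ensemble import StackingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.kernel_ridge import KernelRidge
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor

def branin(x):
    # Common optimization benchmark, used here as a cheap stand-in for an
    # expensive objective function.
    x1, x2 = x[:, 0], x[:, 1]
    return (x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5 / np.pi * x1 - 6) ** 2 \
        + 10 * (1 - 1 / (8 * np.pi)) * np.cos(x1) + 10

rng = np.random.default_rng(0)
X = rng.uniform([-5, 0], [10, 15], size=(60, 2))   # small "expensive" sample
y = branin(X)

# First layer: diverse metamodels; second layer: a learned combiner (Ridge)
# fitted on cross-validated first-layer predictions.
surrogate = StackingRegressor(
    estimators=[
        ("gp", GaussianProcessRegressor(normalize_y=True)),
        ("krr", KernelRidge(kernel="rbf", alpha=1e-3)),
        ("knn", KNeighborsRegressor(n_neighbors=5)),
    ],
    final_estimator=Ridge(alpha=1e-2),
    cv=5,
)
surrogate.fit(X, y)

# Evaluate surrogate accuracy on unseen points.
X_test = rng.uniform([-5, 0], [10, 15], size=(200, 2))
pred = surrogate.predict(X_test)
print("RMSE on held-out points:", np.sqrt(np.mean((pred - branin(X_test)) ** 2)))
```

In an expensive-optimization loop, such a surrogate would be refitted after each new expensive evaluation and searched cheaply to propose the next candidate point; that loop is omitted here for brevity.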