Computer science
Artificial neural network
Benchmark (computing)
Evolutionary algorithm
Mathematical optimization
Dropout (neural networks)
Bayesian optimization
Multi-objective optimization
Optimization problem
Surrogate model
Machine learning
Artificial intelligence
Algorithm
Mathematics
Geodesy
Geography
Authors
Dan Guo,Xilu Wang,Kailai Gao,Yaochu Jin,Jinliang Ding,Tianyou Chai
Identifier
DOI: 10.1109/TSMC.2020.3044418
Abstract
Gaussian processes (GPs) are widely used in surrogate-assisted evolutionary optimization of expensive problems, mainly because they provide a confidence level for their outputs, which makes it possible to adopt principled surrogate management methods such as the acquisition function used in Bayesian optimization. Unfortunately, GPs become less practical for high-dimensional multiobjective and many-objective optimization because their computational complexity is cubic in the number of training samples. In this article, we propose a computationally efficient dropout neural network (EDN) to replace the Gaussian process, together with a new model management strategy that achieves a good balance between convergence and diversity, to assist evolutionary algorithms in solving high-dimensional multiobjective and many-objective expensive optimization problems. Whereas a conventional dropout neural network must save a large number of network models during training to calculate the confidence level, the EDN needs only a single network model to estimate the fitness and its confidence level, by randomly ignoring neurons during both training and testing. Extensive experimental studies on benchmark problems with up to 100 decision variables and 20 objectives demonstrate that, compared with the state of the art, the proposed algorithm is not only highly competitive in performance but also computationally more scalable to high-dimensional many-objective optimization problems. Finally, the proposed algorithm is validated on an operational optimization problem of crude oil distillation units, further confirming its ability to handle expensive problems under a limited computational budget.
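The confidence estimate described in the abstract relies on keeping dropout active at test time and running several stochastic forward passes through a single network, with the spread of the outputs serving as the confidence level. The sketch below illustrates that mechanism in PyTorch; the network architecture, dropout rate, number of forward passes, and the lower-confidence-bound criterion at the end are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of dropout-based uncertainty estimation with one network:
# dropout stays active during prediction, and repeated stochastic forward passes
# give a fitness estimate (mean) and a confidence level (standard deviation).
# Hidden size, dropout rate, and the number of passes are assumed values.
import torch
import torch.nn as nn


class DropoutSurrogate(nn.Module):
    """Small MLP surrogate with dropout applied in every forward pass."""

    def __init__(self, n_inputs: int, n_objectives: int, hidden: int = 64, p: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, hidden),
            nn.ReLU(),
            nn.Dropout(p),            # dropout remains enabled at prediction time
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Dropout(p),
            nn.Linear(hidden, n_objectives),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

    @torch.no_grad()
    def predict_with_confidence(self, x: torch.Tensor, n_samples: int = 30):
        """Return (mean prediction, std) from repeated stochastic forward passes."""
        self.train()                  # keep dropout active while predicting
        samples = torch.stack([self.net(x) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)


# Usage (training on evaluated samples is omitted): estimate objectives and
# their uncertainty for a batch of candidate solutions.
model = DropoutSurrogate(n_inputs=100, n_objectives=20)
candidates = torch.rand(8, 100)
mean, std = model.predict_with_confidence(candidates)

# An infill criterion such as a lower confidence bound (for minimization) can
# then combine the two, in the spirit of the acquisition functions mentioned above.
kappa = 2.0
lcb = mean - kappa * std
```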