Computer science
Hyperparameter
Benchmark (surveying)
Scalability
Human multitasking
Multi-objective optimization
Evolutionary algorithm
Bayes' theorem
Machine learning
Artificial intelligence
Data mining
Mathematical optimization
Bayesian probability
Mathematics
Database
Cognitive psychology
Psychology
Geography
Geodesy
Authors
Zefeng Chen, Abhishek Gupta, Lei Zhou, Yew-Soon Ong
Identifier
DOI:10.1109/tcyb.2022.3214825
Abstract
In an era of pervasive digitalization, the growing volume and variety of data streams pose a new challenge to the efficient running of data-driven optimization algorithms. Targeting scalable multiobjective evolution under large-instance data, this article proposes the general idea of using subsampled small-data tasks as helpful minions (i.e., auxiliary source tasks) to quickly optimize for large datasets via an evolutionary multitasking framework. Within this framework, a novel computational resource allocation strategy is designed to enable the effective utilization of the minions while guarding against harmful negative transfers. To this end, an intertask empirical correlation measure is defined and approximated via Bayes' rule, which is then used to allocate resources online in proportion to the inferred degree of source-target correlation. In the experiments, the performance of the proposed algorithm is verified on: 1) sample average approximations of benchmark multiobjective optimization problems under uncertainty and 2) practical multiobjective hyperparameter tuning of deep neural network models. The results show that the proposed algorithm can obtain up to about 73% speedup relative to existing approaches, demonstrating its ability to efficiently tackle real-world multiobjective optimization involving evaluations on large datasets.
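The core mechanism described in the abstract, estimating an intertask correlation via Bayes' rule and splitting the evaluation budget online in proportion to it while guarding against negative transfer, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: the class and function names, the Beta-Bernoulli update used as a stand-in for the paper's empirical correlation measure, the exploration floor, and the 50% budget reserved for the target task are all illustrative assumptions.

```python
import numpy as np


class MinionAllocator:
    """Illustrative sketch: divide per-generation evaluations between a
    large-data target task and subsampled 'minion' source tasks, in
    proportion to a Bayesian estimate of how often transfers from each
    source improve the target (a simple proxy for the paper's intertask
    empirical correlation measure)."""

    def __init__(self, num_sources, budget_per_gen, min_share=0.05):
        # Beta(1, 1) prior per source: alpha counts helpful transfers,
        # beta counts unhelpful ones.
        self.alpha = np.ones(num_sources)
        self.beta = np.ones(num_sources)
        self.budget = budget_per_gen
        self.min_share = min_share  # small floor so every source keeps being probed

    def record_transfer(self, source_idx, improved_target):
        # Bayes-rule style posterior update after observing whether a
        # solution transferred from this source improved the target task.
        if improved_target:
            self.alpha[source_idx] += 1
        else:
            self.beta[source_idx] += 1

    def allocate(self):
        # Posterior mean serves as the inferred source-target correlation.
        corr = self.alpha / (self.alpha + self.beta)
        # Suppress likely negative-transfer sources but keep a small floor.
        shares = np.maximum(corr, self.min_share)
        shares = shares / shares.sum()
        # Reserve half the budget for the target itself (an assumption),
        # split the rest among sources in proportion to inferred correlation.
        target_evals = self.budget // 2
        source_evals = np.floor(shares * (self.budget - target_evals)).astype(int)
        return target_evals, source_evals


if __name__ == "__main__":
    alloc = MinionAllocator(num_sources=3, budget_per_gen=200)
    alloc.record_transfer(0, improved_target=True)
    alloc.record_transfer(1, improved_target=False)
    print(alloc.allocate())
```

In this reading, sources whose transferred solutions keep improving the target accumulate evidence and receive more evaluations in later generations, while poorly correlated sources are throttled toward the floor, which is one simple way to realize the online, correlation-proportional allocation the abstract describes.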