Computer science
Artificial intelligence
Surrogate model
Machine learning
Benchmark (surveying)
Evolutionary algorithm
Constraint (computer-aided design)
Mathematical optimization
Mathematics
Geometry
Geodesy
Geography
Authors
Kuangda Lyu, Hao Li, Maoguo Gong, Lining Xing, A. K. Qin
Identifier
DOI: 10.1109/tevc.2023.3319567
Abstract
Multiobjective neural architecture search (MONAS) methods based on evolutionary algorithms (EAs) are inefficient when evaluating each candidate architecture requires training its parameters from scratch. Surrogate-assisted MONAS is itself challenging: surrogate construction suffers from a cold-start problem, and evaluating the architectures the surrogate predicts to be promising can still be costly. Previously solved MONAS problems, however, are likely to carry knowledge useful for the current one. To exploit the knowledge from these previous practices, a framework for large-scale knowledge transfer is proposed. Through sparse-constraint transfer stacking, an informative surrogate for the current problem can be constructed readily. Through knee-region knowledge distillation from the learned parameters of previously obtained nondominated architectures, the evaluation of current architectures becomes efficient and credible. To prevent negative transfer from irrelevant problems, an iterative source selection algorithm is designed. The proposed framework is analyzed under different combinations of source and target MONAS problems. Results show that, with the help of this framework, architectures with competitive performance can be found under a limited evaluation budget.
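As a rough illustration of the transfer-stacking idea mentioned above, the Python sketch below combines surrogates from previously solved problems through a sparsity-constrained linear combiner fitted on the few architectures already evaluated for the current problem. The function and variable names, and the use of scikit-learn's Lasso and a random forest as the combiner and target-side base learner, are illustrative assumptions rather than the paper's exact formulation.

# A minimal sketch of sparse-constraint transfer stacking, assuming each
# source surrogate is a fitted regressor mapping architecture encodings to a
# performance score. Names and model choices here are illustrative only.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor


def build_transfer_stacking_surrogate(source_surrogates, X_target, y_target, alpha=0.05):
    """Combine source-task surrogates into a surrogate for the target task.

    source_surrogates: list of fitted regressors from previously solved problems.
    X_target, y_target: the few architectures already evaluated on the target task.
    alpha: strength of the L1 (sparsity) constraint; larger values drop more sources.
    """
    # A small target-only surrogate trained on the scarce target evaluations.
    target_model = RandomForestRegressor(n_estimators=100).fit(X_target, y_target)
    base_models = list(source_surrogates) + [target_model]

    # Each base model's predictions on the target data become one stacking feature.
    Z = np.column_stack([m.predict(X_target) for m in base_models])

    # The L1 penalty zeroes out the weights of unhelpful sources, which is one
    # way to limit negative transfer when few sources are related to the target.
    combiner = Lasso(alpha=alpha, positive=True).fit(Z, y_target)

    def surrogate(X_new):
        Z_new = np.column_stack([m.predict(X_new) for m in base_models])
        return combiner.predict(Z_new)

    return surrogate, combiner.coef_  # nonzero coefficients show which sources were kept

Because the L1 penalty drives the weights of unhelpful sources toward zero, the combiner also gives a rough indication of which source problems are worth transferring from, which is the concern the iterative source selection algorithm addresses in the proposed framework.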