Computer science
Initialization
Gradient descent
Architecture
Artificial neural network
Evolutionary algorithm
Artificial intelligence
Network architecture
Search cost
Machine learning
Art
Computer security
Microeconomics
Economics
Visual arts
Programming language
Authors
Zicheng Cai, Lei Chen, Shaoda Zeng, Yutao Lai, Hai-Lin Liu
Identifier
DOI:10.1016/j.asoc.2023.110624
Abstract
Using weight-sharing and continuous relaxation strategies, gradient descent-based differentiable architecture search has achieved great success in automatically designing neural network architectures. However, two unresolved issues, namely the local-optimum dilemma of the gradient descent method and the performance collapse of searched architectures containing too many unreasonable operations, continue to frustrate researchers and practitioners. To address these two issues, a novel and efficient neural architecture search approach based on a hybrid evolutionary strategy, termed EST-NAS, is proposed in this paper. In particular, we propose a new evolutionary strategy that explores diverse search directions on top of gradient descent-based neural architecture search, aiming to obtain a better architecture. In the proposed EST-NAS, the gradient descent architecture search is performed first, and the best architecture it finds is then used as an efficient initialization for the subsequent evolutionary strategy-based architecture search. By hybridizing the evolutionary strategy with gradient descent-based search, EST-NAS improves the performance of the searched architecture while retaining good search efficiency. Meanwhile, validation accuracy is used to directly measure the importance of operations, which reduces the error in relating an operation to task performance. Extensive experimental results on various datasets across different search spaces show that the proposed EST-NAS achieves remarkably competitive performance with less search cost compared to other state-of-the-art NAS approaches.
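The two-stage pipeline the abstract describes can be sketched on a toy search space. This is a minimal illustrative assumption, not the paper's actual implementation: the operation names, the fixed score table standing in for measured validation accuracy, and the closed-form surrogate objective are all invented for the sketch. Stage 1 runs gradient ascent on a continuous (softmax) relaxation of the architecture parameters and discretizes; Stage 2 runs a simple (1+1) evolutionary strategy seeded by that discretized architecture, using the (toy) validation fitness directly.

```python
import math
import random

random.seed(0)

N_POSITIONS = 4                        # positions in the cell to assign an op
OPS = ["skip", "conv3x3", "conv5x5"]   # toy candidate operations (hypothetical)

# Toy stand-in for validation accuracy per (position, operation).
# In the real method this would come from training/evaluating networks.
SCORES = [[0.2, 0.9, 0.5],
          [0.4, 0.3, 0.8],
          [0.7, 0.6, 0.1],
          [0.3, 0.8, 0.4]]

def fitness(arch):
    """Discrete fitness: mean per-position score (proxy for val accuracy)."""
    return sum(SCORES[i][op] for i, op in enumerate(arch)) / N_POSITIONS

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# --- Stage 1: gradient-based search over a continuous relaxation ---
# alpha[i][k] are architecture parameters; the relaxed objective is the
# softmax-weighted score, whose gradient has a closed form in this toy setup.
alpha = [[0.0] * len(OPS) for _ in range(N_POSITIONS)]
lr = 0.5
for _ in range(100):
    for i in range(N_POSITIONS):
        p = softmax(alpha[i])
        expected = sum(p[k] * SCORES[i][k] for k in range(len(OPS)))
        for k in range(len(OPS)):
            # d(expected score)/d(alpha_k) for a softmax mixture
            grad = p[k] * (SCORES[i][k] - expected)
            alpha[i][k] += lr * grad   # gradient ascent on the surrogate

# Discretize: keep the strongest operation at each position.
gd_arch = [max(range(len(OPS)), key=lambda k: alpha[i][k])
           for i in range(N_POSITIONS)]

# --- Stage 2: evolutionary strategy seeded by the gradient result ---
# A (1+1) ES: point-mutate the incumbent, keep the child if it is no worse.
best, best_fit = gd_arch[:], fitness(gd_arch)
for _ in range(50):
    child = best[:]
    pos = random.randrange(N_POSITIONS)
    child[pos] = random.randrange(len(OPS))   # mutate one operation choice
    f = fitness(child)
    if f >= best_fit:
        best, best_fit = child, f

print([OPS[k] for k in best], round(best_fit, 3))
```

The seeding step is the key design choice the abstract highlights: instead of starting the evolutionary search from a random population, it starts from the architecture the gradient stage already found, so the ES only has to explore directions the relaxation may have missed.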