Computer Science
Evolutionary Algorithm
Artificial Neural Network
Artificial Intelligence
Search Algorithm
Exploitation
Architecture
Evolutionary Computation
Machine Learning
Domain (Mathematical Analysis)
Algorithm
Mathematics
Art
Mathematical Analysis
Computer Security
Visual Arts
Identifier
DOI: 10.1109/ijcnn52387.2021.9533986
Abstract
Designing advanced neural architectures to tackle specific tasks involves weeks or even months of intensive investigation by experts with rich domain knowledge. In recent years, neural architecture search (NAS) has attracted the interest of many researchers due to its ability to automatically design efficient neural architectures. Among different search strategies, evolutionary algorithms have achieved significant successes as derivative-free optimization algorithms. However, the tremendous computational resource consumption of the evolutionary neural architecture search dramatically restricts its application. In this paper, we explore how fitness approximation-based evolutionary algorithms can be applied to neural architecture search and propose NAS-EA-FA to accelerate the search process. We further exploit data augmentation and diversity of neural architectures to enhance the algorithm, and present NAS-EA-FA V2. Experiments show that NAS-EA-FA V2 is at least five times faster than other state-of-the-art neural architecture search algorithms like regularized evolution and iterative neural predictor on NASBench-101, and it is also the most effective and stable algorithm on NASBench-201. All the code used in this paper is available at https://github.com/fzjcdt/NAS-EA-FA.
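To make the fitness-approximation idea concrete, below is a minimal, hypothetical sketch of a surrogate-assisted evolutionary NAS loop. It is not the authors' NAS-EA-FA implementation: the architecture encoding, the toy objective `true_fitness`, and the choice of a random-forest surrogate are all illustrative assumptions. The key step it demonstrates is ranking mutated candidates by a cheap predicted fitness and spending expensive true evaluations only on the top few.

```python
# Hypothetical sketch of fitness-approximation-based evolutionary NAS.
# Names and the random-forest surrogate are illustrative assumptions,
# not the actual NAS-EA-FA code (see https://github.com/fzjcdt/NAS-EA-FA).
import random
from sklearn.ensemble import RandomForestRegressor

OPS = ["conv3x3", "conv1x1", "maxpool3x3"]
NUM_NODES = 5

def random_arch():
    # An architecture is encoded as a fixed-length list of operation indices.
    return [random.randrange(len(OPS)) for _ in range(NUM_NODES)]

def mutate(arch, rate=0.2):
    # Point mutation: resample each gene with a small probability.
    child = list(arch)
    for i in range(len(child)):
        if random.random() < rate:
            child[i] = random.randrange(len(OPS))
    return child

def true_fitness(arch):
    # Placeholder for the expensive evaluation (training the network, or a
    # table lookup on NASBench-101/201 in the paper's benchmark setting).
    return -sum((g - 1) ** 2 for g in arch)  # toy objective

def search(generations=10, pop_size=20, k_real_evals=5):
    population = [random_arch() for _ in range(pop_size)]
    archive_x, archive_y = [], []  # every truly evaluated architecture

    # Bootstrap the surrogate with real evaluations of the first population.
    for arch in population:
        archive_x.append(arch)
        archive_y.append(true_fitness(arch))

    for _ in range(generations):
        # Refit the fitness approximator on all ground-truth data so far.
        surrogate = RandomForestRegressor(n_estimators=50)
        surrogate.fit(archive_x, archive_y)

        # Generate many candidates cheaply by mutating current members.
        candidates = [mutate(random.choice(population))
                      for _ in range(pop_size * 5)]

        # Rank by *predicted* fitness; only the top-k receive the expensive
        # true evaluation -- this is the fitness-approximation step.
        predicted = surrogate.predict(candidates)
        ranked = sorted(zip(predicted, candidates),
                        key=lambda p: p[0], reverse=True)
        for _, arch in ranked[:k_real_evals]:
            archive_x.append(arch)
            archive_y.append(true_fitness(arch))

        # Next population: best truly evaluated architectures so far.
        best = sorted(zip(archive_y, archive_x),
                      key=lambda p: p[0], reverse=True)
        population = [arch for _, arch in best[:pop_size]]

    return best[0]

if __name__ == "__main__":
    fitness, arch = search()
    print("best fitness:", fitness, "arch:", [OPS[i] for i in arch])
```

In this sketch only `k_real_evals` architectures per generation pay the full evaluation cost, while the surrogate filters the rest; the paper's V2 variant additionally exploits data augmentation and architecture diversity, which are not modeled here.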