Computer science
Artificial neural network
Artificial intelligence
Crossover
Evolutionary algorithm
Genetic algorithm
Evolutionary computation
Deep learning
Adjacency list
Machine learning
Algorithm
Authors
Yu Xue, Chen Chen, Adam Słowik
Identifier
DOI: 10.1109/tevc.2023.3252612
Abstract
With the emergence of deep neural networks, many research fields, such as image classification, object detection, speech recognition, natural language processing, machine translation, and autonomous driving, have made major technological breakthroughs, and the research achievements have been successfully applied in many real-life applications. Combining evolutionary computation with neural architecture search (NAS) is an important approach to improving the performance of deep neural networks. However, related research usually focuses only on precision, so the searched neural architectures often perform poorly on other metrics such as time cost. In this article, a multi-objective evolutionary algorithm with a probability stack (MOEA-PS) is proposed for NAS, which considers two objectives: precision and time consumption. MOEA-PS uses an adjacency list to represent the internal structure of deep neural networks. In addition, a unique mechanism is introduced into the multi-objective genetic algorithm to guide the crossover and mutation process when generating offspring. Furthermore, the structure blocks are stacked using a proxy model to generate deep neural networks. The results of experiments on Cifar-10 and Cifar-100 demonstrate that the proposed algorithm achieves an error rate similar to the most advanced NAS algorithms at a lower time cost. Finally, the network structure searched on Cifar-10 is transferred directly to the ImageNet dataset, achieving 73.6% classification accuracy.
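To make the abstract's two key ideas concrete, the sketch below shows (a) how a candidate architecture cell might be encoded as an adjacency list (node → predecessor edges, each tagged with an operation) and (b) a Pareto-dominance test over the two objectives (error rate, time cost). This is a minimal illustration under assumed conventions, not the authors' actual MOEA-PS implementation; the operation set and helper names are hypothetical.

```python
import random

# Hypothetical operation set for edges in the cell; MOEA-PS's real
# search space may differ.
OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]

def random_cell(num_nodes=5, seed=None):
    """Random DAG cell as an adjacency list: node i receives edges
    only from earlier nodes, so the graph is acyclic by construction."""
    rng = random.Random(seed)
    cell = {}
    for i in range(1, num_nodes):
        preds = rng.sample(range(i), k=min(2, i))
        cell[i] = [(p, rng.choice(OPS)) for p in preds]
    return cell

def mutate(cell, rng):
    """Point mutation: re-draw the operation on one random edge,
    leaving the connectivity of the cell unchanged."""
    child = {n: list(edges) for n, edges in cell.items()}
    node = rng.choice(list(child))
    idx = rng.randrange(len(child[node]))
    pred, _ = child[node][idx]
    child[node][idx] = (pred, rng.choice(OPS))
    return child

def dominates(a, b):
    """Pareto dominance for (error, time): a dominates b when it is
    no worse in both objectives and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
```

In a multi-objective NAS loop, `dominates` would drive non-dominated sorting of the population, while `mutate` (together with a crossover operator) generates offspring cells to evaluate.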