Keywords: Artificial neural network, Computer science, Architecture, Artificial intelligence, Machine learning, Bayesian probability, Evolutionary algorithm, Random search, Bayesian optimization, Graph, Search algorithm, Theoretical computer science, Algorithm, Art, Visual arts
Authors
Chen Wei, Chuang Niu, Yiping Tang, Yue Wang, Haihong Hu, Jimin Liang
Identifier
DOI:10.1109/tnnls.2022.3151160
Abstract
Neural architecture search (NAS) adopts a search strategy to explore a predefined search space and find a superior architecture at minimum search cost. Bayesian optimization (BO) and evolutionary algorithms (EAs) are two commonly used search strategies, but they are computationally expensive, challenging to implement, and explore the space inefficiently. In this article, we propose a neural-predictor-guided EA to enhance the exploration ability of EA for NAS (NPENAS) and design two kinds of neural predictors. The first predictor is a BO acquisition function for which we design a graph-based uncertainty estimation network as the surrogate model. The second predictor is a graph-based neural network that directly predicts the performance of the input neural architecture. The NPENAS variants using the two neural predictors are denoted NPENAS-BO and NPENAS-NP, respectively. In addition, we introduce a new random architecture sampling method that overcomes the drawbacks of the existing sampling method. Experimental results on five NAS search spaces indicate that NPENAS-BO and NPENAS-NP outperform most existing NAS algorithms, with NPENAS-NP achieving state-of-the-art performance on four of the five search spaces.
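The core idea the abstract describes, an evolutionary search in which a learned performance predictor ranks mutated candidates so that only the most promising ones pay the full evaluation cost, can be sketched in miniature. The sketch below is an illustration only, not the paper's implementation: architectures are stand-in bit strings, the "true" fitness is a toy function in place of training a network, and the surrogate is a hypothetical Hamming-distance nearest-neighbour regressor standing in for the paper's graph-based neural predictor.

```python
import random

def true_fitness(arch):
    # Toy stand-in for the expensive train-and-evaluate step: count of 1-bits.
    return sum(arch)

def predict(arch, history, k=3):
    # Hypothetical surrogate: average fitness of the k nearest evaluated
    # architectures by Hamming distance (stands in for a neural predictor).
    nearest = sorted(history,
                     key=lambda h: sum(a != b for a, b in zip(arch, h[0])))[:k]
    return sum(f for _, f in nearest) / len(nearest)

def mutate(arch, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return tuple(b ^ (random.random() < rate) for b in arch)

def predictor_guided_ea(n_bits=12, pop=10, n_candidates=30, top_k=5,
                        generations=10, seed=0):
    random.seed(seed)
    initial = [tuple(random.randint(0, 1) for _ in range(n_bits))
               for _ in range(pop)]
    history = [(a, true_fitness(a)) for a in initial]
    for _ in range(generations):
        # Mutate the best evaluated architectures to propose candidates.
        parents = sorted(history, key=lambda h: h[1], reverse=True)[:pop]
        cands = [mutate(random.choice(parents)[0]) for _ in range(n_candidates)]
        # Rank candidates with the cheap surrogate; fully evaluate only top-k.
        cands.sort(key=lambda a: predict(a, history), reverse=True)
        history += [(a, true_fitness(a)) for a in cands[:top_k]]
    return max(history, key=lambda h: h[1])

best_arch, best_fit = predictor_guided_ea()
```

Because the surrogate filters candidates before evaluation, each generation spends the full-evaluation budget on `top_k` architectures rather than all `n_candidates`, which is the efficiency argument the abstract makes for guiding the EA with a predictor.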