Two-Stage Evolutionary Neural Architecture Search for Transfer Learning

Keywords: Computer Science, Artificial Intelligence, Transfer Learning, Convolutional Neural Network, Evolutionary Algorithm, Subnetwork, Machine Learning, Task (Project Management), Artificial Neural Network, Deep Learning, Network Architecture, Evolutionary Computation, Computer Security, Management, Economics
Authors
Yu-Wei Wen, Sheng-Hsuan Peng, Chuan-Kang Ting
Source
Journal: IEEE Transactions on Evolutionary Computation [Institute of Electrical and Electronics Engineers]
Volume/Issue: 25 (5): 928-940; Cited by: 34
Identifier
DOI: 10.1109/tevc.2021.3097937
Abstract

Convolutional neural networks (CNNs) have achieved state-of-the-art performance in many image classification tasks. However, training a deep CNN requires a massive amount of training data, which can be expensive or unobtainable in practical applications, such as defect inspection and medical diagnosis. Transfer learning has been developed to address this issue by transferring knowledge learned from source domains to target domains. A common approach is fine-tuning, which adapts the parameters of a trained neural network for the new target task. Nevertheless, the network architecture remains designed for the source task rather than the target task. To optimize the network architecture in transfer learning, we propose a two-stage evolutionary neural architecture search for transfer learning (EvoNAS-TL), which searches for an efficient subnetwork of the source model for the target task. EvoNAS-TL features two search stages: 1) structure search and 2) local enhancement. The former conducts a coarse-grained global search for suitable neural architectures, while the latter acts as a fine-grained local search to refine the models obtained. In this study, neural architecture search (NAS) is formulated as a multiobjective optimization problem that concurrently minimizes the prediction error and model size. The knee-guided multiobjective evolutionary algorithm, a modern multiobjective optimization approach, is employed to solve the NAS problem. In this study, several experiments are conducted to examine the effectiveness of EvoNAS-TL. The results show that applying EvoNAS-TL on VGG-16 can reduce the model size by 52%–85% and simultaneously improve the testing accuracy by 0.7%–6.9% in transferring from ImageNet to CIFAR-10 and NEU surface detection datasets. In addition, EvoNAS-TL performs comparably to or better than state-of-the-art methods on the CIFAR-10, NEU, and Office-31 datasets.
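
For illustration, below is a minimal, self-contained Python sketch of the bi-objective setting described in the abstract: each candidate architecture is a binary mask selecting layers of a source model, it is scored on (prediction error, model size), and a simple knee-point heuristic picks a compromise solution from the Pareto front. The layer parameter counts, the surrogate error function, and the knee rule are assumptions made for demonstration only; they are not the authors' EvoNAS-TL or knee-guided MOEA implementation, where the error objective would come from actually training and validating the decoded subnetwork on the target task.

```python
import random

# Toy sketch of the bi-objective NAS formulation (minimize error and size).
# Encoding, surrogate error, and knee heuristic are illustrative assumptions,
# not the EvoNAS-TL algorithm itself.

# Hypothetical per-layer parameter counts of a source model (VGG-like scale).
LAYER_PARAMS = [1728, 36864, 73728, 147456, 294912, 589824, 589824, 1179648]

def random_candidate(n_layers=len(LAYER_PARAMS)):
    """Binary mask over source-model layers: 1 = keep the layer, 0 = drop it."""
    return [random.randint(0, 1) for _ in range(n_layers)]

def evaluate(candidate):
    """Return (error, size). Error here is a synthetic placeholder; in practice
    it would be the validation error of the decoded, fine-tuned subnetwork."""
    size = sum(p for keep, p in zip(candidate, LAYER_PARAMS) if keep)
    kept = sum(candidate)
    error = 1.0 / (1 + kept) + random.uniform(0.0, 0.05)  # placeholder surrogate
    return error, size

def dominates(a, b):
    """Pareto dominance for minimization of both objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(evals):
    """Indices of non-dominated candidates."""
    return [i for i, a in enumerate(evals)
            if not any(dominates(b, a) for j, b in enumerate(evals) if j != i)]

def knee_index(front, evals):
    """Pick the front member farthest from the line joining the two lexicographic
    extremes of the normalized front (a common knee heuristic, assumed here)."""
    pts = [evals[i] for i in front]
    lo = [min(p[k] for p in pts) for k in range(2)]
    hi = [max(p[k] for p in pts) for k in range(2)]
    norm = [tuple((p[k] - lo[k]) / (hi[k] - lo[k] + 1e-12) for k in range(2)) for p in pts]
    (x1, y1), (x2, y2) = min(norm), max(norm)
    def dist(p):  # proportional to point-to-line distance (denominator omitted)
        x0, y0 = p
        return abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
    return front[max(range(len(norm)), key=lambda i: dist(norm[i]))]

if __name__ == "__main__":
    population = [random_candidate() for _ in range(20)]
    evals = [evaluate(c) for c in population]
    front = pareto_front(evals)
    knee = knee_index(front, evals)
    print("Pareto front size:", len(front))
    print("Knee candidate:", population[knee], "objectives:", evals[knee])
```

In the paper's two-stage scheme, a selection of this kind would sit inside the coarse-grained structure search, with the subsequent local-enhancement stage refining the chosen subnetworks.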