Authors
Yu Liang,Maozhen Li,Changjun Jiang,Guanjun Liu
Identifier
DOI:10.1109/tnnls.2021.3133127
Abstract
Lightweight convolutional neural networks (CNNs) rely heavily on the design of lightweight convolutional modules (LCMs). For an LCM, lightweight design based on repetitive feature maps (LoR) is currently one of the most effective approaches. An LoR mainly involves an extraction of feature maps from convolutional layers (CE) and feature map regeneration through cheap operations (RO). However, existing LoR approaches carry out lightweight improvements only from the aspect of RO but ignore the problems of poor generalization, low stability, and high computation workload incurred in the CE part. To alleviate these problems, this article introduces the concept of key features from a CNN model interpretation perspective. Subsequently, it presents a novel LCM, namely CEModule, focusing on the CE part. CEModule increases the number of key features to maintain a high level of classification accuracy. In the meantime, CEModule employs a group convolution strategy to reduce the floating-point operations (FLOPs) incurred in the training process. Finally, this article brings forth a dynamic adaptation algorithm (α-DAM) to enhance the generalization of CEModule-enabled lightweight CNN models, including the developed CENet, in dealing with datasets of different scales. Compared with state-of-the-art results, CEModule reduces FLOPs by up to 54% on CIFAR-10 while maintaining a similar level of classification accuracy. On ImageNet, CENet increases accuracy by 1.2% under the same FLOPs and training strategies.
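The FLOP reduction attributed to group convolution in the abstract can be illustrated with a back-of-the-envelope sketch. This is a minimal illustration of the general group-convolution cost model, not the paper's actual CEModule; the function name and the layer dimensions below are illustrative assumptions.

```python
def conv2d_flops(h, w, c_in, c_out, k, groups=1):
    """Multiply-accumulate count for a k x k convolution producing an
    h x w x c_out output. With `groups` > 1, each output channel only
    sees c_in // groups input channels, dividing the cost by the group
    count (the standard group-convolution saving)."""
    assert c_in % groups == 0 and c_out % groups == 0
    return h * w * k * k * (c_in // groups) * c_out

# Hypothetical layer: 3x3 conv, 64 -> 64 channels, 32x32 feature map.
dense = conv2d_flops(32, 32, 64, 64, 3)            # standard convolution
grouped = conv2d_flops(32, 32, 64, 64, 3, groups=4)  # 4 groups -> 4x fewer FLOPs

print(dense, grouped, 1 - grouped / dense)  # grouped costs 1/4 of dense
```

The same accounting explains why LoR-style modules are cheap overall: the expensive dense extraction (CE) is shrunk, and the remaining feature maps are regenerated by low-cost operations (RO).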