Keywords
Computer science, MNIST database, Pruning, Convolutional neural network, Artificial intelligence, Pattern recognition, Filter (signal processing), Artificial neural network, Benchmark, Image classification, Image, Computer vision
Authors
Liu Yang,Shiqiao Gu,Chenyang Shen,Xi-Le Zhao,Qinghua Hu
Identifier
DOI: 10.1109/tcsvt.2023.3277689
Abstract
Filter pruning is one of the most popular approaches for compressing convolutional neural networks (CNNs). The most critical task in pruning is to evaluate the importance of each convolutional filter, so that the less important filters can be removed while the overall model performance is minimally affected. In each layer, some filters may be linearly dependent on each other, which means they carry replaceable information. Such redundant information can be removed without significantly affecting information richness or model performance. In this paper, we propose a novel low-rank guided pruning scheme that obtains skeleton neural networks by alternately training and pruning CNNs. In each step, training is performed with nuclear-norm regularization to induce low-rankness in the filters of each layer, followed by filter pruning that maintains information richness via a maximally linearly independent subset of filters. A novel "smaller-norm-and-linearly-dependent-less-important" pruning criterion is proposed to compress the model. The training and pruning processes can be repeated until the model is fully trained. To investigate the performance, we applied the proposed joint training and pruning scheme to train CNNs for image classification on three benchmark datasets: MNIST, CIFAR-10, and ILSVRC-2012. The proposed method achieved a higher pruning rate and better classification performance than state-of-the-art compression methods.
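The two ingredients of the criterion described above can be sketched in plain NumPy: a nuclear-norm penalty on the flattened filter matrix, and a greedy selection that keeps larger-norm filters only while they remain linearly independent of those already kept. This is a minimal illustration under simplifying assumptions (filters flattened to row vectors, rank tested numerically), not the authors' implementation; the function names are ours.

```python
import numpy as np

def nuclear_norm(W):
    # Nuclear norm = sum of singular values of the flattened
    # filter matrix; used as a regularizer to push filters
    # toward a low-rank (linearly dependent) configuration.
    return np.linalg.svd(W, compute_uv=False).sum()

def select_filters(W, tol=1e-6):
    # Greedy sketch of the "smaller-norm-and-linearly-dependent-
    # less-important" criterion: visit filters in decreasing norm
    # order and keep a filter only if it raises the rank of the
    # kept set, i.e. it is linearly independent of the others.
    norms = np.linalg.norm(W, axis=1)
    order = np.argsort(-norms)          # large-norm filters first
    kept = []
    for i in order:
        candidate = W[kept + [int(i)]]
        if np.linalg.matrix_rank(candidate, tol=tol) == len(kept) + 1:
            kept.append(int(i))
    return sorted(kept)                 # indices of surviving filters
```

For example, with four filters where one row is the sum of two others, only three survive: the pruned filter is one that is both linearly dependent on the kept set and no larger in norm, mirroring the stated criterion.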