Authors
Bo Li, Zongzeng Li, Feng Li
Identifier
DOI:10.1109/ntci60157.2023.10403747
Abstract
Convolutional neural networks (CNNs) are increasingly recognized as computation- and memory-intensive models, which hinders their deployment on embedded systems. Attempts have been made to compress CNNs using pruning, especially structured pruning, which can yield a compact network without affecting the original accuracy. Group regularization methods are commonly used to achieve structured pruning. However, existing group regularization methods typically use a fixed regularization factor to compress weights toward zero, which rests on the unreasonable presupposition that all weights are equally important. To address this issue, we propose a group regularization method with a dynamic regularization factor, named DRFGR, which combines regularization-based and importance-based structured pruning. It assigns a distinct, updatable regularization factor to each filter based on the scaling factors of the Batch Normalization (BN) layer, which represent the importance of the filters. The effectiveness of DRFGR has been verified by empirical analysis on the CIFAR-10 dataset with popular CNNs. The results show that DRFGR compresses the network more thoroughly than methods with a fixed regularization factor, while also improving accuracy.
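
To make the core idea concrete, here is a minimal PyTorch sketch of per-filter group regularization driven by BN scaling factors, in the spirit of what the abstract describes. The specific names (dynamic_group_regularization, base_lambda) and the monotone mapping from gamma to a regularization factor (exp(-|gamma|)) are illustrative assumptions, not the authors' exact formulation; the paper should be consulted for the actual update rule.

import torch
import torch.nn as nn

def dynamic_group_regularization(conv: nn.Conv2d, bn: nn.BatchNorm2d,
                                 base_lambda: float = 1e-4) -> torch.Tensor:
    """Group-lasso penalty on conv filters with per-filter factors.

    Filters whose BN scaling factor |gamma| is small are treated as less
    important and receive a larger factor, so they are pushed toward zero
    more strongly. The exp(-|gamma|) mapping is an assumed stand-in for
    the paper's factor-update scheme.
    """
    gamma = bn.weight.detach().abs()               # importance proxy per filter
    factors = base_lambda * torch.exp(-gamma)      # small gamma -> larger factor
    # One group per output filter: L2 norm over that filter's weights.
    group_norms = conv.weight.flatten(1).norm(p=2, dim=1)
    return (factors * group_norms).sum()

# Usage: add the penalty to the task loss at each training step.
conv = nn.Conv2d(16, 32, kernel_size=3, padding=1)
bn = nn.BatchNorm2d(32)
x = torch.randn(8, 16, 32, 32)
out = bn(conv(x))
loss = out.pow(2).mean() + dynamic_group_regularization(conv, bn)
loss.backward()

Because the factors are recomputed from the current gamma values at every step, they track the filters' learned importance as training proceeds, which is one plausible reading of "updatable" in the abstract. Detaching gamma is a design choice in this sketch: the penalty then shrinks only the convolution weights and does not directly push the importance scores themselves toward zero.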