A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations

Authors
Hongrong Cheng, Miao Zhang, Qinfeng Shi
Source
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE)
Volume/Issue: 46(12): 10558-10578 · Cited by: 217
Identifier
DOI: 10.1109/TPAMI.2024.3447085
Abstract

Modern deep neural networks, particularly recent large language models, have massive model sizes that demand significant computational and storage resources. To enable the deployment of modern models in resource-constrained environments and to accelerate inference, researchers have increasingly explored pruning techniques as a popular research direction in neural network compression. More than three thousand pruning papers were published from 2020 to 2024; however, up-to-date comprehensive review papers on pruning remain scarce. To address this gap, this survey provides a comprehensive review of existing research on deep neural network pruning, organized in a taxonomy of 1) universal/specific speedup, 2) when to prune, 3) how to prune, and 4) fusion of pruning with other compression techniques. We then provide a thorough comparative analysis of eight pairs of contrast settings for pruning (e.g., unstructured/structured, one-shot/iterative, data-free/data-driven, initialized/pre-trained weights) and explore several emerging topics, including pruning for large language models, vision transformers, diffusion models, and large multimodal models, post-training pruning, and different levels of supervision for pruning, to shed light on the commonalities and differences of existing methods and to lay the foundation for further method development. Finally, we offer recommendations on selecting pruning methods and discuss several promising research directions for neural network pruning. To facilitate future research on deep neural network pruning, we summarize broad pruning applications (e.g., adversarial robustness, natural language understanding) and build a curated collection of datasets, networks, and evaluations for different applications. We maintain a repository at https://github.com/hrcheng1066/awesome-pruning that serves as a comprehensive resource for neural network pruning papers and corresponding open-source code, and we will keep updating it to include the latest advancements in the field.
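To make the abstract's contrast settings concrete, the sketch below illustrates one of the simplest instances of the taxonomy: unstructured, data-free, one-shot magnitude pruning, which zeroes the smallest-magnitude weights of a layer until a target sparsity is reached. This is a generic, minimal NumPy illustration of the technique, not an implementation from the survey; the function name `magnitude_prune` and the threshold rule are our own assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Unstructured one-shot magnitude pruning (illustrative sketch).

    Zeroes the `sparsity` fraction of weights with the smallest
    absolute values and returns the pruned weights and the boolean
    keep-mask.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold     # keep strictly larger magnitudes
    return weights * mask, mask

# Example: prune half the weights of a random 4x4 layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned, mask = magnitude_prune(w, 0.5)
```

In an iterative variant (the other side of the one-shot/iterative contrast), this step would be interleaved with retraining, raising the sparsity gradually instead of in a single pass.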