Automatic Sparse Connectivity Learning for Neural Networks

Keywords: computer science, artificial neural networks, pruning, maxima and minima, algorithms, binary numbers, artificial intelligence, hyperparameters, pattern recognition, machine learning, mathematics
Authors
Zhimin Tang, Linkai Luo, Bike Xie, Yiyu Zhu, Rujie Zhao, Lvqing Bi, Chao Lü
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34(10): 7350-7364; Cited by: 46
Identifier
DOI: 10.1109/tnnls.2022.3141665
Abstract

Since sparse neural networks usually contain many zero weights, these unnecessary network connections can potentially be eliminated without degrading network performance. Well-designed sparse neural networks therefore have the potential to significantly reduce the number of floating-point operations (FLOPs) and the computational resources required. In this work, we propose a new automatic pruning method, sparse connectivity learning (SCL). Specifically, each weight is reparameterized as an elementwise multiplication of a trainable weight variable and a binary mask. Network connectivity is thus fully described by the binary mask, which is produced by applying a unit step function to a continuous mask variable. We theoretically prove the fundamental principle of using a straight-through estimator (STE) for network pruning: the proxy gradients of the STE must be positive, which ensures that the mask variables converge at their minima. After finding that Leaky ReLU, Softplus, and identity STEs all satisfy this principle, we adopt the identity STE in SCL for discrete mask relaxation. We also observe that mask gradients across different features are highly unbalanced, so we normalize the mask gradients of each feature to stabilize mask-variable training. To train sparse masks automatically, we include the total number of network connections as a regularization term in the objective function. Because SCL requires no designer-specified pruning criteria or per-layer hyperparameters, the network is explored in a larger hypothesis space to achieve optimized sparse connectivity for the best performance. SCL thereby overcomes the limitations of existing automatic pruning methods. Experimental results demonstrate that SCL can automatically learn and select important network connections for various baseline network structures, and that deep learning models trained with SCL outperform state-of-the-art human-designed and automatic pruning methods in sparsity, accuracy, and FLOPs reduction.
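The core mechanics described in the abstract (weight-mask reparameterization, identity-STE gradient flow, per-feature mask-gradient normalization, and a connection-count regularizer) can be sketched as follows. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the function names, the per-row interpretation of "feature," and the way the regularizer enters the mask gradient are illustrative choices.

```python
import numpy as np

def unit_step(s):
    # Binary mask: 1 where the mask variable is non-negative, else 0.
    return (s >= 0).astype(s.dtype)

def forward(w, s):
    # Reparameterize each weight as w * m, where m = step(s),
    # so connectivity is fully described by the binary mask m.
    m = unit_step(s)
    return w * m, m

def backward_identity_ste(grad_w_eff, w, s, lam=0.0):
    # Identity STE: pretend d(step)/ds = 1, so the upstream gradient
    # flows through the step function unchanged.
    m = unit_step(s)
    grad_w = grad_w_eff * m            # gradient w.r.t. the trainable weight
    grad_s = grad_w_eff * w            # chain rule through m via identity STE
    # Normalize mask gradients per feature (here: per row of a 2-D weight
    # matrix) to balance updates across features, as the abstract suggests.
    norm = np.linalg.norm(grad_s, axis=1, keepdims=True) + 1e-12
    grad_s = grad_s / norm
    # Connection-count regularizer lam * sum(m): under the identity STE
    # its gradient w.r.t. each mask variable is simply lam.
    grad_s = grad_s + lam
    return grad_w, grad_s
```

In a full training loop, `s` would be updated by gradient descent alongside `w`; entries of `s` driven below zero switch their connections off, and the regularization strength `lam` trades accuracy against sparsity.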