An Automatic Neural Network Architecture-and-Quantization Joint Optimization Framework for Efficient Model Inference

Keywords: Quantization (signal processing), Computer science, Artificial neural network, Inference, Artificial intelligence, Machine learning, Algorithm
Authors
Lian Liu, Ying Wang, Xiandong Zhao, Weiwei Chen, Huawei Li, Xiaowei Li, Yinhe Han
Source
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 43 (5): 1497-1510
Identifier
DOI: 10.1109/tcad.2023.3339438
Abstract

Efficient deep learning models, especially those optimized for edge devices, benefit from both low inference latency and low energy consumption. Two classical techniques for efficient model inference are lightweight neural architecture search (NAS), which automatically designs compact network models, and quantization, which reduces the bit-precision of neural network models. Consequently, joint design of the neural architecture and the quantization precision settings is becoming increasingly popular. Three main aspects affect the performance of this joint optimization: quantization precision selection (QPS), quantization-aware training (QAT), and neural architecture search (NAS). However, existing works address at most two of these aspects and therefore achieve suboptimal performance. To this end, we propose a novel automatic optimization framework, DAQU, that jointly searches for Pareto-optimal combinations of neural architecture and quantization precision among more than 10^47 quantized subnet models. To overcome the instability of conventional automatic optimization frameworks, DAQU incorporates a warm-up strategy to reduce the accuracy gap among different neural architectures and a precision-transfer training approach to maintain flexibility across different quantization precision settings. Our experiments show that the quantized lightweight neural networks generated by DAQU consistently outperform state-of-the-art NAS and quantization joint-optimization methods.
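The abstract's key quantitative claim is the size of the joint architecture-and-precision search space (more than 10^47 quantized subnets). As a rough illustration only, the Python sketch below shows how such a space grows combinatorially when each layer independently picks an architectural variant and a bit-width, and how a Pareto front over (accuracy, cost) pairs is selected. All numbers, function names, and the toy candidates are hypothetical assumptions for illustration; they are not DAQU's actual search space, cost model, or algorithm.

```python
# Illustrative sketch only: hypothetical layer counts, block variants, and
# bit-widths; not DAQU's actual configuration or algorithm.

ARCH_CHOICES_PER_LAYER = 6          # assumed number of block variants per layer
BITWIDTH_CHOICES = [2, 4, 6, 8]     # assumed candidate precisions (bits)
NUM_LAYERS = 20                     # assumed supernet depth


def search_space_size(num_layers: int, arch_choices: int, num_bitwidths: int) -> int:
    """Each layer independently picks one architecture variant and one
    precision, so the joint space is (arch_choices * num_bitwidths) ** layers."""
    return (arch_choices * num_bitwidths) ** num_layers


def pareto_front(candidates):
    """Keep (accuracy, cost) points not dominated by any other candidate,
    where higher accuracy and lower cost are both preferred."""
    front = []
    for acc, cost in candidates:
        dominated = any(
            a >= acc and c <= cost and (a, c) != (acc, cost)
            for a, c in candidates
        )
        if not dominated:
            front.append((acc, cost))
    return sorted(front, key=lambda p: p[1])


if __name__ == "__main__":
    size = search_space_size(NUM_LAYERS, ARCH_CHOICES_PER_LAYER, len(BITWIDTH_CHOICES))
    print(f"joint architecture/precision combinations: ~10^{len(str(size)) - 1}")

    # Toy (top-1 accuracy %, normalized bit-operation cost) pairs standing in
    # for evaluated quantized subnets.
    evaluated = [(70.1, 1.0), (71.5, 1.4), (69.8, 1.2), (72.0, 2.0), (71.0, 1.1)]
    print("Pareto-optimal candidates:", pareto_front(evaluated))
```

With the assumed 6 architectural variants and 4 bit-widths per layer, a 20-layer supernet already yields roughly 10^27 combinations; deeper supernets and richer per-layer choices push the count past the 10^47 scale cited in the abstract, which is why an automated search such as DAQU is needed rather than exhaustive evaluation.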