Computer science
Deep learning
Software deployment
Artificial intelligence
Machine learning
Field (mathematics)
Data science
Latency (audio)
Software engineering
Telecommunications
Mathematics
Pure mathematics
Source
Journal: ACM Computing Surveys
[Association for Computing Machinery]
Date: 2023-01-20
Volume/Issue: 55 (12): 1-37
Citations: 235
Abstract
Deep learning has revolutionized the fields of computer vision, natural language understanding, speech recognition, information retrieval, and more. However, with the progressive improvements in deep learning models, their number of parameters, latency, and resources required to train, among others, have all increased significantly. Consequently, it has become important to pay attention to these footprint metrics of a model as well, not just its quality. We present and motivate the problem of efficiency in deep learning, followed by a thorough survey of the five core areas of model efficiency (spanning modeling techniques, infrastructure, and hardware) and the seminal work there. We also present an experiment-based guide along with code for practitioners to optimize their model training and deployment. We believe this is the first comprehensive survey in the efficient deep learning space that covers the landscape of model efficiency from modeling techniques to hardware support. It is our hope that this survey will provide readers with the mental model and the necessary understanding of the field to apply generic efficiency techniques to immediately get significant improvements, and also equip them with ideas for further research and experimentation to achieve additional gains.
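One of the generic efficiency techniques such surveys typically cover is weight quantization, which trades a small amount of numerical precision for a large reduction in model size and memory traffic. As a minimal illustration (not the paper's code), the sketch below shows symmetric per-tensor int8 quantization in plain Python: weights are scaled into the range [-127, 127], stored as integers (4x smaller than float32), and dequantized on the fly, with a reconstruction error bounded by half the quantization step.

```python
import random


def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization.

    Maps floats into [-127, 127] using a single scale factor
    derived from the largest absolute weight.
    """
    scale = max(abs(x) for x in weights) / 127.0
    quantized = [max(-127, min(127, round(x / scale))) for x in weights]
    return quantized, scale


def dequantize(quantized, scale):
    """Recover approximate float weights from int8 codes."""
    return [q * scale for q in quantized]


# Simulate a float32 weight tensor (hypothetical data for illustration).
random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(1000)]

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# Reconstruction error is at most scale / 2 per weight,
# while int8 storage is 4x smaller than float32.
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
```

In practice, frameworks apply this idea per-channel and fuse the scale into adjacent operations; this sketch only conveys the size/accuracy trade-off the survey motivates.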