Authors
Jiahua Huang, Weiwei Lin, Wentai Wu, Yang Wang, Haocheng Zhong, Xinhua Wang, Keqin Li
Abstract
The effective and efficient utilization of AI accelerators is a critical issue for practitioners in the field of deep learning. Practical evidence from companies such as Alibaba, SenseTime, and Microsoft shows that production GPU clusters in industry typically run at only 25% to 50% utilization, indicating significant room for improvement. To this end, AI accelerator resource sharing has emerged as a promising approach to optimizing the performance of multi-tenant clusters. This survey covers this line of studies from 2016 to 2024, focusing primarily on system efficiency while also discussing fairness, interference, and security in AI accelerator sharing. We revisit the fundamentals and key concepts, followed by a comprehensive review of recent advances in the field. We find that over 70% of the studies focus on efficiency improvement. We also observe that approximately half of the reviewed studies have made their source code publicly available, and that fewer than one-third conducted their experiments without a physical machine. Finally, based on the limitations of existing research, we outline several directions for future work concerning the integration of sharing with large language models (LLMs), coordination between schedulers and application-layer metrics, and collaboration among heterogeneous accelerators.
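To make the idea of accelerator sharing concrete, the following is a minimal, self-contained Python sketch (not taken from the survey itself) that contrasts exclusive GPU allocation with an idealized time-sliced sharing model across two hypothetical tenants. The job parameters and the simple overlap model are illustrative assumptions only; they merely show why interleaving one tenant's GPU-idle phases with another tenant's GPU-busy phases can raise utilization well above the 25%-50% range reported for exclusively allocated devices.

```python
# Illustrative sketch (assumptions, not the survey's method): each tenant's job
# alternates between a GPU-busy phase and a GPU-idle phase (e.g., data loading).
# Exclusive allocation leaves the device idle during those phases, while an
# idealized time-sliced scheduler overlaps one tenant's idle time with another
# tenant's busy time.

from dataclasses import dataclass


@dataclass
class Job:
    name: str
    busy_ms: int   # GPU-busy time per iteration
    idle_ms: int   # GPU-idle time per iteration (I/O, preprocessing)
    iters: int


def exclusive_utilization(jobs):
    """Each job owns the whole GPU; its idle phases waste the device."""
    busy = sum(j.busy_ms * j.iters for j in jobs)
    total = sum((j.busy_ms + j.idle_ms) * j.iters for j in jobs)
    return busy / total


def shared_utilization(jobs):
    """Idealized time-sliced sharing: the GPU runs whenever any job is busy,
    so the makespan is bounded below by the total busy time and by the
    longest single job's span."""
    busy = sum(j.busy_ms * j.iters for j in jobs)
    longest = max((j.busy_ms + j.idle_ms) * j.iters for j in jobs)
    makespan = max(busy, longest)
    return busy / makespan


if __name__ == "__main__":
    # Hypothetical tenants; numbers chosen only for illustration.
    tenants = [
        Job("tenant-A", busy_ms=6, idle_ms=14, iters=1000),
        Job("tenant-B", busy_ms=8, idle_ms=12, iters=1000),
    ]
    print(f"exclusive: {exclusive_utilization(tenants):.0%}")  # ~35%
    print(f"shared   : {shared_utilization(tenants):.0%}")     # ~70%
```

Real systems reviewed in the survey must additionally handle interference, fairness, and isolation between co-located tenants, which this toy model deliberately ignores.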