Authors
Colin White, Mahmoud Safari, Rhea Sanjay Sukthanker, Binxin Ru, Thomas Elsken, Arber Zela, Debadeepta Dey, Frank Hutter
Source
Venue: arXiv (Cornell University)
Date: 2023-01-01
Cited by: 40
Identifier
DOI: 10.48550/arxiv.2301.08727
Abstract
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of areas, including computer vision, natural language understanding, speech recognition, and reinforcement learning. Specialized, high-performing neural architectures are crucial to the success of deep learning in these areas. Neural architecture search (NAS), the process of automating the design of neural architectures for a given task, is an inevitable next step in automating machine learning and has already outpaced the best human-designed architectures on many tasks. In the past few years, research in NAS has been progressing rapidly, with over 1000 papers released since 2020 (Deng and Lindauer, 2021). In this survey, we provide an organized and comprehensive guide to neural architecture search. We give a taxonomy of search spaces, algorithms, and speedup techniques, and we discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
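The abstract defines NAS as automating the design of a neural architecture for a given task. A minimal sketch of the simplest NAS baseline, random search over a toy search space, can make this concrete. All names, the search space, and the scoring function below are hypothetical illustrations, not the survey's method; a real NAS run would train each candidate network (or use a speedup technique such as weight sharing or a performance predictor) instead of the stand-in `evaluate` used here.

```python
import random

# Toy search space: each architecture is a (depth, width, activation) tuple.
# Real NAS search spaces (cell-based, macro, hierarchical) are far richer.
DEPTHS = [2, 4, 8]
WIDTHS = [64, 128, 256]
ACTIVATIONS = ["relu", "gelu", "swish"]

def sample_architecture(rng):
    """Draw one architecture uniformly at random from the search space."""
    return (rng.choice(DEPTHS), rng.choice(WIDTHS), rng.choice(ACTIVATIONS))

def evaluate(arch):
    """Stand-in for training plus validation: returns a synthetic score.
    In practice this is the expensive step that NAS speedup techniques
    (weight sharing, zero-cost proxies, predictors) try to accelerate."""
    depth, width, act = arch
    bonus = {"relu": 0.00, "gelu": 0.01, "swish": 0.02}[act]
    return 0.5 + 0.02 * depth + 0.0001 * width + bonus

def random_search(n_trials, seed=0):
    """Random search: sample architectures, evaluate each, keep the best.
    This is the standard baseline that NAS algorithms must beat."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search(n_trials=20)
print(best, round(score, 3))
```

More sophisticated NAS algorithms surveyed in the paper (evolutionary search, reinforcement learning, differentiable methods) replace the uniform sampling step with a learned or population-based proposal distribution, but keep the same sample-evaluate-select loop.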