Ranking (information retrieval)
Margin (machine learning)
Consistency (knowledge base)
Computer science
Artificial intelligence
Distillation
Machine learning
Shot (pellet)
Artificial neural network
Data mining
Chemistry
Organic chemistry
Authors
Peijie Dong,Xiamu Niu,Lujun Li,Zhilong Tian,Xiaodong Wang,Zimian Wei,Hengyue Pan,Dongsheng Li
Identifier
DOI: 10.1109/icassp49357.2023.10094709
Abstract
Neural architecture search (NAS) has made tremendous progress in the automatic design of effective neural network structures but suffers from a heavy computational burden. One-shot NAS significantly alleviates this burden through weight sharing, improving computational efficiency. Zero-shot NAS further reduces the cost by predicting a network's performance from its initial state, requiring no training at all. Both methods aim to distinguish "good" architectures from "bad" ones, i.e., to achieve ranking consistency between predicted and true performance. In this paper, we propose Ranking Distillation one-shot NAS (RD-NAS), which enhances ranking consistency by using zero-cost proxies as a cheap teacher and adopting the margin ranking loss to distill ranking knowledge. Specifically, we propose a margin subnet sampler that distills ranking knowledge from zero-shot NAS to one-shot NAS by introducing group distance as the margin. Our evaluation on NAS-Bench-201 and a ResNet-based search space demonstrates that RD-NAS achieves improvements of 10.7% and 9.65% in ranking ability, respectively. Our code is available at https://github.com/pprp/CVPR2022-NAS-competition-Track1-3th-solution.
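To make the ranking-distillation idea concrete, below is a minimal PyTorch sketch of distilling pairwise ranking knowledge from zero-cost proxy scores (the cheap teacher) into a student's predicted subnet scores via a margin ranking loss. All names here (margin_ranking_distill, student_scores, teacher_scores) are illustrative, and the fixed scalar margin is a simplification of the paper's group-distance margin; this is not the RD-NAS implementation.

# Hypothetical sketch of ranking distillation with a margin ranking loss.
# Assumption: zero-cost proxy scores act as the "cheap teacher"; names are
# illustrative and do not come from the RD-NAS codebase.
import torch

def margin_ranking_distill(student_scores, teacher_scores, margin=0.1):
    # student_scores: student-predicted performance of N sampled subnets, shape (N,)
    # teacher_scores: zero-cost proxy scores for the same subnets, shape (N,)
    n = len(student_scores)
    # Enumerate all unordered pairs (i, j) of subnets, i < j.
    i, j = torch.triu_indices(n, n, offset=1)
    # The teacher decides which subnet of each pair should rank higher
    # (+1 or -1; ties would give 0, which is unlikely with real-valued proxies).
    target = torch.sign(teacher_scores[i] - teacher_scores[j])
    # Penalize the student when its ordering of a pair disagrees with the
    # teacher's by more than the margin: max(0, -target*(s_i - s_j) + margin).
    loss_fn = torch.nn.MarginRankingLoss(margin=margin)
    return loss_fn(student_scores[i], student_scores[j], target)

# Example usage with random scores for 8 sampled subnets.
student = torch.randn(8, requires_grad=True)
teacher = torch.randn(8)
loss = margin_ranking_distill(student, teacher)
loss.backward()

In the paper, the margin is not a fixed constant but is derived from the group distance produced by the margin subnet sampler; the sketch above fixes it at 0.1 only to keep the example self-contained.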