Title Contrastive Neural Architecture Search with Neural Architecture Comparators
Authors Chen, Yaofo
Guo, Yong
Chen, Qi
Li, Minli
Zeng, Wei
Wang, Yaowei
Tan, Mingkui
Affiliation South China Univ Technol, Guangzhou, Peoples R China
Peng Cheng Lab, Shenzhen, Peoples R China
Peking Univ, Beijing, Peoples R China
Minist Educ, Key Lab Big Data & Intelligent Robot, Beijing, Peoples R China
Issue Date 2021
Publisher 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021)
Abstract One of the key steps in Neural Architecture Search (NAS) is to estimate the performance of candidate architectures. Existing methods either directly use the validation performance or learn a predictor to estimate the performance. However, these methods can be either computationally expensive or very inaccurate, which may severely affect the search efficiency and performance. Moreover, since it is very difficult to annotate architectures with accurate performance on specific tasks, learning a promising performance predictor is often non-trivial due to the lack of labeled data. In this paper, we argue that it may not be necessary to estimate the absolute performance for NAS. Instead, we may need only to understand whether an architecture is better than a baseline one. However, how to exploit this comparison information as the reward and how to make good use of the limited labeled data remain two great challenges. In this paper, we propose a novel Contrastive Neural Architecture Search (CTNAS) method which performs architecture search by taking the comparison results between architectures as the reward. Specifically, we design and learn a Neural Architecture Comparator (NAC) to compute the probability of a candidate architecture being better than a baseline one. Moreover, we present a baseline updating scheme to improve the baseline iteratively in a curriculum learning manner. More critically, we theoretically show that learning NAC is equivalent to optimizing the ranking over architectures. Extensive experiments in three search spaces demonstrate the superiority of our CTNAS over existing methods.
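The core idea in the abstract — scoring a candidate by the probability that it beats a baseline rather than by its absolute performance — can be illustrated with a minimal sketch. This is not the paper's NAC (which operates on architecture graphs); it is a toy pairwise logistic model over hypothetical hand-picked architecture features (e.g. depth, width), trained on toy comparison labels, purely to show how a comparison probability can serve as a reward signal.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def score(arch, w):
    # arch: hypothetical feature vector for an architecture (toy stand-in
    # for the graph encoding used in the actual NAC).
    return sum(wi * xi for wi, xi in zip(w, arch))

def nac_prob(cand, base, w):
    """Probability that the candidate beats the baseline.
    A pairwise (Bradley-Terry-style) logistic model: the comparison
    probability depends only on the score difference, and can be used
    directly as the search reward."""
    return sigmoid(score(cand, w) - score(base, w))

def train(pairs, labels, dim=2, lr=0.1, epochs=200):
    # pairs: list of (candidate, baseline) feature tuples;
    # labels: 1 if candidate is known to be better, else 0.
    w = [0.0] * dim
    for _ in range(epochs):
        for (cand, base), y in zip(pairs, labels):
            p = nac_prob(cand, base, w)
            g = p - y  # gradient of the logistic loss w.r.t. the score gap
            for i in range(dim):
                w[i] -= lr * g * (cand[i] - base[i])
    return w

# Toy comparison data: the first feature alone decides which is better.
pairs = [((2.0, 0.0), (1.0, 0.0)), ((0.5, 0.0), (1.5, 0.0))]
labels = [1, 0]
w = train(pairs, labels)
p = nac_prob((3.0, 0.0), (1.0, 0.0), w)  # reward for a strong candidate
```

Note that only relative labels ("A beats B") are needed to fit the model, which is the reason comparison data is cheaper to exploit than accurate absolute performance annotations; the paper's baseline-updating scheme then raises the bar that candidates must beat as the search progresses.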
URI http://hdl.handle.net/20.500.11897/636857
ISBN 978-1-6654-4509-2
ISSN 1063-6919
DOI 10.1109/CVPR46437.2021.00938
Indexed EI
CPCI-S(ISTP)
Appears in Collections: Unclaimed

Files in This Work
There are no files associated with this item.

Citation counts: Web of Science® 0 · Scopus® — · 百度学术™ 0 · Google Scholar™ —

License: See PKU IR operational policies.