EPNAS: Efficient Progressive Neural Architecture Search

Yanqi Zhou (Google), Peng Wang (Baidu USA LLC.)

Abstract
In this paper, we propose Efficient Progressive Neural Architecture Search (EPNAS), a neural architecture search (NAS) framework that efficiently handles a large search space through a novel progressive search policy with performance prediction based on REINFORCE. EPNAS is designed to search for target networks in parallel, making it more scalable on parallel platforms. More importantly, EPNAS generalizes to architecture search under multiple resource constraints, e.g., model size, compute complexity, or compute intensity, which is crucial for deployment across diverse platforms such as mobile and cloud. We compare EPNAS against other state-of-the-art (SOTA) network architectures (e.g., MobileNetV2) and efficient NAS algorithms (e.g., ENAS and PNAS) on image recognition tasks using CIFAR10 and ImageNet. On both datasets, EPNAS is superior with respect to both architecture search speed and recognition accuracy.
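To make the REINFORCE-based search loop concrete, below is a minimal sketch of a policy-gradient controller that samples discrete architecture choices and shapes the reward with a resource penalty. This is not the authors' implementation: the operation set, the layer count, the accuracy/size stand-in functions, and the moving-average baseline are all illustrative assumptions.

```python
# Minimal REINFORCE sketch for resource-constrained architecture search.
# All names below (CHOICES, NUM_LAYERS, the reward shaping) are hypothetical
# placeholders, not values from the EPNAS paper.
import numpy as np

rng = np.random.default_rng(0)

NUM_LAYERS = 4                                           # decisions per architecture (assumed)
CHOICES = ["conv3x3", "conv5x5", "maxpool", "sep_conv"]  # assumed op set

# One softmax distribution per layer decision: a very simple "controller".
logits = np.zeros((NUM_LAYERS, len(CHOICES)))

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def reward(arch):
    """Stand-in for training/evaluating a child network, then penalizing
    resource usage. Both terms are fake functions of the sampled ops."""
    acc = 0.5 + 0.1 * np.mean(arch == 3)      # pretend sep_conv helps accuracy
    size_penalty = 0.05 * np.mean(arch == 1)  # pretend conv5x5 costs model size
    return acc - size_penalty

baseline, lr = 0.0, 0.1
for step in range(200):
    probs = softmax(logits)
    # Sample one op per layer from the controller's current policy.
    arch = np.array([rng.choice(len(CHOICES), p=p) for p in probs])
    r = reward(arch)
    baseline = 0.9 * baseline + 0.1 * r       # moving-average variance reduction
    advantage = r - baseline
    # REINFORCE gradient of sum_l log p(arch_l) w.r.t. logits:
    # one-hot(sampled op) minus the softmax probabilities.
    grad = -probs
    grad[np.arange(NUM_LAYERS), arch] += 1.0
    logits += lr * advantage * grad           # gradient ascent on expected reward

best = [CHOICES[i] for i in np.argmax(logits, axis=1)]
print("most likely architecture:", best)
```

In a real NAS system the `reward` call would train and evaluate a sampled child network (the expensive step that performance prediction is meant to amortize), and the penalty terms would be measured model size, FLOPs, or compute intensity rather than toy proxies.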

DOI
10.5244/C.33.92
https://dx.doi.org/10.5244/C.33.92

Files
Paper (PDF)
Supplementary material (PDF)

BibTeX
@inproceedings{BMVC2019,
  title={EPNAS: Efficient Progressive Neural Architecture Search},
  author={Yanqi Zhou and Peng Wang},
  year={2019},
  month={September},
  pages={92.1--92.13},
  articleno={92},
  numpages={13},
  booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
  publisher={BMVA Press},
  editor={Kirill Sidorov and Yulia Hicks},
  doi={10.5244/C.33.92},
  url={https://dx.doi.org/10.5244/C.33.92}
}