
Stabilizing DARTS with Amended Gradient Estimation on Architectural Parameters

Authors:
Bi, Kaifeng
Hu, Changping
Xie, Lingxi
Chen, Xin
Wei, Longhui
Tian, Qi
Publication Year:
2019

Abstract

DARTS is a popular algorithm for neural architecture search (NAS). Despite its great advantage in search efficiency, DARTS often suffers from weak stability, which is reflected in the large variation among individual trials as well as the sensitivity to the hyper-parameters of the search process. This paper attributes such instability to an optimization gap between the super-network and its sub-networks: improving the validation accuracy of the super-network does not necessarily lead to higher expected performance of the sampled sub-networks. We then point out that the gap is due to the inaccurate estimation of the architectural gradients, based on which we propose an amended estimation method. Mathematically, our method guarantees a bounded error from the true gradients, while the original estimation does not. Our approach bridges the gap from two aspects: amending the estimation of the architectural gradients, and unifying the hyper-parameter settings in the search and re-training stages. Experiments on CIFAR10 and ImageNet demonstrate that our approach largely improves search stability and, more importantly, enables DARTS-based approaches to explore much larger search spaces that have not been investigated before.

Comment: 22 pages, 12 figures, submitted to ICML 2020, updated experiments on Penn Treebank
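For context, the architectural-gradient estimation at issue is the one used in the original DARTS formulation (Liu et al., 2019). The following is a sketch of that baseline, not of the amended estimator proposed in this paper. DARTS solves the bilevel problem

\[
\min_{\alpha} \; \mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha), \alpha\big)
\quad \text{s.t.} \quad
w^{*}(\alpha) = \arg\min_{w} \; \mathcal{L}_{\mathrm{train}}(w, \alpha),
\]

and approximates the architectural gradient by unrolling a single training step with learning rate \(\xi\):

\[
\nabla_{\alpha} \mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha), \alpha\big)
\;\approx\;
\nabla_{\alpha} \mathcal{L}_{\mathrm{val}}\big(w - \xi \, \nabla_{w} \mathcal{L}_{\mathrm{train}}(w, \alpha), \; \alpha\big).
\]

The paper's claim is that this one-step approximation leaves an unbounded error relative to the true architectural gradient, whereas the amended estimator keeps that error bounded.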

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1910.11831
Document Type:
Working Paper