
EPC-DARTS: Efficient partial channel connection for differentiable architecture search.

Authors :
Cai, Zicheng
Chen, Lei
Liu, Hai-Lin
Source :
Neural Networks. Sep 2023, Vol. 166, p344-353. 10p.
Publication Year :
2023

Abstract

With its weight-sharing and continuous-relaxation strategies, differentiable architecture search (DARTS) offers a fast and effective way to perform neural network architecture search across a variety of deep learning tasks. However, unresolved issues still trouble researchers and practitioners: memory utilization is inefficient, and randomly selecting channels makes the searched architecture unstable, sometimes even causing performance collapse. In this paper, a novel efficient channel attention mechanism based on partial channel connection for differentiable neural architecture search, termed EPC-DARTS, is proposed to address these two issues. Specifically, we design an efficient channel attention module that captures cross-channel interactions and assigns weights according to channel importance, dramatically improving search efficiency and reducing memory occupation. Moreover, only the partial channels with higher attention weights participate in the mixed operation, so the unstable network architectures produced by random channel selection are avoided in the proposed EPC-DARTS. Experimental results show that EPC-DARTS achieves remarkably competitive performance (test accuracy of 97.60% on CIFAR-10 and 84.02% on CIFAR-100) compared with other state-of-the-art NAS methods, using only 0.2 GPU-days. [ABSTRACT FROM AUTHOR]
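The abstract's core idea — score channels with a lightweight cross-channel attention module, then feed only the highest-weighted fraction into the mixed operation — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the ECA-style 1-D convolution, the kernel size `k`, the selection `ratio`, and the function names are all illustrative assumptions, and the convolution kernel here is untrained random noise rather than a learned parameter.

```python
import numpy as np

def eca_channel_weights(feature_map, k=3, rng=None):
    """Sketch of an ECA-style attention: one weight per channel in (0, 1).

    feature_map: array of shape (C, H, W).
    k: kernel size of the 1-D cross-channel convolution (odd, assumed).
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Global average pooling over spatial dims -> one descriptor per channel.
    pooled = feature_map.mean(axis=(1, 2))            # shape (C,)
    # Cross-channel interaction via a 1-D convolution over the channel axis.
    # In a real module this kernel would be learned; here it is random.
    kernel = rng.standard_normal(k)
    padded = np.pad(pooled, k // 2, mode="edge")
    conv = np.convolve(padded, kernel, mode="valid")  # shape (C,) for odd k
    # Sigmoid squashes scores to per-channel importance weights.
    return 1.0 / (1.0 + np.exp(-conv))

def select_partial_channels(feature_map, weights, ratio=0.25):
    """Keep only the top-weighted fraction of channels for the mixed op."""
    n_keep = max(1, int(feature_map.shape[0] * ratio))
    top = np.sort(np.argsort(weights)[-n_keep:])      # most important channels
    return feature_map[top], top
```

Selecting by attention weight, rather than sampling channels uniformly at random as in partial-channel DARTS variants, is what the abstract credits for the improved search stability.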

Subjects

*DEEP learning
*MEMORY

Details

Language :
English
ISSN :
0893-6080
Volume :
166
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
171586341
Full Text :
https://doi.org/10.1016/j.neunet.2023.07.029