
Pareto-Optimal Progressive Neural Architecture Search

Authors :
Stefano Samele
Matteo Matteucci
Eugenio Lomurno
Danilo Ardagna
Source :
GECCO Companion, Proceedings of the Genetic and Evolutionary Computation Conference Companion
Publication Year :
2021
Publisher :
Association for Computing Machinery, New York, NY, United States, 2021.

Abstract

Neural Architecture Search (NAS) is the process of automating architecture engineering, searching for the best deep learning configuration. One of the main NAS approaches proposed in the literature, Progressive Neural Architecture Search (PNAS), searches for architectures with a sequential model-based optimization strategy: it defines a common recursive structure for generating the networks, whose number of building blocks grows over the iterations. However, NAS algorithms are generally designed for an ideal setting, without considering the needs and technical constraints imposed by practical applications. In this paper, we propose a new architecture search method named Pareto-Optimal Progressive Neural Architecture Search (POPNAS), which combines the benefits of PNAS with a time-accuracy Pareto optimization problem. POPNAS adds a new time predictor to the existing approach, carrying out a joint prediction of training time and accuracy for each candidate neural network and restricting the search to the resulting Pareto front. This allows us to trade off accuracy against training time, identifying neural network architectures with competitive accuracy at a drastically reduced training time.
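The abstract describes scoring each candidate network with two predictors (training time and accuracy) and then searching only the non-dominated candidates. Below is a minimal Python sketch of that Pareto-front selection step under stated assumptions: the Candidate class, its predicted scores, and the example pool are hypothetical illustrations, not the authors' implementation, which additionally trains the predictors and expands architectures progressively.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    arch: str              # hypothetical encoding of a candidate architecture
    pred_accuracy: float   # score from the (assumed) accuracy predictor
    pred_time: float       # score from the (assumed) training-time predictor

def pareto_front(candidates: List[Candidate]) -> List[Candidate]:
    """Keep the non-dominated candidates: c is dominated if some other
    candidate has accuracy >= and time <= c's, with at least one strict."""
    front = []
    for c in candidates:
        dominated = any(
            o.pred_accuracy >= c.pred_accuracy
            and o.pred_time <= c.pred_time
            and (o.pred_accuracy > c.pred_accuracy or o.pred_time < c.pred_time)
            for o in candidates
        )
        if not dominated:
            front.append(c)
    # Cheapest-to-train architectures first
    return sorted(front, key=lambda c: c.pred_time)

if __name__ == "__main__":
    pool = [
        Candidate("A", 0.92, 120.0),
        Candidate("B", 0.90, 60.0),   # on the front: faster than A
        Candidate("C", 0.89, 90.0),   # dominated by B (less accurate, slower)
        Candidate("D", 0.95, 300.0),  # on the front: most accurate
    ]
    for c in pareto_front(pool):
        print(f"{c.arch}: acc={c.pred_accuracy:.2f}, time={c.pred_time:.0f}s")
```

Restricting the progressive expansion to this front is what yields the paper's trade-off: only architectures that are competitive in predicted accuracy for their predicted training cost are carried forward and actually trained.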

Details

Language :
English
ISBN :
978-1-4503-8351-6
Database :
OpenAIRE
Journal :
GECCO Companion, Proceedings of the Genetic and Evolutionary Computation Conference Companion
Accession number :
edsair.doi.dedup.....fea4383d4f60fbb5489d6bea867d878b