
RNAS-CL: Robust Neural Architecture Search by Cross-Layer Knowledge Distillation

Authors :
Nath, Utkarsh
Wang, Yancheng
Yang, Yingzhen
Publication Year :
2023

Abstract

Deep neural networks are vulnerable to adversarial attacks. Neural Architecture Search (NAS), one of the driving tools behind deep neural networks, achieves superior prediction accuracy in various machine learning applications. However, it is unclear how it performs against adversarial attacks. Given a robust teacher, it would be interesting to investigate whether NAS can produce a robust neural architecture by inheriting robustness from the teacher. In this paper, we propose Robust Neural Architecture Search by Cross-Layer Knowledge Distillation (RNAS-CL), a novel NAS algorithm that improves robustness by learning from a robust teacher through cross-layer knowledge distillation. Unlike previous knowledge distillation methods that encourage close student/teacher outputs only at the last layer, RNAS-CL automatically searches for the best teacher layer to supervise each student layer. Experimental results evidence the effectiveness of RNAS-CL and show that it produces small and robust neural architectures.

Comment: 17 pages, 12 figures
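The cross-layer supervision idea described in the abstract lends itself to a small illustration. Below is a minimal PyTorch sketch, not the authors' implementation: it assumes a soft relaxation in which each student layer holds learnable logits over the teacher's layers, so the softmax of those logits decides which teacher layer supervises it. The class name CrossLayerKD, the per-layer linear projections, and the feature dimensions are all hypothetical names introduced here for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CrossLayerKD(nn.Module):
        """Sketch of cross-layer KD: each student layer learns which
        teacher layer to imitate via a softmax over selection logits
        (an assumed relaxation of the discrete layer search)."""

        def __init__(self, num_student_layers, num_teacher_layers, s_dim, t_dim):
            super().__init__()
            # One row of selection logits per student layer.
            self.logits = nn.Parameter(
                torch.zeros(num_student_layers, num_teacher_layers)
            )
            # Project student features into the teacher feature space
            # so the alignment loss is well-defined.
            self.proj = nn.ModuleList(
                nn.Linear(s_dim, t_dim) for _ in range(num_student_layers)
            )

        def forward(self, student_feats, teacher_feats):
            # student_feats: list of (batch, s_dim) tensors
            # teacher_feats: list of (batch, t_dim) tensors
            t_stack = torch.stack(teacher_feats, dim=0)  # (T, batch, t_dim)
            loss = 0.0
            for i, s in enumerate(student_feats):
                # Soft selection weights over the teacher's layers.
                w = F.softmax(self.logits[i], dim=0)
                # Softmax-weighted teacher feature acts as the target;
                # detached because the teacher is fixed during distillation.
                target = (w[:, None, None] * t_stack).sum(dim=0)
                loss = loss + F.mse_loss(self.proj[i](s), target.detach())
            return loss / len(student_feats)

    # Toy usage with random tensors standing in for real activations.
    if __name__ == "__main__":
        kd = CrossLayerKD(num_student_layers=3, num_teacher_layers=5,
                          s_dim=64, t_dim=128)
        s_feats = [torch.randn(8, 64) for _ in range(3)]
        t_feats = [torch.randn(8, 128) for _ in range(5)]
        print(kd(s_feats, t_feats).item())

In a full pipeline this distillation term would be added to the usual task loss, and the selection logits would be optimized jointly with the architecture search; at convergence the argmax of each row indicates the chosen supervising teacher layer.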

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2301.08092
Document Type :
Working Paper
Full Text :
https://doi.org/10.1007/s11263-024-02133-4