
Enabling Secure NVM-Based in-Memory Neural Network Computing by Sparse Fast Gradient Encryption.

Authors :
Cai, Yi
Chen, Xiaoming
Tian, Lu
Wang, Yu
Yang, Huazhong
Source :
IEEE Transactions on Computers. Nov 2020, Vol. 69, Issue 11, p1596-1610. 15p.
Publication Year :
2020

Abstract

Neural network (NN) computing is energy-consuming on traditional computing systems, owing to the inherent memory wall bottleneck of the von Neumann architecture and the approaching end of Moore's Law. Non-volatile memories (NVMs) have been demonstrated as promising alternatives for constructing computing-in-memory (CIM) systems to accelerate NN computing. However, NVM-based NN computing systems are vulnerable to confidentiality attacks because the weight parameters persist in memory when the system is powered off, enabling an adversary with physical access to extract the well-trained NN models. The goal of this article is to find a solution for thwarting confidentiality attacks. We define and model the weight encryption problem. Then we propose an effective framework, containing a sparse fast gradient encryption (SFGE) method and a runtime encryption scheduling (RES) scheme, to guarantee the confidentiality of NN models with negligible performance overhead. Moreover, we improve the SFGE method by incrementally generating the encryption keys. Additionally, we provide variants of the encryption method to better fit quantized models and various mapping strategies. The experiments demonstrate that by encrypting only an extremely small proportion of the weights (e.g., 20 weights per layer in ResNet-101), the NN models can be strictly protected.
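The abstract's core idea, encrypting only a handful of gradient-selected weights per layer so the model is useless without the key, can be illustrated with a short sketch. The sketch below is only an interpretation of what the abstract describes, assuming a PyTorch model; the per-layer top-k selection, the FGSM-style sign perturbation, and the `epsilon` magnitude are illustrative assumptions, not the authors' exact SFGE algorithm.

```python
# Hypothetical sketch of SFGE-style weight encryption; not the paper's code.
import torch

def sfge_encrypt(model, loss_fn, x, y, k=20, epsilon=0.5):
    """Perturb the k most gradient-sensitive weights per layer.

    Returns a secret key (positions + original values) needed to
    restore the model; without it, accuracy collapses.
    """
    model.zero_grad()
    loss_fn(model(x), y).backward()  # gradients flag the most sensitive weights
    key = {}
    with torch.no_grad():
        for name, p in model.named_parameters():
            if p.grad is None or p.dim() < 2:  # skip biases etc.
                continue
            flat_grad = p.grad.view(-1)
            idx = torch.topk(flat_grad.abs(), min(k, flat_grad.numel())).indices
            key[name] = (idx, p.view(-1)[idx].clone())  # the secret key material
            # FGSM-style perturbation along the gradient sign (assumed form)
            p.view(-1)[idx] += epsilon * flat_grad[idx].sign()
    return key

def sfge_decrypt(model, key):
    """Restore the original weights from the key before inference."""
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in key:
                idx, original = key[name]
                p.view(-1)[idx] = original
```

In a deployed NVM-based CIM system, a runtime scheme such as the paper's RES would presumably apply the decryption step only transiently during inference, so that the persistent in-memory copy of the weights always remains in the encrypted state.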

Details

Language :
English
ISSN :
0018-9340
Volume :
69
Issue :
11
Database :
Academic Search Index
Journal :
IEEE Transactions on Computers
Publication Type :
Academic Journal
Accession number :
146359112
Full Text :
https://doi.org/10.1109/TC.2020.3017870