Maximizing Parallel Activation of Word-Lines in MRAM-Based Binary Neural Network Accelerators
- Source :
- IEEE Access, Vol. 9, pp. 141961-141969 (2021)
- Publication Year :
- 2021
- Publisher :
- IEEE
Abstract
- Magnetic RAM (MRAM)-based crossbar arrays have great potential as a platform for in-memory binary neural network (BNN) computing. However, the number of word-lines that can be activated simultaneously is limited by the low $I_{H}/I_{L}$ ratio of MRAM, which makes BNNs more vulnerable to device variation. To address this issue, we propose an algorithm/hardware co-design methodology. First, we choose a promising memristor crossbar array (MCA) structure based on a sensitivity analysis with respect to process variations. Since the selected MCA structure becomes more tolerant to device variation as the number of 1s in the input activations decreases, we apply an input-distribution regularization scheme that reduces the number of 1s in the BNN inputs during training. We further improve robustness against device variation by adopting a retraining scheme based on knowledge distillation. Experimental results show that the proposed method makes BNNs more tolerant to MRAM variation and significantly increases the number of word-lines activated in parallel, thereby improving throughput and energy efficiency.
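The training objective described in the abstract combines a task loss with a penalty on the fraction of 1s in the binary input activations and a knowledge-distillation term. The paper does not give the exact formulation, so the following is a minimal NumPy sketch under assumed definitions: `ones_ratio_penalty` is a plain mean-of-ones regularizer, `distillation_loss` is the standard temperature-scaled KL divergence between teacher and student outputs, and the weights `lam`, `alpha`, and temperature `T` are hypothetical hyperparameters, not values from the paper.

```python
import numpy as np

def ones_ratio_penalty(binary_acts):
    # Fraction of 1s in the binarized input activations; minimizing this
    # reduces simultaneous high-current word-line activity (assumed form).
    return float(np.mean(binary_acts))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Standard KD term: KL(teacher || student) on temperature-softened
    # distributions, scaled by T^2 (Hinton-style; exact form in the paper
    # may differ).
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

def total_loss(task_loss, student_logits, teacher_logits, binary_acts,
               lam=0.1, alpha=0.5):
    # Hypothetical combined objective: task loss + input-ones regularizer
    # + distillation term. lam and alpha are illustrative weights.
    return (task_loss
            + lam * ones_ratio_penalty(binary_acts)
            + alpha * distillation_loss(student_logits, teacher_logits))
```

With identical teacher and student logits the KD term vanishes, so the combined loss reduces to the task loss plus the weighted ones penalty.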
Details
- Language :
- English
- ISSN :
- 21693536
- Volume :
- 9
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.316a41a2612480b885733c178414d2d
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2021.3121011