Maximizing Parallel Activation of Word-Lines in MRAM-Based Binary Neural Network Accelerators
- Source :
- IEEE Access, Vol. 9, pp. 141961-141969 (2021)
- Publication Year :
- 2021
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2021.
-
Abstract
- Magnetic RAM (MRAM)-based crossbar arrays have great potential as a platform for in-memory binary neural network (BNN) computing. However, the number of word-lines that can be activated simultaneously is limited by the low $I_{H}/I_{L}$ ratio of MRAM, which makes BNNs more vulnerable to device variation. To address this issue, we propose an algorithm/hardware co-design methodology. First, we choose a promising memristor crossbar array (MCA) structure based on a sensitivity analysis with respect to process variations. Since the selected MCA structure becomes more tolerant to device variation as the number of 1s in the input activations decreases, we apply an input distribution regularization scheme that reduces the number of 1s in the inputs of BNNs during training. We further improve the robustness against device variation by adopting a retraining scheme based on knowledge distillation. Experimental results show that the proposed method makes BNNs more tolerant to MRAM variation and significantly increases the number of word-lines activated in parallel, thereby improving throughput and energy efficiency.
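The input distribution regularization described above can be illustrated with a minimal sketch. The abstract does not give the authors' exact formulation, so the sigmoid surrogate and the `beta` sharpness parameter below are assumptions: the idea is simply to add a differentiable penalty proportional to the fraction of pre-binarization activations that would binarize to +1, so training is steered toward inputs with fewer 1s.

```python
import numpy as np

def ones_ratio_penalty(pre_act, beta=10.0):
    """Differentiable surrogate for the fraction of +1s after binarization.

    sigmoid(beta * x) is close to 1 when x > 0 (the activation would
    binarize to +1) and close to 0 when x < 0, so the mean approximates
    the ratio of 1s in the binarized input. `beta` and the sigmoid
    surrogate are illustrative choices, not the paper's exact scheme.
    """
    soft_ones = 1.0 / (1.0 + np.exp(-beta * np.asarray(pre_act)))
    return float(soft_ones.mean())

# Shifting pre-activations downward produces fewer +1s after
# binarization, which this penalty rewards with a smaller value.
x = np.array([0.8, -0.3, 1.2, -0.9, 0.1])
print(ones_ratio_penalty(x))        # mixed signs: moderate penalty
print(ones_ratio_penalty(x - 1.0))  # mostly negative: smaller penalty
```

In training, a term like `loss = task_loss + lam * ones_ratio_penalty(pre_act)` (with a weight `lam`, also an assumption here) would trade task accuracy against a sparser distribution of 1s, which per the abstract makes the selected MCA structure more variation-tolerant.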
- Subjects :
- device variation
Magnetoresistive random-access memory
General Computer Science
Artificial neural network
Computer science
General Engineering
Memristor
Topology
Regularization (mathematics)
TK1-9971
in-memory computing
Robustness (computer science)
binary neural network
General Materials Science
Electrical engineering. Electronics. Nuclear engineering
Sensitivity (control systems)
Magnetic RAM
Throughput (business)
Word (computer architecture)
Details
- ISSN :
- 2169-3536
- Volume :
- 9
- Database :
- OpenAIRE
- Journal :
- IEEE Access
- Accession number :
- edsair.doi.dedup.....69a2ca8d28cc6b81d5a632253e95471c
- Full Text :
- https://doi.org/10.1109/access.2021.3121011