
Accurate deep neural network inference using computational phase-change memory

Authors :
Joshi, Vinay
Le Gallo, Manuel
Haefeli, Simon
Boybat, Irem
Nandakumar, S. R.
Piveteau, Christophe
Dazzi, Martino
Rajendran, Bipin
Sebastian, Abu
Eleftheriou, Evangelos
Source :
Nature Communications 11, Article number: 2473 (2020)
Publication Year :
2019

Abstract

In-memory computing is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. Crossbar arrays of resistive memory devices can be used to encode the network weights and perform efficient analog matrix-vector multiplications without intermediate movements of data. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in a significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to in-memory computing hardware based on phase-change memory (PCM). We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on the CIFAR-10 dataset and a top-1 accuracy of 71.6% on the ImageNet benchmark after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights of the network is programmed on just two PCM devices organized in a differential configuration.

Comment: This is a pre-print of an article accepted for publication in Nature Communications.
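The abstract touches on two implementable ideas: training the network with injected weight noise so it tolerates PCM programming errors, and mapping each trained weight onto two conductances in a differential configuration, with a global scale factor that can counteract conductance drift. Below is a minimal PyTorch sketch of both, not the authors' code: the noise fraction `eta`, conductance limit `g_max`, and compensation factor `alpha` are illustrative assumptions, and the paper's actual compensation method additionally exploits the batch normalization parameters, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLinear(nn.Linear):
    """Linear layer that adds Gaussian noise to its weights during training.

    The noise std is a fraction (eta) of the layer's maximum absolute
    weight, a simple stand-in for PCM programming noise; eta is an
    assumed hyperparameter, not a value from the paper."""

    def __init__(self, in_features, out_features, eta=0.05):
        super().__init__(in_features, out_features)
        self.eta = eta

    def forward(self, x):
        if self.training:
            sigma = self.eta * self.weight.abs().max()
            noisy_w = self.weight + torch.randn_like(self.weight) * sigma
            return F.linear(x, noisy_w, self.bias)
        return super().forward(x)


def to_differential_conductances(w, g_max=25.0):
    """Split a weight tensor into a (G+, G-) pair per weight: positive
    weights are programmed on G+, negative ones on G-. g_max is an
    assumed per-device conductance limit (arbitrary units)."""
    scale = g_max / w.abs().max()
    return F.relu(w) * scale, F.relu(-w) * scale


def compensated_mvm(x, g_pos, g_neg, alpha=1.0):
    """Emulated analog matrix-vector multiply on the differential pair:
    y = alpha * x @ (G+ - G-)^T. A single global factor alpha can be
    re-estimated over time (e.g., from a calibration input) to counter
    conductance drift."""
    return alpha * F.linear(x, g_pos - g_neg)


# Usage sketch: train with injected noise, then "program" the devices.
layer = NoisyLinear(64, 10)
x = torch.randn(8, 64)
layer.train(); _ = layer(x)  # noisy forward pass during training
g_pos, g_neg = to_differential_conductances(layer.weight.detach())
y = compensated_mvm(x, g_pos, g_neg, alpha=0.97)  # alpha is illustrative
```

In a real deployment the scale factor would be re-estimated periodically, since PCM conductances decay over time; here it is simply passed in as a constant.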

Details

Database :
arXiv
Journal :
Nature Communications 11, Article number: 2473 (2020)
Publication Type :
Report
Accession number :
edsarx.1906.03138
Document Type :
Working Paper
Full Text :
https://doi.org/10.1038/s41467-020-16108-9