Statistical Training for Neuromorphic Computing using Memristor-based Crossbars Considering Process Variations and Noise

Ying Zhu1,a, Grace Li Zhang1,b, Tianchen Wang2,e, Bing Li1,c, Yiyu Shi2,f, Tsung-Yi Ho3 and Ulf Schlichtmann1,d

1Technical University of Munich
aying.zhu@tum.de
bgrace-li.zhang@tum.de
cb.li@tum.de
dulf.schlichtmann@tum.de
2University of Notre Dame
etwang9@nd.edu
fyshi4@nd.edu
3National Tsing Hua University
tyho@cs.nthu.edu.tw

ABSTRACT

Memristor-based crossbars are an attractive platform for accelerating neuromorphic computing. However, process variations during manufacturing and noise in memristors cause significant accuracy loss if left unaddressed. In this paper, we propose to model process variations and noise as correlated random variables and to incorporate them into the cost function during training. The weights produced by this statistical training are more robust and, together with global variation compensation, provide a stable inference accuracy. Simulation results demonstrate that the mean and the standard deviation of the inference accuracy can be improved by up to 54% and 31%, respectively, in a two-layer fully connected neural network.
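The core idea of variation-aware training can be illustrated with a minimal sketch: perturb the weights with random multiplicative noise (emulating memristor conductance variation) during every training forward pass, so that gradient descent steers the weights toward a solution that remains accurate under fresh variation samples at inference time. The noise model below (independent Gaussian multiplicative noise with strength `sigma` on a toy logistic-regression task) is an illustrative assumption, not the paper's correlated variation model or network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

W = rng.normal(0, 0.1, (2, 1))
b = np.zeros(1)
sigma = 0.1   # assumed variation strength (illustrative)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    # Multiplicative noise on the weights stands in for
    # conductance variation of the memristor cells.
    noise = rng.normal(1.0, sigma, W.shape)
    Wn = W * noise
    p = sigmoid(X @ Wn + b).ravel()
    # Cross-entropy gradient, chained through the noisy weights Wn = W * noise.
    grad = (X.T @ (p - y)[:, None]) * noise / len(y)
    W -= lr * grad
    b -= lr * np.mean(p - y)

# Evaluate accuracy under fresh variation samples, mimicking
# many fabricated crossbar instances.
accs = []
for _ in range(20):
    Wn = W * rng.normal(1.0, sigma, W.shape)
    pred = (sigmoid(X @ Wn + b).ravel() > 0.5).astype(int)
    accs.append((pred == y).mean())
print(f"mean accuracy under variation: {np.mean(accs):.2f}")
```

Averaging the accuracy over many noise draws corresponds to evaluating the mean and spread of inference accuracy across device instances, the quantities the abstract reports improvements for.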
