An Efficient Programming Framework for Memristor-based Neuromorphic Computing
Grace Li Zhang1,a, Bing Li1,b, Xing Huang1,c, Chen Shen1,d, Shuhang Zhang1,e, Florin Burcea1,f, Helmut Graeb1,g, Tsung-Yi Ho2,h, Hai (Helen) Li3,i and Ulf Schlichtmann1,j
1Technical University of Munich,
2National Tsing Hua University
3Duke University
a grace-li.zhang@tum.de
b b.li@tum.de
c xing.huang@tum.de
d chen.shen@tum.de
e shuhang.zhang@tum.de
f florin.burcea@tum.de
g helmut.graeb@tum.de
h tyho@cs.nthu.edu.tw
i hai.li@duke.edu
j ulf.schlichtmann@tum.de
ABSTRACT
Memristor-based crossbars are considered promising candidates to accelerate vector-matrix computation in deep neural networks. Before being applied for inference, the memristors in the crossbars must be programmed to conductances corresponding to the network weights obtained from software training. Existing programming methods, however, adjust the conductances of memristors individually, requiring many programming-reading cycles. In this paper, we propose an efficient programming framework for memristor crossbars, in which the programming process is partitioned into a predictive phase and a fine-tuning phase. In the predictive phase, multiple memristors are programmed simultaneously using a memristor programming model and IR-drop estimation. To compensate for the programming inaccuracy caused by process variations, noise, and IR-drop, and to move the conductances to their target values, the memristors are afterwards fine-tuned until a specified programming accuracy is reached. Simulation results demonstrate that the proposed method reduces the number of programming-reading cycles by up to 94.77% and 90.61% compared with existing one-by-one and row-by-row programming methods, respectively.
Keywords: Memristor, Neuromorphic Computing, Programming, IR-Drop, Process Variations, Noise
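To make the two-phase idea in the abstract concrete, the following is a minimal Python sketch of predictive programming followed by fine-tuning on a simulated crossbar. It is purely illustrative and not the authors' implementation; the function name, the noise magnitudes standing in for process variations, noise, and IR-drop, and the correction rule in the fine-tuning loop are all assumptions.

```python
import numpy as np

def two_phase_programming(target_g, g_min=1e-6, g_max=1e-4,
                          tolerance=0.02, max_cycles=50, rng=None):
    """Toy sketch of two-phase crossbar programming (illustrative only).

    target_g : 2-D array of target conductances (S) derived from trained weights.
    Returns the programmed conductances and the number of programming-reading
    cycles spent in the fine-tuning phase.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Predictive phase: all memristors receive pulses computed from a
    # (hypothetical) programming model plus an IR-drop estimate, so no
    # read-back is needed here. The residual multiplicative error models
    # process variations, noise, and remaining IR-drop effects.
    g = target_g * (1.0 + rng.normal(0.0, 0.05, size=target_g.shape))
    g = np.clip(g, g_min, g_max)

    # Fine-tuning phase: memristors whose conductance still deviates are
    # adjusted one step per programming-reading cycle until the specified
    # accuracy is reached.
    cycles = 0
    while cycles < max_cycles:
        error = target_g - g                          # "read" the crossbar
        if np.max(np.abs(error)) / g_max <= tolerance:
            break
        step = 0.5 * error                            # partial correction per pulse
        g = np.clip(g + step + rng.normal(0.0, 0.01 * g_max, size=g.shape),
                    g_min, g_max)
        cycles += 1
    return g, cycles


if __name__ == "__main__":
    targets = np.random.default_rng(0).uniform(1e-6, 1e-4, size=(4, 4))
    g, cycles = two_phase_programming(targets)
    print("fine-tuning cycles used:", cycles)
```

In this sketch, the predictive phase costs no read-back cycles, so the total number of programming-reading cycles is determined only by how far the predictive step lands from the targets; this mirrors, under the stated assumptions, why a good programming model and IR-drop estimate reduce the overall cycle count.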