Approximate Computation of Post-Synaptic Spikes Reduces Bandwidth to Synaptic Storage in A Model of Cortex

Dimitrios Stathis (1,a), Yu Yang (1,b), Ahmed Hemani (1,c) and Anders Lansner (2)
(1) KTH Royal Institute of Technology, Stockholm, Sweden
(a) stathis@kth.se
(b) yuyang2@kth.se
(c) hemani@kth.se
(2) Stockholm University and KTH Royal Institute of Technology, Sweden
ala@kth.se

ABSTRACT

The Bayesian Confidence Propagation Neural Network (BCPNN) is a spiking model of the cortex. Its synaptic weights are organized as matrices that require substantial storage and high bandwidth to that storage. The algorithm accesses these matrices in a dual pattern, both row-wise and column-wise. In this work, we exploit an algorithmic optimization that eliminates the column-wise accesses: the new computation model approximates the computation of post-synaptic spikes with a predictor. We have adopted this approximate computation model to improve upon the previously reported ASIC implementation, eBrainII. We also present an error analysis showing that the approximation error is negligible. The reduction in storage and in bandwidth to the synaptic storage yields a 48% reduction in energy compared to eBrainII. The reported approximation method also applies to other neural network models based on Hebbian learning rules.

Keywords: Approximate Computing, Neuromorphic Hardware, ASIC, 3D DRAM, Bandwidth Optimization.
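The bandwidth argument behind eliminating one access direction can be made concrete with a generic sketch. The Python below is not the paper's predictor-based approximation; it illustrates a related, exact technique (deferring the column-wise Hebbian update triggered by a post-synaptic spike until the next row-wise read, so the weight matrix is only ever traversed row-wise). All names here (`on_post_spike`, `read_row`, `ETA`) are hypothetical and chosen for illustration only.

```python
# Sketch: deferring post-synaptic-spike weight updates so the synaptic weight
# matrix is touched only by sequential, row-wise accesses. A strided,
# column-wise write per post spike is replaced by an append to a small log.

N_PRE, N_POST = 4, 4
ETA = 0.1  # hypothetical learning-rate constant

W = [[0.0] * N_POST for _ in range(N_PRE)]   # weights, row-major storage
spike_log = []                               # (time, post_index) of deferred post spikes
row_synced_until = [0.0] * N_PRE             # per-row pointer into the spike history

def on_post_spike(j, t):
    # A post spike would normally trigger a column-wise write to W[:, j].
    # Logging it instead keeps all traffic to the weight storage row-wise.
    spike_log.append((t, j))

def read_row(i, t, pre_trace_i):
    # Before serving row i, replay any post spikes it has not yet absorbed.
    for ts, j in spike_log:
        if ts > row_synced_until[i]:
            W[i][j] += ETA * pre_trace_i     # simplified Hebbian increment
    row_synced_until[i] = t
    return W[i]

on_post_spike(2, t=1.0)                      # deferred: no column access here
row0 = read_row(0, t=2.0, pre_trace_i=1.0)   # row read absorbs the pending update
```

Other rows absorb the same deferred spike lazily on their own next read, so no column of W is ever traversed.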


