Robust Binary Neural Network against Noisy Analog Computation

Zong-Han Lee1,a, Fu-Cheng Tsai2 and Shih-Chieh Chang1,b
1Department of Computer Science, National Tsing-Hua University, Hsinchu, Taiwan
azonghan.l@gapp.nthu.edu.tw
bscchang@cs.nthu.edu.tw
2Electronic and Optoelectronic System Research Laboratories, Industrial Technology Research Institute, Hsinchu, Taiwan
itriA70513@itri.org.tw

ABSTRACT


Computing-in-memory (CIM) technology has shown promising results in reducing the energy consumption of battery-powered devices. Meanwhile, by reducing the cost of multiply-and-accumulate (MAC) operations, binary neural networks (BNNs) show the potential to approach the accuracy of full-precision models. This paper proposes a robust BNN model for the CIM framework that can tolerate analog noise. Analog noise, caused by various sources of variation such as process variation, can lead to low inference accuracy. We first observe that traditional batch normalization can make a BNN model susceptible to analog noise, and we propose a new approach that replaces batch normalization while retaining its advantages. Second, since noise can be eliminated when inputs are zero during MAC operations in a BNN, we also propose novel methods to increase the number of zeros in a convolution output. We apply our new BNN model to the keyword spotting application, and our results are very encouraging.
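To illustrate the intuition behind the zero-input observation, the following is a minimal NumPy sketch, assuming a simple additive-noise model in which only active (non-zero-input) CIM cells contribute noise to the accumulated result; the function noisy_mac, the noise level, and the input sparsity are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_mac(inputs, weights, noise_std=0.1):
    """Simulate an analog MAC: each active cell (non-zero input) contributes
    its product plus an analog noise term; cells with zero input draw no
    current and therefore add no noise (illustrative model only)."""
    active = inputs != 0
    noise = rng.normal(0.0, noise_std, size=weights.shape) * active
    return np.sum(inputs * weights + noise)

# Binary weights in {-1, +1}; inputs in {0, 1}, where zeros suppress noise.
w = rng.choice([-1, 1], size=128)
x_dense = np.ones(128)                                  # all inputs active
x_sparse = rng.choice([0, 1], size=128, p=[0.7, 0.3])   # mostly zero inputs

ideal_dense, ideal_sparse = np.sum(x_dense * w), np.sum(x_sparse * w)
err_dense = abs(noisy_mac(x_dense, w) - ideal_dense)
err_sparse = abs(noisy_mac(x_sparse, w) - ideal_sparse)
print(f"MAC error with dense inputs : {err_dense:.3f}")
print(f"MAC error with sparse inputs: {err_sparse:.3f}")
```

Under this assumed model, the sparser input vector activates fewer noisy cells, so the accumulated error tends to be smaller, which is the motivation for increasing the number of zeros in convolution outputs.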

Keywords: Deep Neural Networks, Analog AI, Noise Tolerance.


