Exploring Spike-Based Learning for Neuromorphic Computing: Prospects and Perspectives

Nitin Rathi, Amogh Agrawal, Chankyu Lee, Adarsh Kumar Kosta, and Kaushik Roy
School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, USA
rathi2@purdue.edu
agrawa64@purdue.edu
lee2216@purdue.edu
akosta@purdue.edu
kaushik@purdue.edu

ABSTRACT


Spiking neural networks (SNNs), which operate on sparse binary signals (spikes) and can be implemented on event-driven hardware, are potentially more energy-efficient than traditional artificial neural networks (ANNs). However, SNNs perform computations over time, and the neuron activation function does not have a well-defined derivative, leading to unique training challenges. In this paper, we discuss the various spike representations and training mechanisms for deep SNNs. Additionally, we review applications that go beyond classification, such as gesture recognition, motion estimation, and sequential learning. The unique features of SNNs, such as high activation sparsity and spike-based computation, can be leveraged in hardware implementations for energy-efficient processing. To that effect, we discuss various SNN implementations, using both digital ASICs and analog in-memory computing primitives. Finally, we present an outlook on future applications and open research areas for both SNN algorithms and hardware implementations.

Keywords: Spiking Neural Networks, Event Cameras, Spiking Backpropagation, Liquid State Machine, In-Memory Computing.
