Authors: Ilkin Aliyev, Kama Svoboda, Tosiron Adegbija, Jean-Marc Fellous
Abstract: Spiking Neural Networks (SNNs) are inspired by the sparse,
event-driven nature of biological neural processing and offer the potential
for ultra-low-power artificial intelligence. However, realizing their efficiency
benefits requires specialized hardware and a co-design approach that
effectively leverages sparsity. We explore the hardware-software co-design of
sparse SNNs, examining how sparsity representation, hardware architectures, and
training techniques influence hardware efficiency. We analyze the impact of
static and dynamic sparsity, discuss the implications of different neuron
models and encoding schemes, and investigate the need for adaptability in
hardware designs. Our work aims to illuminate the path towards embedded
neuromorphic systems that fully exploit the computational advantages of sparse
SNNs.
Source: http://arxiv.org/abs/2408.14437v1
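To make the abstract's notion of event-driven sparsity concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the canonical SNN neuron model. All parameter values (decay factor, threshold, input pattern) are illustrative assumptions, not taken from the paper.

```python
# Minimal LIF neuron sketch illustrating event-driven sparsity:
# the membrane potential integrates input with leak, and a binary
# spike is emitted only when the potential crosses a threshold.

def lif_run(input_current, decay=0.9, threshold=1.0):
    """Simulate one LIF neuron; return its binary spike train."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = decay * v + i       # leaky integration of the input
        if v >= threshold:      # fire when the potential crosses threshold
            spikes.append(1)
            v = 0.0             # hard reset after a spike
        else:
            spikes.append(0)
    return spikes

# A mostly-silent input drives only occasional spikes, so downstream
# computation is needed only at the (sparse) spike events.
inputs = [0.0] * 20
inputs[3] = 1.2                 # one strong input event
inputs[10] = 0.6                # two sub-threshold events that sum up
inputs[11] = 0.6
spikes = lif_run(inputs)
sparsity = 1 - sum(spikes) / len(spikes)   # fraction of silent timesteps
```

In this toy run only 2 of 20 timesteps produce a spike (90% sparsity); hardware that skips silent timesteps, as the co-designed architectures surveyed here aim to do, avoids most of the work a dense accelerator would perform.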