Adapting unconstrained spiking neural networks to explore the effects of time discretization on network properties

Correlation between step size and accuracy on a real-world task

Abstract

Spiking Neural Networks (SNNs) represent a distinct class of neural network models that incorporate an additional temporal dimension. Neurons within an SNN operate according to the Leaky Integrate-and-Fire principle, governed by ordinary differential equations. Neurons communicate across layers by propagating spikes whenever their membrane potentials reach a specified threshold. This mechanism renders conventional Artificial Neural Network (ANN) design principles and learning rules inapplicable to SNNs. This study presents a theoretical investigation into the impact of time discretization on SNN performance in real-world tasks. We present a network architecture based on the Error Backpropagation Through Spikes model, which transforms the original system of differential equations into a system of difference equations over multiple time steps. We evaluate the model's performance on the MNIST dataset and identify several phenomena that influence accuracy. Our analysis focuses primarily on gradient dynamics and the spectral properties of the voltage signals. These findings contribute to a deeper understanding of SNN behavior and potential optimization strategies.
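To illustrate the discretization the abstract describes, the sketch below applies a forward-Euler step to the standard Leaky Integrate-and-Fire ODE, turning it into a difference equation with step size `dt`. This is a generic LIF discretization, not the specific Error Backpropagation Through Spikes model; all parameter values (`tau`, `v_th`, `v_reset`) are illustrative assumptions.

```python
import numpy as np

def lif_step(v, input_current, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """One forward-Euler step of a leaky integrate-and-fire neuron.

    Continuous dynamics: tau * dv/dt = -v + I
    Discretized:         v[t+dt] = v[t] + dt * (-v[t] + I) / tau
    A spike is emitted when v crosses v_th, after which v is reset.
    Parameter values are illustrative, not taken from the study.
    """
    v = v + dt * (-v + input_current) / tau
    spikes = v >= v_th
    v = np.where(spikes, v_reset, v)  # hard reset after a spike
    return v, spikes

# Drive a single neuron with a constant suprathreshold current
# and count the spikes it emits over 200 discrete time steps.
v = np.zeros(1)
n_spikes = 0
for _ in range(200):
    v, s = lif_step(v, input_current=1.5, dt=1.0)
    n_spikes += int(s.sum())
```

Shrinking `dt` makes the difference equation track the underlying ODE more closely, at the cost of more simulation steps; the trade-off between this step size and task accuracy is the question the abstract poses.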