Memory Mechanisms in Spiking Neural Networks

Abstract

Neuromorphic sensors, such as event cameras, detect incremental changes in the sensed quantity and communicate these via a stream of events. Desirable properties of these signals, such as high temporal resolution and asynchrony, are not always fully exploited by the algorithms that process them. Spiking neural networks (SNNs) have emerged as algorithms that promise to exploit these characteristics best, and they are likely key to achieving a fully neuromorphic computing pipeline. For an SNN to take full advantage, however, the event stream must be fed to it directly and unaltered, which in turn implies that all temporal integration must occur inside the SNN. This makes the mechanisms that achieve such integration worth investigating. This thesis does so by evaluating and comparing the performance of different memory mechanisms for SNNs found in the literature, as well as through an in-depth analysis of the inner workings of these mechanisms. The mechanisms include spiking neuronal dynamics (leaks and thresholds), explicit recurrent connections, and propagation delays. We first demonstrate our concepts in preliminary experiments on two small-scale, generated 1D moving-pixel tasks. We then extend our research to compare the memory mechanisms on a real-world neuromorphic vision processing task, in which the networks regress angular velocity from event-based input. We find that both explicit recurrence and delays improve the prediction accuracy of the SNN compared to having spiking neuronal dynamics alone. Analysis of the inner workings of the networks shows that the threshold and reset mechanisms of spiking neurons play an important role in enabling longer neuron timescales (lower membrane leak); forgetting (at the right time) turns out to play an important role in memory. Additionally, it becomes apparent that optimizing an SNN with explicit recurrent connections or learnable delays does not lead to the formation of robust spiking neuronal dynamics. In fact, the spiking neuronal dynamics are largely ignored: after optimization, virtually no input current is integrated onto the membrane potential in these cases. Instead, we consistently find that a recurrent SNN prefers to build a state solely with its explicit recurrent connections, while an SNN with delays prefers to rely on the delays alone. Our SNNs with explicit recurrent connections and delays are therefore better described as binary-activated RNNs and ANNs, respectively.
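To make the three mechanism families named above concrete, the sketch below shows a discrete-time leaky integrate-and-fire (LIF) layer with an explicit recurrent term, and a simple per-input delay buffer. This is an illustrative assumption of how such mechanisms are commonly formulated, not the implementation used in the thesis; all names and parameter values (`lif_step`, `leak`, `v_th`, the hard reset, the integer delays) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_step(v, s_prev, x_t, w_in, w_rec, leak=0.9, v_th=1.0):
    """One discrete-time step of a leaky integrate-and-fire (LIF) layer
    with explicit recurrent connections (illustrative sketch).

    Memory mechanisms shown:
      - leak:      retains a fraction of the membrane potential over time
      - threshold: gates when information leaves the neuron as a spike
      - reset:     erases the membrane after a spike ("forgetting")
      - w_rec:     feeds the previous spikes back as extra input
    """
    i_t = w_in @ x_t + w_rec @ s_prev   # integrate input + recurrent feedback
    v = leak * v + i_t                  # leaky integration onto the membrane
    s = (v >= v_th).astype(v.dtype)     # spike where the threshold is crossed
    v = v * (1.0 - s)                   # hard reset of neurons that spiked
    return v, s

def delayed(spike_history, delays):
    """Propagation delays as a third mechanism: per input, read the spike
    that arrives now, i.e. the one emitted `d` steps ago (hypothetical)."""
    t = len(spike_history) - 1
    return np.array([spike_history[max(t - d, 0)][j]
                     for j, d in enumerate(delays)])

# Tiny usage example: 3 binary inputs driving 4 neurons for a few steps.
n_in, n_out, steps = 3, 4, 5
w_in = rng.normal(size=(n_out, n_in))
w_rec = rng.normal(size=(n_out, n_out)) * 0.1
delays = rng.integers(0, 3, size=n_in)          # per-input integer delays
v, s = np.zeros(n_out), np.zeros(n_out)
history = []
for t in range(steps):
    x_t = rng.integers(0, 2, size=n_in).astype(float)   # binary input events
    history.append(x_t)
    v, s = lif_step(v, s, delayed(history, delays), w_in, w_rec)
    print(f"t={t}  spikes={s.astype(int)}")
```

Under this reading, the abstract's conclusion corresponds to optimization driving `leak` toward zero: with no membrane integration, a network with `w_rec` behaves like a binary-activated RNN, and one relying only on `delayed` inputs like a feed-forward ANN over a time window.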
