Spike Time Sensitivity in Spiking Neural Networks

Investigating the Effect of Sample Difficulty in Time-to-First-Spike Coded Spiking Neural Networks

Master Thesis (2025)
Author(s)

E. Aydoslu (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

N. Tömen – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

O. Booij – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Jan van Gemert – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Aurora Micheli – Mentor (TU Delft - Pattern Recognition and Bioinformatics)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
30-06-2025
Awarding Institution
Delft University of Technology
Project
Master Thesis
Programme
Computer Science
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Spiking neural networks (SNNs) with Time-to-First-Spike (TTFS) coding promise rapid, sparse, and energy-efficient inference. However, the impact of sample difficulty on TTFS dynamics remains underexplored. We investigate (i) how input hardness influences first-spike timing and (ii) whether training on hard samples expedites inference. By quantifying difficulty via geometric margins and Gaussian-noise perturbations, and modeling leaky integrate-and-fire dynamics as Gaussian random walks, we derive first-hitting-time predictions. We further show that training-time noise, akin to ridge regularization, reduces weight variance and increases expected spike latencies. Empirical results on a synthetic task, MNIST, NMNIST, and CIFAR-10 with spiking MLPs/CNNs confirm that harder inputs slow inference and noise-trained models trade robustness for latency. Our findings align TTFS behavior with drift-diffusion models and provide a framework for balancing speed and robustness in neuromorphic SNNs.
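The abstract's core modeling idea, treating leaky integrate-and-fire membrane dynamics as a Gaussian random walk whose first threshold crossing is the first spike, can be illustrated with a minimal simulation. The sketch below is illustrative only and not taken from the thesis: the drift `mu` (standing in for input strength, i.e., sample easiness), noise scale `sigma`, and threshold are hypothetical parameters, and the `first_spike_time` helper is an assumed name. It shows the qualitative prediction that lower drift (harder inputs) yields later first-hitting times.

```python
import numpy as np

def first_spike_time(mu, sigma, threshold=1.0, dt=1e-3, t_max=1.0, rng=None):
    """Simulate the membrane potential as a drift-diffusion (Gaussian
    random walk) process and return the first time it crosses the firing
    threshold, or t_max if no spike occurs within the window."""
    rng = np.random.default_rng(0) if rng is None else rng
    v, t = 0.0, 0.0
    while t < t_max:
        # Euler-Maruyama step: drift mu plus diffusive noise sigma
        v += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if v >= threshold:
            return t
    return t_max  # no spike within the simulation window

rng = np.random.default_rng(42)
# "Easy" sample: strong drift toward threshold; "hard": weak drift
easy = np.mean([first_spike_time(5.0, 0.5, rng=rng) for _ in range(200)])
hard = np.mean([first_spike_time(2.0, 0.5, rng=rng) for _ in range(200)])
```

Averaged over trials, the mean first-hitting time scales roughly as threshold/drift, so the harder (low-drift) condition spikes later, matching the abstract's claim that harder inputs slow TTFS inference.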

Files

Thesis_Final_Version.pdf
(pdf | 3.97 MB)
License info not available
Article_Final_Version.pdf
(pdf | 1.1 MB)
License info not available