SiLIF: Structured State Space Model Dynamics and Parametrization for Spiking Neural Networks
New spiking neuron architecture matches or beats state space models on speech recognition, in some cases at half the compute.
A team of researchers has published a paper on arXiv introducing SiLIF (SSM-inspired Leaky Integrate-and-Fire), a new architecture that could make spiking neural networks (SNNs) a practical alternative to standard deep learning models. The work, led by Maxime Fabre, Lyubov Dudchenko, and Emre Neftci, directly addresses the training instability and scalability problems that have historically limited SNNs.
The researchers introduce two SiLIF neuron models. The first extends a two-state neuron with a learnable discretization timestep and a logarithmic reparametrization that stabilizes training. The second, more advanced model adopts the initialization scheme and structure of complex-state SSMs, which lets the network learn oscillatory dynamics. Both models set new state-of-the-art results among spiking neuron models on event-based and raw-audio speech recognition benchmarks.
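To make the first model concrete, here is a minimal sketch of a two-state LIF neuron whose discretization timestep is learned in log-space, so it stays positive and gradient steps act multiplicatively. All names, constants, and the soft-reset rule are illustrative assumptions, not the authors' implementation:

```python
import math
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a boxcar surrogate gradient, a standard SNN training trick."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients through only near the threshold (|v| < 0.5).
        return grad_out * (v.abs() < 0.5).float()

spike = SpikeFn.apply

class TwoStateLIF(nn.Module):
    """Hypothetical two-state LIF neuron with a learnable, log-reparametrized timestep."""

    def __init__(self, size, dt=1e-2, tau_syn=1e-2, tau_mem=2e-2, v_th=1.0):
        super().__init__()
        # Learning log(dt) keeps the effective timestep positive and
        # makes updates multiplicative, which helps stabilize training.
        self.log_dt = nn.Parameter(torch.full((size,), math.log(dt)))
        self.log_tau_syn = nn.Parameter(torch.full((size,), math.log(tau_syn)))
        self.log_tau_mem = nn.Parameter(torch.full((size,), math.log(tau_mem)))
        self.v_th = v_th

    def forward(self, x):  # x: (time, batch, size) input currents
        dt = self.log_dt.exp()
        beta = torch.exp(-dt / self.log_tau_syn.exp())   # synaptic-current decay
        alpha = torch.exp(-dt / self.log_tau_mem.exp())  # membrane decay
        i = torch.zeros_like(x[0])
        v = torch.zeros_like(x[0])
        spikes = []
        for x_t in x:
            i = beta * i + x_t               # state 1: synaptic current
            v = alpha * v + (1 - alpha) * i  # state 2: membrane potential
            s = spike(v - self.v_th)
            v = v - s * self.v_th            # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes)
```

In this picture, the paper's second model would roughly replace the real-valued decay alpha with a complex one, alpha = exp((-1/tau + i*omega) * dt), initialized SSM-style, so the membrane state can oscillate before spiking; the exact formulation in the paper may differ.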
The most significant finding is the efficiency gain. The paper shows that SiLIF models achieve a favorable performance-efficiency trade-off relative to modern state space models such as Mamba, in some cases surpassing SSM performance at roughly half the computational cost. Part of this efficiency comes from the biologically inspired use of synaptic delays within the network's architecture. This positions SNNs, long touted for their energy efficiency on neuromorphic hardware, as competitive in accuracy on standard tasks as well.
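A synaptic delay simply means a connection reads a past timestep of its input rather than the current one. Below is a minimal, hypothetical sketch of a linear layer with fixed per-channel integer delays; the paper's delay mechanism and parametrization (including whether delays are learned) may differ:

```python
import torch
import torch.nn as nn

class DelayedLinear(nn.Module):
    """Linear layer whose inputs arrive through per-channel integer delays.
    Delays are fixed and random here purely for illustration."""

    def __init__(self, in_features, out_features, max_delay=8):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.max_delay = max_delay
        # One delay (in timesteps) per input channel -- an assumption.
        self.register_buffer(
            "delays", torch.randint(0, max_delay + 1, (in_features,))
        )

    def forward(self, spikes):  # spikes: (time, batch, in_features)
        T, B, C = spikes.shape
        # Zero-pad the past so reads before t=0 return silence.
        padded = torch.cat([spikes.new_zeros(self.max_delay, B, C), spikes], dim=0)
        # delayed[t, b, c] = spikes[t - delays[c], b, c]
        idx = (
            torch.arange(T, device=spikes.device).view(T, 1, 1)
            + self.max_delay
            - self.delays.view(1, 1, C)
        ).expand(T, B, C)
        delayed = padded.gather(0, idx)
        return self.linear(delayed)
```

Since each delayed read is just an index shift into a buffer of past activity, delays add temporal expressivity at essentially no extra multiply-accumulate cost, which is one intuition for why they help the efficiency trade-off.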
Key Takeaways
- SiLIF models achieve new SOTA for spiking neural networks on speech recognition tasks.
- The architecture matches or exceeds State Space Model (SSM) performance, in some cases at half the computational cost.
- Introduces two neuron models: one with a learnable, log-reparametrized timestep, and another with complex-state SSM structure for oscillatory dynamics.
Why It Matters
Bridges the performance gap between energy-efficient spiking neural networks and mainstream AI models, enabling greener, brain-inspired computing.