State Space Models Naturally Produce Time Cell and Oscillatory Behaviors and Scale to Abstract Cognitive Functions
New research reveals AI models may hold the key to understanding human cognition.
A new paper proposes that State Space Models (SSMs), a class of deep learning architectures, naturally exhibit neural activity patterns such as 'time cells' and oscillations before any training. These features emerge from optimal history compression and rotational dynamics, mirroring biological circuits; learning then fine-tunes these pre-configured modes. The framework scales to abstract cognitive functions such as event counting, offering a computationally tractable bridge between neuroscience and AI.
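To make the time-cell claim concrete, here is a minimal sketch, not the paper's actual model: an untrained linear state space model dx/dt = Ax + Bu whose A matrix chains leaky integrators (all parameter values below are illustrative assumptions). After a single brief input pulse, each unit's activity peaks at a progressively later delay, the qualitative signature of a time-cell sequence, with no learning involved.

```python
import numpy as np

# Illustrative, untrained linear SSM: a cascade of N leaky integrators.
# Each unit decays with time constant tau and is driven by the previous
# unit, so unit k's impulse response ~ t^k * exp(-t/tau), peaking later
# for deeper units -- a "time cell"-like sequence before any training.
N = 8          # number of state units (assumed value)
tau = 0.1      # shared time constant in seconds (assumed value)
dt = 0.001     # Euler integration step
T = int(3.0 / dt)

A = (-np.eye(N) + np.eye(N, k=-1)) / tau   # decay + feed-forward chain
B = np.zeros(N)
B[0] = 1.0 / tau                           # input drives the first unit

x = np.zeros(N)
trace = np.zeros((T, N))
for t in range(T):
    u = 1.0 if t == 0 else 0.0             # impulse at t = 0
    x = x + dt * (A @ x + B * u)           # forward Euler step
    trace[t] = x

peaks = trace.argmax(axis=0) * dt          # time of peak activity per unit
print(np.round(peaks, 2))                  # later units peak later
```

The strictly ordered peak times are the point: sequential tiling of a delay interval emerges from the dynamics alone, which is the kind of pre-configured mode the paper argues learning then fine-tunes.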
Why It Matters
This provides a powerful new framework for understanding how biological brains produce complex, abstract thought from simple dynamics.