Research & Papers

Quantifying information stored in synaptic connections rather than in firing activities of neural networks

Synaptic weights, not just firing patterns, can now be assigned a formal measure of the memory they store...

Deep Dive

A new theoretical framework from researchers Xinhao Fan and Shreesh P. Mysore, published in Neural Computation, tackles a longstanding gap in neuroscience: quantifying the information stored in synaptic connections rather than in neural firing activity. While decades of work have focused on how neurons encode information through spike patterns, the memory actually held in synaptic weights (the strengths of the connections between neurons) has lacked a formal information-theoretic measure. The authors address this with densely connected Hebbian networks performing autoassociative memory tasks, modeling the stored data patterns as log-normally distributed.
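
To make the setup concrete, here is a minimal sketch of the kind of model the paper studies: a dense network whose synaptic weights are formed by a Hebbian (outer-product) rule from log-normally distributed patterns. The learning rule, parameters, and single-step recall below are illustrative assumptions, not the authors' exact formulation.

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n_neurons, n_patterns = 100, 5

    # Stored patterns with log-normally distributed activities, matching
    # the paper's data model (these particular parameters are assumptions).
    patterns = rng.lognormal(mean=0.0, sigma=1.0, size=(n_patterns, n_neurons))

    # Covariance-style Hebbian learning: a synapse strengthens when its
    # pre- and post-synaptic activities co-deviate from their means.
    centered = patterns - patterns.mean(axis=0)
    weights = centered.T @ centered / n_patterns
    np.fill_diagonal(weights, 0.0)  # no self-connections

    # Autoassociative recall: a noisy cue is mapped back toward its stored
    # pattern (one linear step here; full models iterate recurrent dynamics).
    cue = centered[0] + rng.normal(scale=0.1, size=n_neurons)
    recalled = weights @ cue
    # Correlation with the stored pattern is typically high when patterns
    # are few relative to the number of neurons.
    print(np.corrcoef(recalled, centered[0])[0, 1])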

Their framework derives analytical approximations for the Shannon mutual information between the stored data and singletons, pairs, and arbitrary n-tuples of synaptic connections. A key finding is synergy among synapses: the information encoded jointly by all synapses exceeds the sum of their individual contributions. This supports distributed-coding principles and gives a formal footing to the heterogeneity of information carried by individual synapses. The work has implications for both biological and artificial neural networks, offering a new lens on memory storage and learning dynamics.
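
The synergy claim, that synapses read together can carry information none carries alone, can be illustrated with a deliberately extreme toy example. The sketch below is an assumption-laden caricature (binary "data" and "weights" rather than the paper's continuous log-normal quantities), but it shows the joint mutual information exceeding the sum of the singleton terms.

    import numpy as np
    from collections import Counter

    def mutual_information(x, y):
        """Empirical Shannon mutual information, in bits, for discrete samples."""
        n = len(x)
        p_xy = Counter(zip(x, y))
        p_x, p_y = Counter(x), Counter(y)
        return sum(
            (c / n) * np.log2(c * n / (p_x[a] * p_y[b]))
            for (a, b), c in p_xy.items()
        )

    rng = np.random.default_rng(seed=0)
    n = 100_000
    data = rng.integers(0, 2, n)           # a stored "data" bit
    w1 = rng.integers(0, 2, n)             # synapse 1: random on its own
    w2 = data ^ w1                         # synapse 2: XOR of data and synapse 1

    pair = list(zip(w1, w2))               # the two synapses read jointly
    print(mutual_information(data, w1))    # ~0 bits
    print(mutual_information(data, w2))    # ~0 bits
    print(mutual_information(data, pair))  # ~1 bit: joint exceeds the sum

Here each synapse alone is statistically independent of the data, yet the pair determines it exactly; the paper's analytical results quantify milder versions of this effect across n-tuples of real-valued synapses.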

Key Points
  • Framework quantifies information stored in synaptic connections using Shannon mutual information
  • Synergistic interactions found where joint synaptic information exceeds sum of individual parts
  • Applies to both biological and artificial neural networks for understanding memory storage

Why It Matters

Bridges a critical gap in neuroscience, offering a new tool to understand memory storage in both biological and AI systems.