Research & Papers

A multi-scale information geometry reveals the structure of mutual information in neural populations

Researchers derive a Riemannian metric directly linked to mutual information in neural codes.

Deep Dive

A team of computational neuroscientists—Simone Azeglio, Steeve Laquitaine, Ulisse Ferrari, and Matthew Chalk—has published a paper on arXiv proposing a new mathematical framework to decode how neural populations represent sensory information. Their work, titled "A multi-scale information geometry reveals the structure of mutual information in neural populations," addresses a long-standing problem: existing methods for constructing representational geometries from neural activity yield conflicting conclusions about the neural code. The authors show that a unique Riemannian geometry emerges from first principles when one examines how distances in stimulus space contract as stimulus resolution is lost through coarse-graining. This leads to a multi-scale extension of the Fisher information metric, which captures encoding structure from fine details to coarse global distinctions.
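
As background for the metric being generalized here, the classical (single-scale) Fisher information quantifies how sharply a neural population's responses discriminate nearby stimuli. The sketch below is illustrative only and is not the paper's multi-scale construction: it assumes a population of independent Poisson neurons with Gaussian tuning curves over a one-dimensional stimulus, with all names and parameters invented for the example.

```python
import numpy as np

def tuning_curves(s, centers, width=0.5, peak_rate=20.0):
    """Mean firing rates f_i(s) for Gaussian-tuned neurons (illustrative model)."""
    return peak_rate * np.exp(-((s - centers) ** 2) / (2 * width ** 2))

def fisher_information(s, centers, width=0.5, peak_rate=20.0, ds=1e-4):
    """J(s) = sum_i f_i'(s)^2 / f_i(s) for independent Poisson neurons,
    with f_i'(s) approximated by a central finite difference."""
    f = tuning_curves(s, centers, width, peak_rate)
    df = (tuning_curves(s + ds, centers, width, peak_rate)
          - tuning_curves(s - ds, centers, width, peak_rate)) / (2 * ds)
    return np.sum(df ** 2 / f)

centers = np.linspace(-2, 2, 9)  # preferred stimuli of 9 hypothetical neurons
J = fisher_information(0.1, centers)
print(J)  # larger J means finer discriminability of stimuli near s = 0.1
```

In a multi-dimensional stimulus space this scalar becomes a matrix-valued metric tensor, and the paper's contribution is to extend that metric across coarse-graining scales rather than only at the finest resolution.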

The key innovation is that this geometry is exactly tied to the mutual information encoded by the neural population—directions that contribute more to mutual information are expanded, while poorly encoded directions are contracted. The metric tensor can be estimated efficiently using diffusion models, making the framework practical for large neural populations and high-dimensional stimuli. When applied to visual cortical responses to natural images, the eigenvectors of the metric tensor identify stimulus variations that most contribute to information transmission, yielding interpretable features that are robust to modeling assumptions. This work provides a principled, information-theoretic tool for understanding neural codes, with potential applications in neuroscience and AI interpretability.
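
The eigenvector analysis described above can be sketched with standard linear algebra. In this toy example the metric tensor `G` is a made-up symmetric positive-definite matrix standing in for the diffusion-model estimate; the point is only how eigendecomposition separates expanded (well-encoded) from contracted (poorly encoded) stimulus directions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
G = A @ A.T + 5 * np.eye(5)  # stand-in symmetric positive-definite metric tensor

# For a symmetric matrix, eigh returns real eigenvalues in ascending order
# together with an orthonormal set of eigenvectors (one per column).
eigvals, eigvecs = np.linalg.eigh(G)

# Directions with the largest eigenvalues are expanded by the metric,
# i.e. the stimulus variations contributing most to mutual information;
# small-eigenvalue directions are contracted (poorly encoded).
top_direction = eigvecs[:, -1]
print(eigvals[-1], top_direction)
```

In the paper's setting the eigenvectors live in stimulus space (e.g. natural-image perturbations), so the top eigenvectors can be visualized directly as the image variations the population encodes best.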

Key Points
  • Derives a unique Riemannian metric from first principles via multi-scale coarse-graining of stimulus resolution.
  • Metric is exactly linked to mutual information: expands well-encoded directions, contracts poorly encoded ones.
  • Diffusion models enable practical estimation for large neural populations, demonstrated on visual cortex responses to natural images.

Why It Matters

Unlocks a principled way to interpret neural codes in large populations, bridging neuroscience and information theory.