Research & Papers

Online Generalised Predictive Coding

Brain-inspired algorithm handles chaotic dynamics even with mismatched models.

Deep Dive

A research team led by Mehran H. Z. Bazargani has introduced Online Dynamic Expectation Maximization (ODEM), an extension of generalised filtering that enables real-time data assimilation. Generalised filtering, also known as variational Kalman-Bucy filtering or predictive coding, jointly infers hidden states, learns model parameters, and estimates their uncertainty. ODEM specialises this framework for online use by separating temporal scales: fast Bayesian belief updates for the dynamic hidden states, and slow updates for model parameters and precisions. In numerical experiments with a nonlinear chaotic generative model, ODEM tracked the latent states even when the internal model's dynamics differed fundamentally from those of the true process.
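To make the two-timescale idea concrete, here is a minimal sketch of the general pattern: a fast loop that descends the precision-weighted prediction error to update the state belief at every observation, and a slow loop that occasionally adjusts a model parameter and a noise precision. This is not the authors' ODEM algorithm; the chaotic map, the deliberately mismatched linear internal model, and all step sizes and update rules below are hypothetical stand-ins chosen only to illustrate the separation of timescales.

```python
# Hypothetical two-timescale sketch, NOT the ODEM algorithm itself.
import random

random.seed(0)

theta = 0.5       # slow variable: parameter of a (mismatched) linear internal model
pi_y = 1.0        # slow variable: estimated observation precision
x_hat = 0.1       # fast variable: current hidden-state belief

x_true = 0.1
residuals = []
prev_x_hat = x_hat

for t in range(2000):
    # True process: a chaotic quadratic map the internal model does not know about
    x_true = 1.0 - 1.8 * x_true * x_true
    y = x_true + random.gauss(0.0, 0.1)  # noisy observation

    # Fast timescale: predict with the internal model, then take a few
    # gradient steps on the precision-weighted observation prediction error
    x_hat = theta * x_hat
    for _ in range(5):
        x_hat += 0.3 * pi_y * (y - x_hat) / (pi_y + 1.0)
    residuals.append(y - x_hat)

    # Slow timescale: an online least-squares nudge of the linear parameter...
    theta += 0.01 * (x_hat - theta * prev_x_hat) * prev_x_hat
    prev_x_hat = x_hat
    # ...and an occasional re-estimate of the observation precision
    if (t + 1) % 50 == 0:
        mse = sum(e * e for e in residuals[-50:]) / 50
        pi_y += 0.1 * (1.0 / max(mse, 1e-6) - pi_y)

tracking_mse = sum(e * e for e in residuals[-200:]) / 200
```

Even though the internal model is linear and the true process is chaotic, the fast error-correcting updates keep the state belief near the observations, which mirrors the paper's qualitative finding that tracking can survive fundamental model mismatch; the slow loop only has to keep the parameter and precision roughly calibrated.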

Framed from a neuro-mimetic predictive coding perspective, ODEM offers a biologically inspired solution for continuous inference, learning, and uncertainty estimation in dynamic environments. This capability is critical for applications like autonomous systems, robotic control, and brain-computer interfaces, where models must adapt quickly to changing conditions without full batch retraining. The paper (arXiv:2605.02675) spans 45 pages with 17 figures, detailing the variational principles and procedures underlying the algorithm.

Key Points
  • ODEM achieves triple estimation: latent state inference, parameter learning, and uncertainty quantification in a single online framework.
  • It separates fast belief updating (hidden states) from slow parameter/precision updates, enabling real-time adaptation.
  • Numerical tests on a chaotic nonlinear generative model showed ODEM tracked states even when the assumed model dynamics were fundamentally different.

Why It Matters

ODEM enables real-time adaptive AI systems that learn and infer continuously, in a brain-like fashion, making it well suited to robotics and autonomous agents.