Research & Papers

Amortized Filtering and Smoothing with Conditional Normalizing Flows

New framework encodes observation history into fixed-size summaries, enabling accurate filtering and smoothing for complex systems.

Deep Dive

A team of researchers including Tiangang Cui and Xiaodong Feng has introduced AFSF (Amortized Filtering and Smoothing with Conditional Normalizing Flows), a novel framework addressing the fundamental challenge of Bayesian filtering and smoothing in high-dimensional nonlinear dynamical systems. The core innovation lies in using a recurrent encoder to map observation histories into fixed-dimensional summary statistics, regardless of time series length. Conditioned on these shared representations, the system learns both a forward flow for filtering distributions and a backward flow for transition kernels, creating a unified approach to state estimation.
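The two ingredients described above can be illustrated with a minimal NumPy sketch. This is not the authors' code: the function names, dimensions, and randomly initialized weights are all illustrative, and a real AFSF encoder and flow would be trained neural networks with many layers. The sketch only shows the structural idea: a recurrent encoder compresses an observation history of any length into a fixed-size summary, and a conditional affine flow layer maps a base sample to a state sample using shift and log-scale parameters computed from that summary.

```python
import numpy as np

rng = np.random.default_rng(0)
OBS_DIM, H_DIM, STATE_DIM = 3, 8, 4  # illustrative sizes, not from the paper

# Fixed (randomly initialized) matrices standing in for trained weights.
W_h = rng.standard_normal((H_DIM, H_DIM)) * 0.1
W_x = rng.standard_normal((H_DIM, OBS_DIM)) * 0.1
W_s = rng.standard_normal((STATE_DIM, H_DIM)) * 0.01  # -> log-scale
W_t = rng.standard_normal((STATE_DIM, H_DIM)) * 0.1   # -> shift

def encode_history(observations):
    """Toy tanh-RNN encoder: compress a (T, OBS_DIM) observation history
    into a fixed H_DIM summary vector, independent of T."""
    h = np.zeros(H_DIM)
    for y_t in observations:
        h = np.tanh(W_h @ h + W_x @ y_t)
    return h

def conditional_affine_flow(z, summary):
    """One conditional affine flow layer: x = exp(s(h)) * z + t(h),
    with shift and log-scale linear in the summary. Returns the
    transformed sample and log|det| of the Jacobian (sum of log-scales),
    which is what makes the flow's density tractable."""
    log_scale = W_s @ summary
    shift = W_t @ summary
    x = np.exp(log_scale) * z + shift
    return x, np.sum(log_scale)

# Histories of different lengths produce summaries of the same size,
# so a single flow can be conditioned on either.
h_short = encode_history(rng.standard_normal((5, OBS_DIM)))
h_long = encode_history(rng.standard_normal((50, OBS_DIM)))
x, log_det = conditional_affine_flow(rng.standard_normal(STATE_DIM), h_long)
```

Because the affine map is invertible with a closed-form log-determinant, the same conditional construction supports both sampling and exact density evaluation, which is what lets one network serve the forward (filtering) flow and another the backward (transition-kernel) flow over a shared summary.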

By learning the underlying temporal evolution structure, AFSF supports extrapolation beyond training horizons and induces implicit regularization across latent state trajectories, improving smoothing accuracy. The framework also includes a flow-based particle filtering variant that enables effective sample size (ESS) diagnostics when explicit model factors are available. Numerical experiments reported in the 43-page paper demonstrate that AFSF provides accurate approximations of both filtering distributions and smoothing paths, making it particularly valuable for complex scientific and engineering applications where traditional methods struggle with dimensionality.
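The ESS diagnostic mentioned here is the standard particle-filter degeneracy measure, not anything AFSF-specific: for normalized importance weights it equals 1 / Σ wᵢ², ranging from N (uniform weights, healthy particle population) down to 1 (all weight on one particle). A minimal, numerically stable sketch:

```python
import numpy as np

def effective_sample_size(log_weights):
    """Standard ESS diagnostic: ESS = 1 / sum(w_i^2) over normalized
    importance weights. Works from unnormalized log-weights, shifting
    by the max first so the exponentials cannot overflow."""
    lw = np.asarray(log_weights, dtype=float)
    lw = lw - np.max(lw)          # stabilize: largest log-weight becomes 0
    w = np.exp(lw)
    w /= w.sum()                  # normalize to a probability vector
    return 1.0 / np.sum(w ** 2)
```

In a particle filter this is typically checked each step, with resampling triggered when the ESS drops below some fraction of the particle count (N/2 is a common heuristic); AFSF's flow-based variant makes the diagnostic available whenever the model's factors can be evaluated explicitly.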

The researchers' approach represents a significant advancement in sequential Bayesian inference, offering a scalable solution to problems in fields ranging from climate modeling to financial forecasting. By combining the efficiency of amortized inference with the flexibility of normalizing flows, AFSF opens new possibilities for real-time state estimation in systems with thousands of variables, potentially transforming how researchers and engineers approach uncertainty quantification in dynamic environments.

Key Points
  • Uses conditional normalizing flows to encode observation histories into fixed-dimensional statistics independent of time series length
  • Learns both forward filtering distributions and backward transition kernels through shared summary representations
  • Supports extrapolation beyond training horizons and includes flow-based particle filtering for ESS diagnostics

Why It Matters

Enables accurate state estimation for complex systems like climate models and financial markets where traditional methods fail at scale.