Research & Papers

Primitive Recursion without Composition: Dynamical Characterizations, from Neural Networks to Polynomial ODEs

Recurrent neural networks and polynomial ODEs are equivalent to primitive recursion without composition.

Deep Dive

In a new paper on arXiv (2604.24356), Olivier Bournez proves that three seemingly distinct dynamical models—recurrent neural networks with ReLU activation, polynomial ordinary differential equations (ODEs), and discrete polynomial maps—each provide equivalent characterizations of primitive recursion. The key insight is that these models compute not by composing subroutines, but by shaping the trajectory of a dynamical system through clocks, phase selectors, and error correction built into the dynamics.
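The "phase selection" idea can be seen in miniature with a toy polynomial ODE (an illustrative example, not the paper's construction): a cubic vector field with stable fixed points at 0 and 1 and an unstable one at 1/2 flows any state in (0, 1) to the nearer endpoint, performing a rounding-like selection purely through its dynamics. A minimal forward-Euler simulation, with vector field and step size chosen for the sketch:

```python
def round_flow(x0, dt=0.01, steps=5000):
    """Simulate dx/dt = x(1 - x)(x - 1/2) by forward Euler.

    The fixed points 0 and 1 are stable, 1/2 is unstable, so any
    trajectory starting in (0, 1) settles on the nearer of 0 and 1:
    the flow itself acts as a phase selector, with no explicit branch.
    """
    x = x0
    for _ in range(steps):
        x += dt * x * (1.0 - x) * (x - 0.5)
    return x
```

Starting below 1/2 the state drains to 0; starting above, it climbs to 1. No conditional appears anywhere: the "decision" is encoded in the geometry of the vector field.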

The proof works by first compiling any primitive recursive function into bounded iteration of a single threshold-affine normal form. This normal form is then interpreted as a ReLU computation and as a polynomial ODE. The equivalences reveal a structural asymmetry: no fixed polynomial map can uniformly round to the nearest integer or realize exact phase selection, operations that polynomial ODEs perform robustly via continuous-time flow. Each formalism supplies a capability the others lack: the ReLU gate provides exact branching, continuous time provides autonomous rounding and control, and the step-size parameter recovers both at the cost of discretization precision. This opens the door to dynamical characterizations of subrecursive hierarchies and complexity classes, obtained by restricting time bounds, polynomial degrees, or discretization resources within a single framework.
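The compilation step rests on a classical observation that is easy to state in code: the primitive recursion scheme f(0, x) = g(x), f(n+1, x) = h(n, f(n, x), x) is exactly bounded iteration of one fixed step map on a state (i, acc). A hedged sketch (the function names are illustrative, not the paper's normal form):

```python
def prim_rec(g, h, x, n):
    """Primitive recursion realized as bounded iteration of a single
    step map (i, acc) -> (i + 1, h(i, acc, x)), applied exactly n times."""
    i, acc = 0, g(x)
    for _ in range(n):
        i, acc = i + 1, h(i, acc, x)
    return acc

# Example: multiplication defined from addition by primitive recursion,
# mul(x, 0) = 0 and mul(x, n + 1) = mul(x, n) + x.
mul = lambda x, n: prim_rec(lambda x: 0, lambda i, acc, x: acc + x, x, n)
```

The point of the normal form is that one such step map, made threshold-affine, suffices for every primitive recursive function once the loop bound is supplied as data.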

Key Points
  • Recurrent ReLU networks, polynomial ODEs, and discrete polynomial maps each characterize primitive recursion without composition.
  • The proof compiles any primitive recursive function into bounded iteration of a single threshold-affine normal form.
  • Each model compensates for the others' limitations: ReLU gives exact branching, continuous time provides autonomous rounding, and step-size parameters recover both at a discretization cost.
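Why ReLU gives branching exactly, rather than only in the limit, is visible in a standard piecewise-linear construction (a generic illustration, not a quotation of the paper's gate): the difference of two ReLUs clamps its input, returning exactly 0 below one threshold and exactly 1 above another, with no asymptotic error of the kind a fixed polynomial map would incur.

```python
def relu(x):
    return max(0.0, x)

def hard_gate(x, k=1000.0):
    # Clamp built from two ReLUs: returns exactly 0.0 for x <= 0 and
    # exactly 1.0 for x >= 1/k; it is linear only in the narrow
    # transition window (0, 1/k). The slope k is an arbitrary choice.
    return relu(k * x) - relu(k * x - 1.0)
```

A smooth polynomial can only approximate such a step; the ReLU network realizes it exactly, which is the branching capability the other two formalisms must recover by other means.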

Why It Matters

This work bridges neural networks, dynamical systems, and recursion theory, offering a unified framework to analyze computation in continuous models.