Symmetry-Protected Lyapunov Neutral Modes in Equivariant Recurrent Networks
A new theorem proves that exact symmetry protects neutral Lyapunov directions in RNNs, enabling indefinite memory horizons.
A new paper from Hanson Hanxuan Mo provides a rigorous symmetry-based account of when recurrent networks can naturally preserve continuous variables over long horizons without explicit tuning. The key result: for any finite-dimensional autonomous C^1 vector field equivariant under a Lie group G, any compact invariant set carrying a uniformly nondegenerate group-orbit bundle with stabilizer type H must have at least dim(G/H) zero Lyapunov exponents tangent to the group orbit. These symmetry-protected neutral modes come from exact equivariance and orbit geometry, not from hand-tuned parameters.
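In compressed notation, the result reads roughly as follows. This is a paraphrase of the claim above, not the paper's exact statement, and it assumes a linear group action for simplicity; the λᵢ are the Lyapunov exponents of the flow restricted to the invariant set Λ:

```latex
\begin{gather*}
f(g \cdot x) = g \cdot f(x) \quad \forall\, g \in G
  \qquad \text{(exact equivariance of the vector field)} \\[4pt]
\Lambda \text{ compact and invariant, stabilizer type } H, \quad
\{\, T_x(G \cdot x) \,\}_{x \in \Lambda} \text{ uniformly nondegenerate} \\[4pt]
\Longrightarrow \quad
\#\{\, i : \lambda_i = 0 \,\} \;\ge\; \dim(G/H),
\quad \text{with the neutral modes tangent to the orbit } G \cdot x .
\end{gather*}
```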
The practical validation is striking: the author trains an exactly equivariant recurrent cell on an S^1 path integration task (predicting position from velocity inputs) across six seeds. The learned cell preserves step equivariance to within 3.2×10⁻⁸, has a near-zero group-tangent exponent under the zero-input autonomous restriction, and significantly improves prediction horizon, speed, and restricted-phase generalization over GRU, LSTM, and orthogonal-RNN baselines. When symmetry is explicitly broken, the formerly protected direction acquires a pseudo-gap whose size predicts the finite memory lifetime. This provides a theoretical foundation for building recurrent architectures that can store phase, position, or other continuous variables indefinitely—critical for robotics, navigation, and sensorimotor integration.
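To make "exactly equivariant" concrete for the S^1 case, here is a minimal Python sketch. It is a hypothetical toy cell, not the paper's trained architecture: the hidden state sits on the unit circle, each input rotates it, so the update commutes with any global rotation, and the zero-input Jacobian is a rotation whose singular values are exactly 1 — hence a zero Lyapunov exponent along the orbit.

```python
import numpy as np

def rotation(theta):
    """2x2 rotation matrix: the action of S^1 on the plane."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

class EquivariantCell:
    """Toy S^1-equivariant cell (hypothetical, not the paper's model).
    Each step rotates the state h by an angle depending only on the
    input v, so step(g @ h, v) == g @ step(h, v) for every rotation g:
    exact step equivariance."""

    def __init__(self, gain=0.1):
        self.gain = gain  # stand-in for the learned velocity-to-angle map

    def step(self, h, v):
        return rotation(self.gain * v) @ h

cell = EquivariantCell()
h = np.array([1.0, 0.0])                        # phase 0 on S^1
for v in np.random.default_rng(0).normal(size=1000):
    h = cell.step(h, v)                         # path-integrate velocities

# Zero-input autonomous restriction: one step is a fixed rotation, whose
# Jacobian has singular values exactly 1, so the Lyapunov exponent along
# the circle's tangent direction is log(1) = 0.
J = rotation(cell.gain * 0.0)
print(np.log(np.linalg.svd(J, compute_uv=False)))  # -> [0. 0.]
```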
- Proven theorem: any recurrent network equivariant under a Lie group G is guaranteed at least dim(G/H) zero Lyapunov exponents on compact invariant sets with stabilizer type H, ensuring neutral storage directions.
- Trained exactly equivariant recurrent cell on S^1 path integration achieves step equivariance error of 3.2×10⁻⁸ and outperforms GRU, LSTM, and orthogonal-RNN baselines on horizon and generalization.
- Breaking symmetry introduces a pseudo-gap in the formerly protected direction, with the gap size predicting the finite memory lifetime—enabling controlled forgetfulness (see the toy sketch below).
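A toy illustration of that last point, under the assumption that the pseudo-gap acts like a weak pinning term of size eps on the autonomous phase dynamics (an illustrative model, not the paper's experiment): the neutral exponent shifts from 0 to roughly −eps, so the time to forget a stored phase scales like 1/eps.

```python
import numpy as np

# With exact symmetry the autonomous phase map phi -> phi + c has
# derivative 1 everywhere: Lyapunov exponent 0, a stored phase never
# decays. A symmetry-breaking pinning term of size eps,
# phi -> phi - eps*sin(phi), shifts the exponent near the sink to
# log(1 - eps) ~ -eps, so memory lifetime scales like 1/eps.
for eps in [1e-1, 1e-2, 1e-3]:
    phi, t = 0.5, 0                  # stored phase, displaced from the sink at 0
    while abs(phi) > 0.5 / np.e:     # steps until a factor-of-e decay
        phi -= eps * np.sin(phi)
        t += 1
    print(f"eps={eps:.0e}  exponent~{np.log(1 - eps):+.1e}  lifetime~{t} steps")
```

The printed lifetimes grow roughly tenfold as eps shrinks tenfold, which is the sense in which the gap size sets a tunable forgetting timescale.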
Why It Matters
Enables principled design of recurrent networks with provable long-term memory for continuous variables, critical for robotics and navigation.