Research & Papers

NORACL: Neurogenesis for Oracle-free Resource-Adaptive Continual Learning

New algorithm grows neural networks only when needed, matching oracle-sized static models while using fewer parameters.

Deep Dive

Researchers propose NORACL, a continual learning algorithm that grows neural network capacity on-demand, inspired by neurogenesis. It monitors two complementary signals—representational and plasticity saturation—to decide when to add neurons. Across varying task counts and geometries, NORACL matches or beats oracle-sized static baselines using fewer parameters. It also produces interpretable growth patterns: dissimilar tasks predominantly expand feature-extraction layers, while tasks relying on common features shift growth toward later feature-combination layers.
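The summary doesn't specify how the two saturation signals are computed, so the following is a minimal, hypothetical sketch of how such a dual-signal growth trigger could be wired up. The function names, proxy measures (activation redundancy for representational saturation, gradient-norm decay for plasticity saturation), and thresholds are illustrative assumptions, not NORACL's actual implementation.

```python
import numpy as np

def representational_saturation(acts, redundancy_cutoff=0.95):
    """Fraction of units whose activations nearly duplicate another unit's
    in the same layer (assumed proxy for exhausted representational capacity).
    `acts` is an (n_samples, n_units) activation matrix."""
    corr = np.corrcoef(acts.T)                # unit-by-unit correlation matrix
    np.fill_diagonal(corr, 0.0)               # ignore self-correlation
    return float((np.abs(corr).max(axis=1) > redundancy_cutoff).mean())

def plasticity_saturation(grad_norms, window=100):
    """Relative drop in recent gradient magnitude versus early training
    (assumed proxy for how little the existing weights can still adapt)."""
    recent = np.mean(grad_norms[-window:])
    early = np.mean(grad_norms[:window])
    return float(1.0 - recent / max(early, 1e-8))

def should_grow_layer(acts, grad_norms, rep_thresh=0.5, plast_thresh=0.8):
    """Trigger neuron growth when either saturation signal fires."""
    return (representational_saturation(acts) > rep_thresh or
            plasticity_saturation(grad_norms) > plast_thresh)
```

In a setup like this, the training loop would periodically call should_grow_layer for each layer and add neurons only where it returns True, which is what keeps capacity from being over-provisioned up front.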

Key Points
  • NORACL grows neurons only when representational or plasticity saturation signals trigger, avoiding over-provisioning
  • Matches or beats oracle-sized static models across varying task counts while using fewer parameters
  • Interpretable growth: dissimilar tasks expand feature-extraction layers; tasks sharing common features expand later feature-combination layers

Why It Matters

Enables adaptive models that scale their capacity on demand for unknown future tasks, reducing wasted compute and improving lifelong learning.