MPCS: Neuroplastic Continual Learning via Multi-Component Plasticity and Topology-Aware EWC
Fourier encoding is critical—removing it drops performance by 30.7 percentage points.
Continual learning systems struggle to learn new tasks without forgetting old ones. A new paper introduces MPCS (Multi-Plasticity Continual System), an architecture that integrates 11 complementary mechanisms to balance plasticity (learning new knowledge) and stability (retaining old knowledge). These include task-driven neurogenesis, Fourier-encoded inputs, elastic weight consolidation (EWC) regularization, meta-replay, mixed consolidation, hybrid gating, synapse pruning/regeneration, Hebbian updates, task-similarity routing, adaptive growth control, and continuous neuron-importance tracking.
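The paper does not spell out its input encoding here, but a common way to Fourier-encode network inputs is with fixed random Fourier features. The sketch below shows the idea in PyTorch; the module name, frequency count, and scale are illustrative choices, not values from the paper.

```python
import torch

class FourierEncoding(torch.nn.Module):
    """Random Fourier feature encoding of raw inputs (a minimal sketch).

    Maps x in R^d to [sin(2*pi*Bx), cos(2*pi*Bx)] in R^{2m}, where B is a
    fixed random projection. Hypothetical hyperparameters, not the paper's.
    """

    def __init__(self, in_dim: int, num_freqs: int = 32, scale: float = 1.0):
        super().__init__()
        # Fixed (non-trainable) frequency matrix, drawn once at init.
        self.register_buffer("B", torch.randn(num_freqs, in_dim) * scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        proj = 2 * torch.pi * x @ self.B.T               # (batch, num_freqs)
        return torch.cat([proj.sin(), proj.cos()], dim=-1)  # (batch, 2*num_freqs)

# Usage: encode a batch of 2-D inputs before feeding the rest of the network.
enc = FourierEncoding(in_dim=2, num_freqs=32)
features = enc(torch.rand(16, 2))  # -> shape (16, 64)
```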
Evaluated on MEP-BENCH, a 31-task benchmark spanning regression, classification, logic, and mixed domains, MPCS achieved a Normalized Efficiency Score (NES) of 94.2, placing it on the Pareto frontier. Ablation studies revealed that Fourier encoding is the single most critical component: removing it drops performance by 30.7 percentage points and fails the MEP gate on 14% of tasks. Surprisingly, global EWC regularization degrades performance (ΔNES = -4.2), while topology-local EWC reduces the penalty but does not eliminate it. Removing both EWC and Hebbian updates yields MPCS_EFFICIENT, which improves performance by 0.6 pp at 4.7x lower compute cost (127 vs. 602 minutes), validating the Pareto frontier as a practical guide to model compression.
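The paper's EWC variants are not reproduced here, but the standard EWC penalty (which the global variant presumably corresponds to) is well known: a quadratic penalty weighted by the diagonal Fisher information. The sketch below adds an optional per-parameter mask as a hypothetical stand-in for the topology-local variant, whose exact locality rule is not given in the summary; `fisher`, `theta_star`, `mask`, and `lam` are all illustrative names and values.

```python
import torch

def ewc_penalty(model, fisher, theta_star, lam=0.4, mask=None):
    """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    `fisher` and `theta_star` map parameter names to the diagonal Fisher
    estimate and the consolidated weights from previous tasks. `mask`
    optionally zeroes the penalty outside a chosen parameter subset, a
    crude stand-in for topology-local EWC (an assumption, not the paper's
    actual rule). `lam` is an illustrative regularization strength.
    """
    loss = 0.0
    for name, p in model.named_parameters():
        if name not in fisher:
            continue
        diff2 = fisher[name] * (p - theta_star[name]) ** 2
        if mask is not None and name in mask:
            diff2 = diff2 * mask[name]  # restrict to a "local" neighborhood
        loss = loss + diff2.sum()
    # Add to the task loss: total = task_loss + ewc_penalty(...)
    return 0.5 * lam * loss
```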
- MPCS integrates 11 mechanisms including Fourier encoding and meta-replay to balance plasticity and stability.
- Removing Fourier encoding drops performance by 30.7 pp, making it the most critical component.
- MPCS_EFFICIENT (no EWC/Hebbian) improves performance by 0.6 pp at 4.7x lower compute cost (127 vs. 602 min).
Why It Matters
MPCS shows that continual learning systems can be ablated down to their highest-value components, enabling AI that learns continuously without forgetting at a fraction of the compute cost.