Direct From Darwin: Deriving Advanced Optimizers From Evolutionary First Principles
Derives SGD, Adam, and Newton's method from evolutionary first principles
A new paper by Daniel Grimmer, titled 'Direct From Darwin: Deriving Advanced Optimizers From Evolutionary First Principles,' proposes a unified framework for optimization algorithms grounded in evolutionary biology. Grimmer introduces Darwinian Lineage Simulations (DLS) to demonstrate that Fisher's deterministic view of evolution and Wright's stochastic perspective are formally equivalent when the population is properly partitioned. This unification requires careful bookkeeping in the form of structured noise, which Grimmer calls the DLS noise relation. Any algorithm that satisfies this relation can serve as a scientifically valid in silico simulation of Darwinian evolution.
Crucially, Grimmer proves that several battle-tested optimizers—including Stochastic Gradient Descent, Natural Gradient Descent, and Damped Newton's method—already comply with evolutionary dynamics once augmented with DLS noise. Even the Adam optimizer, widely used in deep learning, can be brought into full evolutionary compliance with a minor mathematical adjustment. This work bridges evolutionary computation and modern optimization, offering new theoretical insights and practical tools for researchers. The paper, submitted to Evolutionary Computation in May 2026, spans 38 pages and includes 5 figures.
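The summary does not spell out the DLS noise relation itself, but the general idea of augmenting an optimizer with structured noise can be sketched. Below is a minimal, hypothetical illustration: an SGD step with added Gaussian noise scaled by the step size (a form reminiscent of Langevin dynamics). The function name `noisy_sgd_step` and the isotropic-noise choice are assumptions for illustration, not Grimmer's actual construction.

```python
import numpy as np

def noisy_sgd_step(theta, grad, lr=0.01, noise_scale=0.1, rng=None):
    """One SGD step augmented with injected noise.

    The exact DLS noise relation is not given in this summary; isotropic
    Gaussian noise scaled by sqrt(lr) stands in as a placeholder.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = noise_scale * np.sqrt(lr) * rng.standard_normal(theta.shape)
    return theta - lr * grad + noise

# Toy problem: minimize f(theta) = ||theta||^2 / 2, whose gradient is theta.
theta = np.ones(3)
rng = np.random.default_rng(0)
for _ in range(500):
    theta = noisy_sgd_step(theta, grad=theta, lr=0.1, noise_scale=0.05, rng=rng)
```

Under such a scheme the iterate no longer converges to a point but fluctuates around the optimum, which is the qualitative behavior one would expect from a population-based evolutionary process.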
- Grimmer's Darwinian Lineage Simulations (DLS) prove Fisher's and Wright's evolutionary views are equivalent under proper population partitioning.
- Structured DLS noise turns SGD, Natural Gradient Descent, and Damped Newton's method into faithful evolution simulations.
- A minor mathematical adjustment brings the Adam optimizer into evolutionary compliance, connecting deep learning with evolutionary theory.
Why It Matters
Unifies optimization and evolutionary biology, potentially leading to more robust AI training algorithms inspired by natural selection.