Research & Papers

Minimum-Action Learning: Energy-Constrained Symbolic Model Selection for Physical Law Identification from Noisy Data

New AI framework recovers Kepler's law from chaotic data using 40% less energy than baseline methods.

Deep Dive

Researcher Martin G. Frasch has introduced Minimum-Action Learning (MAL), a framework designed to tackle a core challenge in scientific machine learning: discovering interpretable physical laws from messy, real-world data. The method selects symbolic force laws (e.g., F = G*m1*m2/r^p) from a predefined library by minimizing a custom "Triple-Action" functional. This functional balances three objectives: accurately reconstructing observed trajectories, enforcing sparsity in the model (to favor simple laws), and, crucially, ensuring the discovered law conserves energy, a fundamental principle in physics. The breakthrough enabling this work is a preprocessing technique called "wide-stencil acceleration-matching," which cuts noise variance by a factor of roughly 10,000. This transforms an essentially unsolvable problem with a signal-to-noise ratio (SNR) of ~0.02 into a tractable one with an SNR of ~1.6.
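The denoising idea can be illustrated with a minimal sketch. The assumption here (not taken from the paper) is that "wide-stencil acceleration-matching" means estimating acceleration by fitting a local quadratic over a wide window of samples rather than using a naive 3-point second difference; the window size, test signal, and noise level below are all invented for illustration.

```python
import numpy as np

def accel_narrow(x, dt):
    # Naive 3-point second difference: amplifies noise by ~1/dt^2
    return (x[2:] - 2 * x[1:-1] + x[:-2]) / dt**2

def accel_wide(x, dt, half_width=25):
    # Fit a local quadratic over 2*half_width + 1 samples and take its
    # second derivative; averaging over a wide window suppresses noise.
    k = half_width
    t = np.arange(-k, k + 1) * dt
    A = np.vstack([np.ones_like(t), t, t**2]).T   # basis: 1, t, t^2
    w = 2.0 * np.linalg.pinv(A)[2]                # stencil weights giving 2*a2
    return np.correlate(x, w, mode="valid")

rng = np.random.default_rng(0)
dt = 0.01
t = np.arange(0, 50, dt)
x_true = np.cos(t)                                # harmonic motion, true accel = -cos(t)
x_noisy = x_true + rng.normal(0.0, 0.01, t.size)

a_true = -np.cos(t)
err_narrow = np.std(accel_narrow(x_noisy, dt) - a_true[1:-1])
err_wide = np.std(accel_wide(x_noisy, dt) - a_true[25:-25])
print(f"narrow-stencil error: {err_narrow:.1f}")
print(f"wide-stencil error:   {err_wide:.3f}")
```

Even this toy version reduces the error of the acceleration estimate by orders of magnitude, which is the effect that makes very low SNR data learnable.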

In practical tests, MAL demonstrated strong performance. On the classic problem of identifying Kepler's law of gravity from noisy planetary-motion data, it correctly recovered the inverse-square law, finding an exponent of p = 3.01 ± 0.01 (with the force written in vector form, F ∝ r⃗/r^p, the value p = 3 corresponds to an inverse-square force magnitude). While the raw correct-basis selection rate was 40% for Kepler and 90% for a simpler Hooke's-law test, an energy-conservation-based diagnostic filter pinpointed the true physical law in 100% of cases at the end of the pipeline. The process was also efficient, consuming approximately 0.07 kWh, a 40% reduction in computational energy compared to baselines that minimize prediction error alone. The framework occupies a distinct niche relative to methods like SINDy or Hamiltonian Neural Networks by providing interpretable, energy-constrained symbolic model selection validated against full dynamical rollouts.
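The selection logic can be sketched in a few lines. This is not the paper's functional: the candidate library, the energy-term weight mu, and the Hooke-style test system are all illustrative assumptions, and the sparsity term drops out here because every candidate has the same complexity (a single term). The sketch scores each candidate law by trajectory fit plus an energy-conservation penalty and keeps the minimizer.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01
t = np.arange(0, 20, dt)
x = np.cos(2 * t)                          # true law: a = -4x (Hooke, k = 4)
v = -2 * np.sin(2 * t)
a_noisy = -4 * np.cos(2 * t) + rng.normal(0.0, 0.1, t.size)  # denoised accelerations

# Candidate symbolic laws a = c*f(x), each paired with its potential V(x; c)
library = {
    "linear (Hooke)": (lambda x: -x,         lambda x, c: 0.5 * c * x**2),
    "cubic":          (lambda x: -x**3,      lambda x, c: 0.25 * c * x**4),
    "sinusoidal":     (lambda x: -np.sin(x), lambda x, c: c * (1 - np.cos(x))),
}

mu = 10.0  # illustrative weight on the energy-conservation term

def score(f, V):
    phi = f(x)
    c = phi @ a_noisy / (phi @ phi)          # least-squares coefficient
    fit = np.mean((a_noisy - c * phi)**2)    # trajectory-fit term
    E = 0.5 * v**2 + V(x, c)                 # candidate's energy along the data
    return fit + mu * np.var(E), c           # penalize energy drift

scores = {name: score(f, V) for name, (f, V) in library.items()}
best = min(scores, key=lambda n: scores[n][0])
print(best, scores[best])
```

The energy term is what discriminates here: only the correct law keeps kinetic plus potential energy constant along the observed trajectory, so wrong candidates are penalized even when their raw fit is passable.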

Key Points
  • Uses a novel 'Triple-Action' functional combining trajectory fit, sparsity, and energy conservation to select symbolic models.
  • Critical preprocessing step reduces noise variance by 10,000x, making extremely noisy data (SNR ~0.02) learnable.
  • Achieved 100% pipeline-level identification of Kepler's law, using ~0.07 kWh (40% less energy than error-only baselines).
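The pipeline-level check can be sketched as a rollout test: integrate each fitted candidate law with a symplectic (leapfrog) integrator from the observed initial conditions and keep the candidate whose rollout stays consistent with the data. This is a toy stand-in for the paper's energy-conservation diagnostic, not its implementation; the candidate laws and the Hooke-style test trajectory are invented for illustration.

```python
import numpy as np

def leapfrog(accel, x0, v0, dt, n):
    # Symplectic (kick-drift-kick) rollout of a candidate 1D force law
    x, v = np.empty(n), np.empty(n)
    x[0], v[0] = x0, v0
    a = accel(x0)
    for i in range(1, n):
        v_half = v[i - 1] + 0.5 * dt * a
        x[i] = x[i - 1] + dt * v_half
        a = accel(x[i])
        v[i] = v_half + 0.5 * dt * a
    return x, v

dt, n = 0.01, 2000
t = np.arange(n) * dt
x_obs = np.cos(2 * t)                      # observed trajectory (noiseless for clarity)

candidates = {
    "a = -4x (Hooke)": lambda x: -4 * x,
    "a = -4x^3":       lambda x: -4 * x**3,
}

results = {}
for name, f in candidates.items():
    x_roll, _ = leapfrog(f, x_obs[0], 0.0, dt, n)
    results[name] = np.sqrt(np.mean((x_roll - x_obs)**2))
    print(f"{name}: rollout RMSE = {results[name]:.3f}")

best = min(results, key=results.get)
print("selected:", best)
```

Because the wrong law drifts out of phase over a full rollout even when it fits short segments, this kind of full-trajectory validation is a much stricter test than pointwise acceleration matching.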

Why It Matters

Enables AI to discover fundamental scientific equations from real-world, imperfect data, accelerating research in physics, engineering, and climate science.