Magic Is Hidden Control of Energy
A viral LessWrong essay argues that true intelligence is the precise, information-guided control of energy, not raw power.
Aviad Rozenhek's essay 'Magic Is Hidden Control of Energy,' published on LessWrong, has gone viral by offering a striking reframing of intelligence and technological advancement. The piece opens by dissecting Richard Feynman's famous complaint that using 'energy' as a blanket explanation is as meaningless as saying 'Wakalixes.' Rozenhek agrees but takes the critique further, arguing that it points to what a real explanation of energy must contain. He posits that energy is reality's 'mana'—the universal, spendable substrate behind all physical transformation and change. The critical insight is that intelligence isn't about possessing energy but about controlling it with precision through information, which is what distinguishes a directed agent from an undirected force like a wildfire.
Rozenhek builds on Eliezer Yudkowsky's concept of 'gears-level' understanding and Duncan Sabien's 'Gears in Understanding.' The essay contends that to move beyond 'mysterious answers,' we must model the specific mechanisms of energy control: where it's stored, how it's transmitted, and what informational structures guide its release. This framework connects deep physical principles to discussions of Artificial General Intelligence (AGI), suggesting that building or recognizing true AGI involves creating systems capable of this sophisticated, adaptive energy governance. The piece resonates because it provides a concrete, mechanistic lens for evaluating intelligence, whether biological or artificial, cutting through vague, spiritualized descriptions.
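The directed-versus-undirected distinction can be made concrete with a toy sketch. The following code is my own hypothetical illustration (not from the essay): an 'undirected' process dumps energy blindly like a wildfire, while a 'directed' one measures the gap to a goal state and releases only the energy that measurement calls for — information governing energy release.

```python
def undirected_release(energy: float, steps: int) -> list[float]:
    """Wildfire-style: release a fixed fraction of remaining energy each
    step, with no goal and no feedback from the world."""
    released = []
    for _ in range(steps):
        burst = energy * 0.5
        energy -= burst
        released.append(burst)
    return released


def directed_release(energy: float, target: float, steps: int) -> list[float]:
    """Thermostat-style: measure the error between the current state and a
    target, and spend only the energy the measurement (information) warrants."""
    state = 0.0
    released = []
    for _ in range(steps):
        error = target - state                       # information about the world
        burst = min(energy, max(error, 0.0) * 0.5)   # simple proportional control
        energy -= burst
        state += burst                               # spent energy changes the state
        released.append(burst)
    return released
```

Given an energy budget of 10.0, the undirected process spends nearly all of it, while the directed process spends only what is needed to reach a target of 4.0 and then stops — the same 'mana,' governed very differently by information.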
- Reframes intelligence as 'adaptive control of energy through information,' not raw computational power or data.
- Argues that 'energy' is a shallow, 'Wakalixes'-like explanation without detailing storage, transmission, and control mechanisms.
- Connects physics to AGI development, suggesting advanced technology's 'magic' is precise, information-guided energy manipulation.
Why It Matters
Provides a concrete, physics-based framework for evaluating AI progress and defining true intelligence, moving beyond vague benchmarks.