AI Safety

Compute Curse

A viral essay argues AI's obsession with compute scaling has created a 'Dutch disease' for software engineering.

Deep Dive

A viral essay titled 'Compute Curse' by Ihor Kendiukhov, published on the AI philosophy forum LessWrong, presents a provocative analogy: the exponential growth of computational power has created a 'resource curse' for the tech industry, mirroring the economic 'Dutch disease' seen in oil-rich nations. The core argument is that the reliable, fundable returns from simply scaling compute—evident in the race to train ever-larger models like GPT-4o and Claude 3—have systematically drawn capital, talent, and research focus away from alternative paths. Those neglected paths, such as more efficient algorithms, novel architectures, and careful software engineering, are harder to evaluate and slower to pay off, but could prove more consequential in the long run.

The essay traces this dynamic back decades, suggesting that the expectation of cheaper, faster hardware (Moore's Law) incentivized shipping 'bloated software' and letting future hardware compensate. This degraded the craft of engineering, creating a technological 'monoculture' dependent on compute abundance. The consequences are visible across the stack: web apps that require gigabytes of RAM to render text, Electron-based desktop apps that bundle entire browsers, and sprawling microservice architectures. The theory posits that phenomena like 'enshittification' and the AI industry's singular focus on scaling (epitomized by Rich Sutton's 'Bitter Lesson') are symptoms of this deeper structural 'Compute Curse,' in which the easiest path forward crowds out potentially better ones.

Key Points
  • The essay draws a direct analogy between AI's compute scaling and the economic 'resource curse,' where booming sectors like oil crowd out other industries.
  • It argues the reliable returns from scaling models (e.g., bigger training runs for GPT-4) have systematically starved research into alternative, potentially more efficient AI methods.
  • The dynamic extends beyond AI, degrading general software engineering for decades and leading to bloated applications and architectures dependent on abundant compute.

Why It Matters

The essay challenges the foundational assumption that more compute is always the right path, urging a rethink of research priorities and engineering values.