AI Safety

Distinguishing inference scaling from "larger tasks use more compute"

New analysis reveals the real reason AI is getting more expensive to run.

Deep Dive

A viral analysis distinguishes between two reasons AI inference costs are skyrocketing. First, models are tackling larger, more complex tasks that naturally require more compute, much as humans take longer on bigger jobs. Second, and more concerning, models may be spending disproportionately more compute on tasks of the same size, eroding efficiency. This distinction is critical because only the second scenario represents true "inference scaling," which could make advanced AI economically unsustainable if costs outpace utility gains.
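The distinction can be made concrete with a toy cost model. This is a minimal sketch with hypothetical numbers (the unit cost and per-generation compute multiplier are illustrative assumptions, not figures from the analysis):

```python
# Toy model contrasting the two cost regimes. All parameters here
# (cost_per_unit, multiplier) are hypothetical illustrations,
# not figures taken from the analysis.

def cost_larger_task(task_size: float, cost_per_unit: float = 1.0) -> float:
    """Scenario 1: a bigger task costs proportionally more compute.
    Cost per unit of work stays flat, so this is economically benign."""
    return cost_per_unit * task_size

def cost_inference_scaling(generation: int, base_cost: float = 1.0,
                           multiplier: float = 3.0) -> float:
    """Scenario 2: each model generation spends a fixed multiple more
    inference compute on the *same-sized* task (true inference scaling)."""
    return base_cost * multiplier ** generation

if __name__ == "__main__":
    # Scenario 1: doubling the task doubles the cost; cost per unit is constant.
    print(cost_larger_task(4) / 4, cost_larger_task(8) / 8)  # 1.0 1.0
    # Scenario 2: the same task costs 3x more each generation, so unless
    # delivered utility also triples per generation, costs outpace gains.
    print([cost_inference_scaling(g) for g in range(4)])  # [1.0, 3.0, 9.0, 27.0]
```

The point of the sketch: in scenario 1 the cost-per-unit-of-work ratio never changes, while in scenario 2 it compounds every generation even though the task itself has not grown.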

Why It Matters

If AI progress relies on unsustainable cost increases, the most powerful models could become too expensive for practical, widespread use.