Models & Releases

Codex limit issue

Users see 100% usage drop to 37% after just three messages, sparking outage concerns.

Deep Dive

A significant bug is affecting ChatGPT Plus subscribers, draining their usage limits far faster than their actual activity warrants. Users on platforms like Reddit report that a single question can drop their available limit from 100% to 80% in under a minute. One detailed case shows a user's limit falling to just 37% after sending three messages, despite being on a standard Plus plan with no context-heavy features enabled, such as MCP (Model Context Protocol) servers, custom skills, or extra plugins. This pattern points to a miscalculation in OpenAI's backend systems that incorrectly attributes heavy token usage to ordinary queries.

The issue has sparked concern in the user community about the reliability of a service they pay for, with many suspecting an unannounced outage or a flawed new rate-limiting algorithm. The bug appears to be systemic rather than isolated, as multiple independent reports describe the same rapid depletion. For professionals relying on ChatGPT for coding, writing, or research, this effectively renders the service unusable for extended tasks: their allocated usage evaporates in minutes instead of hours. OpenAI has not yet issued an official statement acknowledging the problem, leaving users to speculate and report the issue informally while their paid access remains severely hampered.

Key Points
  • A user's usage limit dropped from 100% to 37% after sending only three messages.
  • The bug occurs on standard ChatGPT Plus plans without memory-intensive plugins or MCPs enabled.
  • Multiple reports suggest a backend system error at OpenAI is incorrectly calculating token usage.

Why It Matters

This bug cripples a paid service, preventing professionals from completing work and undermining trust in subscription reliability.