PSA: litellm PyPI package was compromised — if you use DSPy, Cursor, or any LLM project, check your dependencies
A malicious version of the package, which sees 97 million downloads a month, exfiltrated SSH keys, cloud credentials, and API secrets.
A critical supply chain attack has struck the core of the AI development ecosystem. The LiteLLM Python package, a crucial tool with 97 million monthly downloads that standardizes API calls to models from OpenAI (GPT-4), Anthropic (Claude 3), and Cohere, was compromised on the PyPI repository. For approximately one hour, a malicious version (1.82.8) was available. Simply running `pip install litellm`—or installing any project that depends on it, like the popular DSPy framework or the Cursor AI coding assistant—triggered the payload. This malware was designed to stealthily exfiltrate a devastatingly broad range of sensitive data from the victim's machine.
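As a first step, you can check whether the known-bad release is present in your environment. A minimal Python sketch, assuming the malicious version string is exactly `1.82.8` as reported above (the `COMPROMISED` set and function names are illustrative, not an official tool):

```python
# Minimal sketch: flag the compromised litellm release if it is installed.
# "1.82.8" is the malicious version reported in this advisory; the set and
# helper names here are illustrative, not an official remediation tool.
from importlib.metadata import PackageNotFoundError, version

COMPROMISED = {"1.82.8"}  # known-bad litellm releases

def is_compromised(installed_version: str) -> bool:
    """True if a version string matches a known-bad release."""
    return installed_version in COMPROMISED

def check_local_litellm() -> bool:
    """True if the litellm installed in this environment is known-bad."""
    try:
        return is_compromised(version("litellm"))
    except PackageNotFoundError:
        return False  # litellm is not installed here
```

Because the payload ran at install time, a clean version today does not prove you were never exposed: if the bad release was installed during the window and later upgraded, the damage was already done, so check your pip logs for installs during that period as well.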
The stolen data includes the complete keys to a developer's digital kingdom: SSH keys; credentials for AWS, Google Cloud, and Azure; Kubernetes configuration files; Git credentials; shell history; and all environment variables, which typically contain API keys and other secrets. It also targeted crypto wallets, SSL private keys, and CI/CD secrets.

The attack was discovered by chance when a user's machine crashed. AI pioneer Andrej Karpathy described it as "the scariest thing imaginable in modern software," highlighting the catastrophic collapse of trust in open-source dependencies. While the malicious package has been removed from PyPI, the one-hour window was enough to cause significant damage. The security community urges all Python developers, especially those using AI/LLM tools, to audit any installations performed yesterday and to rotate all potentially exposed credentials immediately.
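If the bad release may have run on your machine, it helps to enumerate which of the targeted credential files actually exist there, since every one found should be treated as stolen and rotated. A sketch along those lines; the concrete paths are common defaults inferred from the categories listed above, not a list published by the attackers:

```python
# Sketch: list credential files from the reported target categories that
# exist under a home directory. Anything returned should be considered
# exposed and rotated. Paths are illustrative defaults, not exhaustive.
from pathlib import Path

TARGETED = [
    ".ssh/id_rsa", ".ssh/id_ed25519",          # SSH keys
    ".aws/credentials",                        # AWS
    ".config/gcloud/credentials.db",           # Google Cloud
    ".azure/accessTokens.json",                # Azure (legacy token store)
    ".kube/config",                            # Kubernetes
    ".git-credentials",                        # Git
    ".bash_history", ".zsh_history",           # shell history
]

def exposed_paths(home: Path) -> list[Path]:
    """Return the targeted credential files that exist under *home*."""
    return [home / rel for rel in TARGETED if (home / rel).exists()]
```

Run it against `Path.home()`; note it covers only files with fixed locations, so environment variables, crypto wallets, and CI/CD secrets still need to be reviewed by hand.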
- The LiteLLM package (97M monthly downloads) had a malicious version (1.82.8) on PyPI for one hour.
- The malware exfiltrated SSH keys, cloud credentials (AWS/GCP/Azure), environment variables, and crypto wallet data.
- Impacts users of dependent AI tools like DSPy and Cursor; immediate credential rotation is critical.
Why It Matters
This attack undermines trust in core open-source infrastructure and exposes API keys and cloud credentials for millions of AI developers.