[P] I built an LLM gateway in Rust because I was tired of API failures
Developer builds Rust-based gateway with sub-millisecond overhead and support for 9 LLM providers to prevent app downtime.
A developer frustrated with LLM API failures in production has open-sourced Sentinel, a Rust-built gateway designed to solve common deployment headaches. The tool directly addresses familiar pain points: applications breaking when OpenAI goes down, expensive models being used for simple tasks, no visibility into spending, and PII leakage. Sentinel provides automatic failover across 9 supported LLM providers, detailed cost tracking, PII redaction to keep sensitive data on-premises, and smart caching to cut costs. Built for performance with sub-millisecond overhead, it uses SQLite for logging and DashMap for caching, and presents an OpenAI-compatible API, requiring only a base URL change for integration. The project is now seeking community feedback on GitHub.
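The failover behavior described above can be sketched in a few lines of Rust: try each configured provider in order and return the first successful response. This is a minimal illustration, not Sentinel's actual code; the `Provider` type, `send` method, and the `healthy` flag (a stand-in for a real HTTP call and health check) are all hypothetical.

```rust
#[derive(Debug)]
struct Provider {
    name: &'static str,
    healthy: bool, // stand-in for a real health check / live HTTP request
}

impl Provider {
    // Simulated request: succeeds only if the provider is "up".
    fn send(&self, prompt: &str) -> Result<String, String> {
        if self.healthy {
            Ok(format!("{} answered: {}", self.name, prompt))
        } else {
            Err(format!("{} is down", self.name))
        }
    }
}

// Try providers in priority order; fall through to the next on error.
fn with_failover(providers: &[Provider], prompt: &str) -> Result<String, String> {
    let mut last_err = String::from("no providers configured");
    for p in providers {
        match p.send(prompt) {
            Ok(resp) => return Ok(resp),
            Err(e) => last_err = e, // remember the failure, keep going
        }
    }
    Err(last_err)
}

fn main() {
    let providers = [
        Provider { name: "openai", healthy: false },   // simulated outage
        Provider { name: "anthropic", healthy: true }, // healthy fallback
    ];
    let resp = with_failover(&providers, "hello").unwrap();
    println!("{resp}"); // anthropic answered: hello
}
```

A production gateway would add per-provider timeouts, retries with backoff, and circuit breakers, but the core routing decision is this same first-healthy-wins loop.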
- Automatic failover between 9 LLM providers (like OpenAI and Anthropic) prevents app downtime during API outages.
- Built in Rust for high performance, adding sub-millisecond overhead and featuring PII redaction to keep sensitive data internal.
- OpenAI-compatible API allows easy integration; just change the base URL to start using the gateway with existing code.
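The "just change the base URL" point above boils down to the gateway serving the same routes as api.openai.com, so a client only swaps the host. A minimal sketch of that integration seam, assuming a gateway listening on a hypothetical local address:

```rust
// Build the chat-completions endpoint from a configurable base URL.
// The route mirrors OpenAI's public API; the local gateway address
// below is an assumption for illustration, not Sentinel's default.
fn chat_completions_url(base_url: &str) -> String {
    format!("{}/v1/chat/completions", base_url.trim_end_matches('/'))
}

fn main() {
    // Pointing directly at OpenAI:
    let direct = chat_completions_url("https://api.openai.com");
    // Pointing at the gateway instead: same code path, new host.
    let via_gateway = chat_completions_url("http://localhost:8080");
    println!("{direct}\n{via_gateway}");
}
```

Because existing OpenAI SDKs expose the base URL as a configuration option, this swap typically requires no other code changes.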
Why It Matters
Provides production resilience and cost control for any business relying on external LLM APIs, preventing costly downtime.