CrewAI 1.12.1
The latest update enables direct integration with OpenRouter, DeepSeek, and Ollama, bypassing LiteLLM.
CrewAI Inc., the company behind the popular open-source framework for orchestrating role-playing AI agents, has launched version 1.12.1. The update focuses on developer flexibility and system robustness. Its headline feature is native, direct integration for OpenAI-compatible API providers, which lets developers connect their CrewAI agents to services such as OpenRouter, DeepSeek, Ollama, vLLM, Cerebras, and Alibaba's DashScope without relying on the LiteLLM abstraction layer, offering more control and potentially lower latency.
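What makes a direct integration feasible is that these providers all expose the same OpenAI-style chat-completions schema; only the base URL and model identifier change. A minimal stdlib sketch of that shared request shape (the endpoint URLs and model name below are illustrative assumptions, not values taken from CrewAI's code):

```python
import json

# OpenAI-compatible providers accept the same chat-completions payload;
# they differ mainly in base URL and model naming. (Illustrative values;
# check each provider's documentation for real endpoints and models.)
PROVIDERS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "deepseek": "https://api.deepseek.com/v1",
    "ollama": "http://localhost:11434/v1",
}

def build_chat_request(provider: str, model: str, prompt: str) -> tuple[str, str]:
    """Return (endpoint URL, JSON body) for an OpenAI-style chat call."""
    url = f"{PROVIDERS[provider]}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# The same payload shape works against any of the three endpoints.
url, body = build_chat_request("ollama", "llama3", "Hello")
```

Because the wire format is uniform, swapping providers reduces to swapping a base URL and API key, which is the kind of thin adapter a framework can ship natively instead of routing through LiteLLM.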
The release also introduces major upgrades to the framework's memory and tooling systems. A new Qdrant Edge storage backend provides a lightweight, efficient option for the agent memory system, crucial for persistent, long-running workflows. Furthermore, the update implements automatic `root_scope` for hierarchical memory isolation, ensuring agents operating in different branches of a task don't corrupt each other's context. Alongside these, the update adds a new `agent skills` implementation, Arabic language support for all documentation, and a comprehensive set of bug fixes targeting the Human-in-the-Loop (HITL) flow and type safety across the entire codebase.
- Adds native, direct integration for OpenAI-compatible providers (OpenRouter, DeepSeek, Ollama, vLLM), reducing dependency on LiteLLM.
- Introduces Qdrant Edge backend for agent memory and implements automatic hierarchical memory isolation for complex workflows.
- Includes major bug fixes for the HITL system, full Arabic documentation translation, and enhanced type safety across the package.
Why It Matters
This update significantly lowers the barrier for building sophisticated, multi-provider AI agent systems, making CrewAI more production-ready for enterprise workflows.