MCP support in llama.cpp is ready for testing
Your local models can now browse files, call APIs, and act as agents.
Deep Dive
Support for the Model Context Protocol (MCP) has been merged into llama.cpp after more than a month of development, letting locally run LLMs connect to external tools and data sources. Key features include a tool-calling agentic loop, a resources browser with a file-tree view, prompt attachments, and a CORS proxy. The integration, now available for testing, turns simple chat models into capable agents that can interact with your system and the web.
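The underlying pattern is simple: the model proposes a tool call, the client forwards it to an MCP server as a JSON-RPC "tools/call" request, and the result is appended to the conversation until the model answers in plain text. The TypeScript sketch below illustrates that loop against a local llama.cpp server with an OpenAI-compatible chat endpoint; the URLs, helper names, and simplified HTTP transport are assumptions for illustration, not llama.cpp's actual implementation.

```typescript
// Illustrative sketch only, not llama.cpp's web UI code.
// Assumes a local server at http://localhost:8080 exposing an
// OpenAI-compatible /v1/chat/completions endpoint, and an MCP server
// reachable over a simplified HTTP JSON-RPC transport.

type ChatMessage = { role: string; content: string; tool_call_id?: string };
type ToolCall = { id: string; name: string; args: Record<string, unknown> };

// Ask the local model for its next step; return either text or a tool call.
async function nextStep(
  messages: ChatMessage[],
  tools: unknown[]
): Promise<{ text: string; toolCall?: ToolCall }> {
  const res = await fetch("http://localhost:8080/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, tools }),
  });
  const msg = (await res.json()).choices[0].message;
  const tc = msg.tool_calls?.[0]; // OpenAI-style tool_calls field, if present
  return {
    text: msg.content ?? "",
    toolCall: tc
      ? { id: tc.id, name: tc.function.name, args: JSON.parse(tc.function.arguments) }
      : undefined,
  };
}

// Forward a tool call to an MCP server as a JSON-RPC 2.0 "tools/call" request.
// Real MCP transports (stdio or streamable HTTP) add session handling that is
// omitted here; in a browser UI this request would go through a CORS proxy.
async function callMcpTool(serverUrl: string, call: ToolCall): Promise<string> {
  const res = await fetch(serverUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      jsonrpc: "2.0",
      id: 1,
      method: "tools/call",
      params: { name: call.name, arguments: call.args },
    }),
  });
  return JSON.stringify((await res.json()).result);
}

// The agentic loop: keep going while the model keeps asking for tools.
async function runAgent(prompt: string, mcpUrl: string, tools: unknown[]) {
  const messages: ChatMessage[] = [{ role: "user", content: prompt }];
  for (let step = 0; step < 8; step++) {        // cap iterations defensively
    const { text, toolCall } = await nextStep(messages, tools);
    if (!toolCall) return text;                 // plain answer: we're done
    const result = await callMcpTool(mcpUrl, toolCall);
    // Real clients echo the assistant's tool_calls field back; simplified here.
    messages.push({ role: "assistant", content: text });
    messages.push({ role: "tool", content: result, tool_call_id: toolCall.id });
  }
  return "Stopped after too many tool-call iterations.";
}
```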
Why It Matters
This bridges the gap between powerful local models and the tool-using capabilities of cloud APIs, enabling sophisticated agent workflows to run entirely offline.