Developer Tools

Run custom MCP proxies serverless on Amazon Bedrock AgentCore Runtime

Deploy custom MCP proxies serverless with governance, controls, and observability...

Deep Dive

AWS has introduced a serverless MCP (Model Context Protocol) proxy on Amazon Bedrock AgentCore Runtime, allowing organizations to embed custom governance, security, and observability controls into AI agent-tool interactions. The proxy acts as an intermediary between MCP clients and upstream MCP servers, applying programmable logic such as input sanitization, data redaction, and audit trail generation at the protocol layer. It runs on AgentCore Runtime's fully managed serverless infrastructure, which offers automatic scaling, built-in observability through Amazon CloudWatch and OpenTelemetry, and authentication via AgentCore Identity.

The proxy can be connected to Amazon Bedrock AgentCore Gateway for centralized governance, including semantic tool discovery, managed credentials, and policy enforcement across MCP servers, Lambda functions, and SaaS integrations. For organizations whose custom MCP filtering logic is tied to internal libraries or on-premises compliance systems, this pattern allows that logic to be reused without refactoring it into Lambda functions.

The solution supports any MCP-compatible upstream server, whether hosted on AgentCore Runtime, self-hosted, or provided by a third party, and is deployable via an open-source GitHub implementation with automated scripts.
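
To make the protocol-layer idea concrete, here is a minimal sketch of what such an interceptor could look like: it sanitizes an MCP JSON-RPC `tools/call` request (redacting email addresses from tool arguments) and records an audit entry before the request is forwarded upstream. The function and variable names, the regex, and the audit-record shape are illustrative assumptions, not part of any AWS SDK or the published implementation.

```python
import copy
import json
import re

# Assumed redaction rule for the sketch: mask anything that looks like an email.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def intercept_tool_call(request: dict, audit_log: list) -> dict:
    """Redact email addresses in tool arguments and append an audit record.

    `request` is assumed to be an MCP JSON-RPC "tools/call" message; the
    original dict is left untouched and a sanitized copy is returned.
    """
    sanitized = copy.deepcopy(request)
    params = sanitized.get("params", {})
    args = params.get("arguments", {})
    for key, value in args.items():
        if isinstance(value, str):
            args[key] = EMAIL_RE.sub("[REDACTED]", value)
    original_args = request.get("params", {}).get("arguments", {})
    audit_log.append({
        "method": sanitized.get("method"),
        "tool": params.get("name"),
        "redacted": json.dumps(args, sort_keys=True)
                    != json.dumps(original_args, sort_keys=True),
    })
    return sanitized

# Usage: sanitize a request before proxying it to the upstream MCP server.
audit: list = []
req = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "send_report", "arguments": {"to": "alice@example.com"}},
}
clean = intercept_tool_call(req, audit)
```

Because the interceptor sits on the proxy, the same sanitization applies to every client and every upstream server without modifying either side.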

Key Points
  • Serverless MCP proxy on Amazon Bedrock AgentCore Runtime adds programmable controls for AI agent-tool interactions
  • Supports Lambda interceptors for validation, transformation, and filtering on every tool invocation
  • Integrates with AgentCore Gateway for centralized governance, credential management, and policy enforcement
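
A Lambda interceptor of the kind described above might look like the following sketch: a handler that validates and filters each tool invocation against an allow-list and a size limit before it reaches the upstream server. The event shape, `ALLOWED_TOOLS`, `MAX_ARG_LEN`, and the `allow`/`reason` response fields are all hypothetical, chosen only to illustrate per-invocation policy enforcement.

```python
# Illustrative policy knobs; real deployments would load these from config.
ALLOWED_TOOLS = {"search_docs", "get_weather"}
MAX_ARG_LEN = 1024

def handler(event: dict, context=None) -> dict:
    """Validate and filter one tool invocation; deny anything off-policy."""
    tool = event.get("tool_name", "")
    args = event.get("arguments", {})
    if tool not in ALLOWED_TOOLS:
        return {"allow": False, "reason": f"tool '{tool}' not in allow-list"}
    for key, value in args.items():
        if isinstance(value, str) and len(value) > MAX_ARG_LEN:
            return {"allow": False,
                    "reason": f"argument '{key}' exceeds {MAX_ARG_LEN} chars"}
    # Passed all checks: forward the (possibly transformed) event upstream.
    return {"allow": True, "event": event}
```

Running every invocation through a hook like this is what turns the proxy into a policy enforcement point rather than a simple passthrough.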

Why It Matters

Enables secure, governed AI agent deployments with reusable custom logic and serverless scalability.