Introducing granular cost attribution for Amazon Bedrock
Amazon Bedrock now automatically tracks AI inference costs down to individual IAM users, roles, and federated identities.
AWS has introduced granular cost attribution for Amazon Bedrock, its managed AI service, addressing a critical pain point for enterprises scaling generative AI. As inference becomes a significant share of cloud spend, organizations need visibility into which teams, projects, and individuals are driving costs. The new feature automatically tracks every Bedrock API call, whether it invokes Anthropic's Claude models or any other hosted model, and attributes the expense to the specific IAM principal (user, application role, or federated identity from providers such as Okta) that made the request. This data flows directly into AWS Cost and Usage Reports (CUR 2.0) with no changes to existing workflows.
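Because attribution keys off the IAM principal that signs the request, application code needs no modification. A minimal sketch of a standard Bedrock Converse call illustrates this; the model ID and prompt are illustrative placeholders, not values from the announcement:

```python
# Sketch: an ordinary Bedrock Converse request. Billing attributes the cost
# to whichever IAM user, role, or federated session signs the call, so the
# request body itself carries no attribution fields.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"  # example model ID

def build_converse_request(prompt: str) -> dict:
    """Assemble keyword arguments for bedrock-runtime's Converse API."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 256},
    }

# To execute against AWS (credentials required):
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**build_converse_request("Summarize our Q3 spend."))
#   print(response["output"]["message"]["content"][0]["text"])
```

The point of the sketch is what is absent: no tags, headers, or metadata are added per call, since the signing identity is already captured by the service.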
For deeper analysis, administrators can add cost allocation tags to IAM principals, enabling aggregation by team, project, or custom dimensions in AWS Cost Explorer. The system supports four common access patterns: developers with IAM users/API keys, applications with IAM roles, users authenticating through identity providers, and LLM gateway scenarios. For gateway setups, per-user attribution requires additional session management, but other scenarios provide immediate user-level visibility. This granular tracking enables precise chargebacks, budget forecasting, and identification of optimization opportunities as AI adoption grows across departments.
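The tag-based aggregation described above can be sketched with the IAM and Cost Explorer APIs. The tag key `team`, the role name, and the query window below are hypothetical choices for illustration:

```python
from datetime import date, timedelta

TAG_KEY = "team"  # hypothetical cost allocation tag key

def build_cost_query(days: int = 30) -> dict:
    """Arguments for Cost Explorer's GetCostAndUsage: Bedrock spend,
    grouped by the chosen cost allocation tag."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "DAILY",
        "Metrics": ["UnblendedCost"],
        "Filter": {"Dimensions": {"Key": "SERVICE", "Values": ["Amazon Bedrock"]}},
        "GroupBy": [{"Type": "TAG", "Key": TAG_KEY}],
    }

# Against AWS (credentials required):
#   import boto3
#   # Tag the principal once; Bedrock costs then roll up under that tag.
#   boto3.client("iam").tag_role(
#       RoleName="analytics-app",
#       Tags=[{"Key": TAG_KEY, "Value": "analytics"}],
#   )
#   results = boto3.client("ce").get_cost_and_usage(**build_cost_query())
```

Note that a tag key must also be activated as a cost allocation tag in the Billing console before it appears in Cost Explorer results.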
- Automatically attributes Bedrock inference costs to specific IAM users, roles, and federated identities
- Costs flow to AWS Billing and CUR 2.0 with no workflow changes required
- Optional tags enable aggregation by team/project in Cost Explorer for visual analysis
Why It Matters
Enables enterprises to track, allocate, and optimize growing AI inference costs as adoption scales across teams and projects.