AWS Machine Learning Blog
Stateful MCP Clients on Bedrock AgentCore

Build interactive, stateful AI agents on Bedrock with LLM streaming and code examples
30-Second TL;DR
What Changed
Build stateful MCP servers requesting user input during execution
Why It Matters
Adds stateful interactions and real-time user feedback to Bedrock's agent-building capabilities, enabling more complex, user-in-the-loop AI workflows on AWS infrastructure.
What To Do Next
Deploy a sample stateful MCP server to Amazon Bedrock AgentCore Runtime using the blog's code.
Who should care: Developers & AI Engineers
Deep Insight
AI-generated analysis for this event.
Enhanced Key Takeaways
- The integration leverages the Model Context Protocol (MCP) to standardize communication between Bedrock AgentCore and external data sources, reducing the need for custom API wrappers.
- Stateful persistence is achieved through a new session-management layer in AgentCore that caches MCP server state across multiple turns, enabling complex multi-step workflows.
- The implementation introduces a 'Human-in-the-Loop' (HITL) interrupt mechanism that pauses agent execution and holds the session state until an asynchronous callback is received from the client.
Competitor Analysis
| Feature | AWS Bedrock AgentCore | LangChain LangGraph | Google Vertex AI Agent Builder |
|---|---|---|---|
| MCP Support | Native/Integrated | Via Community Adapters | Limited/Proprietary |
| State Management | Managed/Serverless | Developer-defined | Managed/Platform-specific |
| Pricing Model | Pay-per-invocation | Open Source/Cloud-hosted | Pay-per-invocation |
| Latency | Low (AWS Backbone) | Variable (Host-dependent) | Low (Google Backbone) |
Technical Deep Dive
- Uses a persistent WebSocket connection between the AgentCore runtime and the MCP server to maintain session context.
- Implements a 'suspend-and-resume' architecture where the agent state is serialized to Amazon DynamoDB when awaiting user input.
- Supports bi-directional streaming via MCP's 'notifications' protocol, allowing the server to push progress updates to the client without waiting for a request.
- LLM sampling is handled via a dedicated 'sampling' tool definition in the MCP schema, allowing the agent to request the client to perform inference on its behalf.
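The suspend-and-resume architecture in the second bullet can be sketched as a serialize/rehydrate cycle. A plain dict stands in for the DynamoDB table the post describes; the function names and item shape are assumptions for illustration, not the actual implementation.

```python
import json
import time

# Stand-in for a DynamoDB table keyed by session_id (illustrative only).
TABLE: dict[str, str] = {}

def suspend(session_id: str, state: dict) -> None:
    # Serialize the full agent state so the runtime can release the
    # worker while waiting for user input.
    item = {"state": state, "suspended_at": time.time()}
    TABLE[session_id] = json.dumps(item)

def resume(session_id: str, user_input: str) -> dict:
    # Rehydrate the saved state and apply the user's reply as the
    # next conversation turn.
    item = json.loads(TABLE.pop(session_id))
    state = item["state"]
    state["messages"].append({"role": "user", "content": user_input})
    return state

state = {"messages": [{"role": "assistant", "content": "Need approval."}]}
suspend("sess-42", state)
restored = resume("sess-42", "approved")
print(len(restored["messages"]))  # -> 2
```

Because the state round-trips through JSON, anything the agent holds between turns must be serializable; in a real deployment the item would also carry a TTL so abandoned sessions expire.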
Future Implications
AI analysis grounded in cited sources
AgentCore will become the primary orchestration layer for enterprise RAG applications.
Standardizing on MCP allows enterprises to connect disparate internal data silos to Bedrock agents without re-engineering connectors.
Third-party MCP server marketplaces will emerge within the AWS ecosystem.
The ability to deploy stateful MCP servers directly to AgentCore creates a standardized deployment target for ISVs.
Timeline
2024-11
Anthropic introduces the Model Context Protocol (MCP) as an open standard.
2025-06
AWS announces the preview of Bedrock AgentCore for simplified agent orchestration.
2026-01
Bedrock AgentCore reaches general availability with initial MCP support.
AI-curated news aggregator. All content rights belong to original publishers.
Original source: AWS Machine Learning Blog