Chapter 18: The Future of MCP

Where We’re Going

MCP is moving fast. The specification has gone through multiple revisions since its November 2024 launch, each one refining the protocol based on real-world usage. The trajectory is clear: MCP is evolving from a local-first protocol for connecting desktop apps to tools into a robust, internet-scale standard for AI-to-world integration.

This chapter covers the announced direction, active proposals, and reasonable extrapolations about where MCP is heading.

The Stateless Future

The biggest architectural shift on the horizon is the move from stateful to stateless. Currently, MCP’s Streamable HTTP transport supports sessions with initialization handshakes. The future vision, discussed publicly by the MCP team, is a protocol where each request is self-contained.

What This Means

Today:

1. Client connects
2. Client sends initialize request
3. Server returns capabilities
4. Client sends initialized notification
5. Session established
6. Client makes requests within the session

Future:

1. Client sends request (with capabilities inline)
2. Server responds
3. Done.

No handshake. No session. Each request carries everything the server needs to process it.

Why It Matters

Stateless protocols are dramatically easier to scale:

  • No sticky sessions — Any server instance can handle any request
  • Serverless-friendly — Each request can be a Lambda invocation
  • No session storage — No Redis, no distributed session state
  • Simpler load balancing — Round-robin works fine
  • Better fault tolerance — Server crashes don’t lose session state

How It Works

Instead of negotiating capabilities once during initialization, the client would include relevant information with each request. Session state moves from the transport layer to the application layer, using something like HTTP cookies.

The MCP team is exploring Spec Enhancement Proposals (SEPs) to formalize this. The target is the next major specification release.
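A self-contained request under this model might look like the following sketch. The `_meta` field names and capability shape here are illustrative assumptions, not part of any published spec revision:

```python
import json

# Hypothetical self-contained MCP request: what the initialize handshake
# used to negotiate (protocol version, client capabilities) travels with
# every call instead. Field names are illustrative assumptions.
def build_stateless_request(tool_name: str, arguments: dict) -> str:
    request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": tool_name,
            "arguments": arguments,
            # Inline the information a session handshake would have carried:
            "_meta": {
                "protocolVersion": "2025-11-25",
                "clientCapabilities": {"sampling": False, "elicitation": True},
            },
        },
    }
    return json.dumps(request)

body = build_stateless_request("search_documents", {"query": "quarterly report"})
```

Because the request carries its own context, any server instance can handle it cold, with no prior handshake and no shared session store.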

Server Cards

A proposed /.well-known/mcp.json endpoint would let clients discover server capabilities before connecting:

{
  "name": "acme-data-server",
  "version": "2.0.0",
  "description": "Access to Acme Corp's data APIs",
  "authentication": {
    "type": "oauth2",
    "authorizationUrl": "https://auth.acme.com/authorize",
    "tokenUrl": "https://auth.acme.com/token",
    "scopes": ["read:data", "write:data"]
  },
  "capabilities": {
    "tools": true,
    "resources": true,
    "prompts": false
  },
  "rateLimit": {
    "requestsPerMinute": 100
  },
  "contact": "mcp-support@acme.com"
}

This enables:

  • Auto-discovery — Clients can learn about a server before connecting
  • Configuration generation — Hosts can auto-configure based on the card
  • Security validation — Verify the server’s identity and auth requirements
  • Catalog listing — Registries can index servers automatically
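A client consuming a server card might start with a minimal validation step like this sketch, assuming the proposed card shape shown above (the required-field choices are an assumption, since the proposal is not finalized):

```python
import json

# Minimal client-side check of a /.well-known/mcp.json server card,
# assuming the proposed shape shown above. Which fields are mandatory
# is an assumption here, not settled by any spec.
def parse_server_card(raw: str) -> dict:
    card = json.loads(raw)
    for field in ("name", "version", "capabilities"):
        if field not in card:
            raise ValueError(f"server card missing required field: {field}")
    return card

sample = """
{
  "name": "acme-data-server",
  "version": "2.0.0",
  "capabilities": {"tools": true, "resources": true, "prompts": false}
}
"""
card = parse_server_card(sample)
# A host could now auto-configure: connect only if card["capabilities"]["tools"]
# is true, and skip prompt discovery since prompts are not offered.
```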

Notification and Subscription Evolution

The current notification system uses a persistent SSE stream from server to client. The future direction replaces this with explicit subscription mechanisms:

  • Clients open dedicated streams for specific subscriptions
  • Support for concurrent subscriptions to different resources
  • TTL values and ETags enable intelligent caching
  • Clients can rebuild state from subscriptions alone (no session dependency)

This makes notifications compatible with the stateless vision while keeping the reactivity that makes MCP powerful.
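The TTL-and-ETag caching idea can be sketched on the client side as follows. The `(etag, ttl)` pairing alongside each payload is an assumption about what a future subscription response might carry:

```python
import time

# Sketch of client-side caching for subscription updates, assuming the
# server returns an ETag and a TTL alongside each resource payload.
class ResourceCache:
    def __init__(self):
        self._entries = {}  # uri -> (etag, expires_at, payload)

    def store(self, uri, etag, ttl_seconds, payload):
        self._entries[uri] = (etag, time.monotonic() + ttl_seconds, payload)

    def fresh(self, uri):
        """Return the cached payload if its TTL has not expired, else None."""
        entry = self._entries.get(uri)
        if entry and time.monotonic() < entry[1]:
            return entry[2]
        return None

    def etag(self, uri):
        """ETag to send as a conditional-read hint on the next request."""
        entry = self._entries.get(uri)
        return entry[0] if entry else None

cache = ResourceCache()
cache.store("file:///report.md", etag="v7", ttl_seconds=60, payload="# Report")
```

A client built this way can rebuild its view of a resource from cached payloads plus conditional re-reads, with no dependency on a long-lived session.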

Sampling and Elicitation Redesign

The move to stateless requires rethinking how bidirectional features work. Currently, sampling and elicitation assume a persistent connection where the server sends a request and waits for a response.

The proposed redesign uses a request-response pattern:

  1. Server returns a “pending” result that includes the sampling/elicitation request
  2. Client processes it (sends to LLM, shows to user)
  3. Client sends a new request with both the original request and the response

This eliminates long-lived server state while preserving the functionality.
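The three-step round trip above can be simulated in a few lines. All message shapes here are hypothetical; the actual SEP may define them differently:

```python
# Sketch of the stateless sampling round trip described above.
# Shapes are hypothetical, not taken from any published spec.

def server_handle(request: dict, pending_response=None) -> dict:
    """First call returns a 'pending' result embedding the sampling request;
    the follow-up call carries the client's answer and completes the tool."""
    if pending_response is None:
        return {
            "status": "pending",
            "samplingRequest": {
                "messages": [{"role": "user", "content": "Summarize the report"}]
            },
            "originalRequest": request,
        }
    return {"status": "complete", "result": f"Tool ran with: {pending_response}"}

# Round trip: the client receives the pending result, performs the LLM call
# itself, then re-sends the original request together with the response.
first = server_handle({"tool": "summarize"})
assert first["status"] == "pending"
final = server_handle(first["originalRequest"], pending_response="A short summary.")
```

Note that the server holds nothing between the two calls; everything it needs to resume arrives in the follow-up request.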

Routing and Infrastructure

Currently, all MCP messages go through a single endpoint (/mcp). Infrastructure can’t route or filter without parsing JSON-RPC payloads. The future may expose routing information via HTTP paths and headers:

POST /mcp/tools/call HTTP/1.1
X-MCP-Method: tools/call
X-MCP-Tool: search_documents

This would let standard HTTP infrastructure (load balancers, API gateways, WAFs) make routing decisions without understanding JSON-RPC.
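A gateway exploiting those paths could route without touching the JSON-RPC body, as in this sketch (the path scheme and upstream names are illustrative assumptions):

```python
# Sketch of path-based routing at an HTTP gateway, assuming the proposed
# /mcp/<method> path scheme. Upstream names are illustrative.

UPSTREAMS = {
    "tools/call": "http://tool-workers.internal",
    "resources/read": "http://resource-cache.internal",
}

def route(path: str) -> str:
    """Map a request path like /mcp/tools/call to an upstream pool,
    without parsing the JSON-RPC payload at all."""
    method = path.removeprefix("/mcp/")
    return UPSTREAMS.get(method, "http://default-mcp.internal")

assert route("/mcp/tools/call") == "http://tool-workers.internal"
```

The same trick lets a WAF rate-limit `tools/call` more aggressively than `resources/read`, again purely from the path.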

The Agentic AI Foundation

In December 2025, Anthropic donated MCP to the Agentic AI Foundation (AAIF), a new entity under the Linux Foundation. Platinum members include Amazon, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI.

This is a significant milestone. MCP is no longer one company’s open-source project—it’s an industry standard governed by a multi-stakeholder foundation. This means:

  • Vendor neutrality — No single company controls the spec
  • Long-term stability — The foundation ensures continuity regardless of any one company’s fate
  • Broader adoption — Competitors co-governing a standard signals serious commitment
  • Faster evolution — More contributors, more use cases, more feedback

The AAIF governance model will shape how MCP evolves. Expect a more formal proposal process (SEPs are already heading in this direction), more rigorous backwards-compatibility requirements, and broader community input.

The Agent Protocol Convergence

MCP isn’t the only protocol in the AI agent space. There’s Agent Protocol, A2A (Agent-to-Agent) by Google, and various framework-specific interfaces. The trend is toward convergence or interoperability.

MCP’s advantages in this landscape:

  • First-mover with broad adoption
  • Backed by Anthropic with resources to evolve the spec
  • Adopted by competitors (OpenAI, Google are integrating MCP support)
  • Simple enough to implement alongside other protocols

The likely future isn’t “one protocol to rule them all” but rather MCP as the dominant tool/resource integration protocol, potentially bridged to agent-to-agent protocols for multi-agent scenarios.

Expanded Content Types

MCP currently supports text, images, audio, and embedded resources as content types. Future expansions may include:

  • Video — For screen recording, visual demonstrations
  • Structured data tables — Native tabular data format
  • Interactive content — Forms, widgets, rich UI elements
  • Streaming content — For real-time data feeds

Security Enhancements

Security is an active area of development:

Capability-Based Access Control

More granular permissions for what tools can do, potentially including:

  • File access scopes
  • Network access restrictions
  • Resource consumption limits

Signed Server Cards

Cryptographically signed server metadata for trust verification.

Audit Protocols

Standardized audit logging formats for compliance-sensitive environments.

Sandboxing Standards

Guidelines for running MCP servers in sandboxed environments with standardized capability restrictions.

The Broader Vision

Step back and look at the big picture. MCP is part of a broader shift in how software is built:

Before AI: Humans interact with software through UIs. APIs connect software to software. Integration is explicit and coded.

With AI (today): AI models interact with software through MCP tools. Natural language replaces some of the explicit API coding. But the integration is still mostly configured by humans.

With AI (future): AI agents discover and compose MCP servers dynamically. An agent that needs weather data finds a weather server, negotiates capabilities, authenticates, and uses it—all without human configuration. The “USB-C moment” extends to automatic plug-and-play.

This future requires:

  • Server discovery (server cards, registries)
  • Automatic authentication (standardized auth flows)
  • Capability matching (semantic understanding of what servers offer)
  • Trust establishment (reputation systems, signed cards, audits)

MCP is building toward this, one spec revision at a time.

What You Can Do Today

While the future is exciting, there’s plenty of value in MCP today:

  1. Build servers for your tools and APIs. The ecosystem needs them.
  2. Build clients for your applications. MCP-enabled apps are more capable.
  3. Contribute to the spec. The MCP project accepts contributions via GitHub.
  4. Share your servers. Publish to npm/PyPI, add to registries, write documentation.
  5. Give feedback. Real-world usage drives spec evolution. File issues, join discussions.

Timeline

The story so far, and what’s coming:

  • November 2024: MCP released as open standard, protocol revision 2024-11-05
  • March 2025: OpenAI adopts MCP. Google DeepMind confirms Gemini support. Protocol revision 2025-03-26 introduces Streamable HTTP
  • June 2025: Protocol revision 2025-06-18
  • November 2025: Protocol revision 2025-11-25 adds tasks (experimental), elicitation URL mode, server icons, tool output schemas, structured content
  • December 2025: Anthropic donates MCP to the Agentic AI Foundation under the Linux Foundation
  • Q1 2026: Spec Enhancement Proposals (SEPs) for stateless protocol being finalized
  • Mid 2026: Next major specification release (tentative)
  • Ongoing: SDK updates, new language support, ecosystem growth

The pace of development is fast, but the core protocol is stable. Servers you build today will work tomorrow. The primitives (tools, resources, prompts) aren’t going away—they’re the foundation. What’s changing is the transport and session layer, and those changes are backward-compatible.

Final Thoughts

MCP solved a real problem: the N×M integration nightmare that plagued every team building AI applications. In its place stands a simple, elegant protocol, and around it a vibrant ecosystem has grown.

The protocol is young but growing fast. The specification is evolving but stable where it matters. The ecosystem is early but already useful. And the community—from Anthropic’s core team to individual developers building niche servers—is building something genuinely new.

If you’ve read this far, you know enough to build, deploy, and operate MCP servers and clients. You understand the architecture, the protocol, the primitives, the SDKs, the security model, and the ecosystem.

Now go build something.

The tools are waiting.