
MCP

An open protocol from Anthropic that standardizes how AI models talk to external tools, data sources, and services.


Level 1

MCP defines a client-server architecture. The AI application (Claude Desktop, Cursor, a custom agent) is the client. Tool providers (databases, APIs, file systems, dev tools) are servers. MCP lets any client talk to any server without custom integration code. As of 2026, the MCP registry lists 4,000+ servers covering GitHub, Slack, filesystem, web search, Kubernetes, every major SaaS, and more.

Level 2

MCP uses JSON-RPC 2.0 over two standard transports: stdio for local servers and HTTP with Server-Sent Events (superseded by streamable HTTP) for remote ones. Servers expose three primitives: resources (readable state), tools (callable functions), and prompts (predefined templates). Clients connect to servers at startup, list their capabilities, and surface them to the model. The model decides when to invoke a tool; the client executes the call and returns the result. MCP solves the N×M integration problem: N AI applications × M tools collapse to N + M MCP implementations. The protocol is spec-first and open, with official SDKs in multiple languages. Cursor, Cline, Claude Desktop, Zed, and most AI coding agents now ship MCP support.
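The request/response shape above can be sketched with plain JSON, no SDK required. The envelope fields (`jsonrpc`, `id`, `method`, `params`, `result`) follow MCP's JSON-RPC 2.0 framing; the `echo` tool and its handler are hypothetical stand-ins for a real server:

```python
import json

# Client -> server: ask the server to invoke a tool. The "echo" tool and its
# arguments are made up; the envelope shape follows MCP's JSON-RPC 2.0 framing.
call_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "echo", "arguments": {"text": "hello"}},
}

def handle_request(raw: str) -> str:
    """Toy server loop body: parse one JSON-RPC message and answer it."""
    msg = json.loads(raw)
    if msg["method"] == "tools/call" and msg["params"]["name"] == "echo":
        # Tool results come back as a list of typed content blocks.
        result = {"content": [{"type": "text", "text": msg["params"]["arguments"]["text"]}]}
    else:
        result = {"content": [], "isError": True}
    return json.dumps({"jsonrpc": "2.0", "id": msg["id"], "result": result})

response = handle_request(json.dumps(call_request))
print(response)
```

In a real deployment these messages travel over stdio or streamable HTTP, and the official SDKs handle the framing; the sketch only shows why any client can talk to any server — both sides agree on the same small set of message shapes.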

Level 3

The MCP specification defines the message shape (a JSON-RPC 2.0 envelope), the capability-negotiation handshake, and authorization patterns. Servers run as separate local processes for isolation (stdio transport) or as remote services over streamable HTTP (originally HTTP+SSE). The registry at registry.modelcontextprotocol.io lists 4,300+ servers. Security concerns include prompt injection through tool descriptions, unintended tool invocation, and cascading data exfiltration through chained tool calls. Mitigations: explicit user approval for each tool invocation (Claude Desktop's confirmation dialog), allowlists, and sandboxed execution environments. MCP faces competitive alternatives (OpenAI function calling, LangChain tools), but its open-protocol status makes it the interoperability winner.
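The capability-negotiation handshake mentioned above is the first exchange of every session. A minimal sketch, assuming a simplified negotiation rule (the client name, server name, and version-agreement logic here are illustrative, not the spec's exact algorithm):

```python
import json

# Client -> server: the first message of an MCP session. Field names follow
# the spec's initialize request; clientInfo values are made up.
initialize = {
    "jsonrpc": "2.0",
    "id": 0,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {"sampling": {}},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

def negotiate(request: dict, server_version: str = "2025-03-26") -> dict:
    """Toy negotiation: pick the older of the two date-stamped versions and
    advertise which primitives this server supports."""
    agreed = min(request["params"]["protocolVersion"], server_version)
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {
            "protocolVersion": agreed,
            "capabilities": {"tools": {}, "resources": {}, "prompts": {}},
            "serverInfo": {"name": "example-server", "version": "0.1.0"},
        },
    }

reply = negotiate(initialize)
print(json.dumps(reply, indent=2))
```

Only after this exchange does the client call `tools/list` and friends, which is what lets a client discover an unfamiliar server's capabilities at runtime instead of compiling them in.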

The takeaway for you
If you are a
Researcher
  • Open JSON-RPC protocol · three primitives: resources, tools, prompts
  • Solves the N×M integration problem for AI + tools
  • Spec-first, open-source SDKs in TS, Python, Go, Rust
If you are a
Builder
  • Add MCP server support to your app if you want AI integrations without per-client code
  • Use official SDKs · avoid custom protocol extensions
  • Check the registry at registry.modelcontextprotocol.io
If you are a
Investor
  • MCP is winning the agent-tool interop layer · dominant among open-source agents
  • Every major agent now ships MCP support
  • Long-term moat for whoever owns the canonical registry + tooling
If you are a
Curious · Normie
  • A universal adapter for AI · lets any AI talk to any tool
  • Like USB for AI · plug in anywhere
  • Created by Anthropic, used by almost every coding AI
Gecko's take

MCP is the AI-native USB-C. Every agent stack is converging on it. Build servers, not custom integrations.

Anthropic published the protocol and SDKs in late 2024. It is now an open standard with contributions from many organizations.