The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:
- Build MCP clients that can connect to any MCP server
- Create MCP servers that expose resources, prompts and tools
- Use standard transports like stdio, SSE, and Streamable HTTP
- Handle all MCP protocol messages and lifecycle events
We recommend using uv to manage your Python projects.
If you haven't created a uv-managed project yet, create one:
```bash
uv init mcp-server-demo
cd mcp-server-demo
```

Then add MCP to your project dependencies:

```bash
uv add "mcp[cli]"
```

Alternatively, for projects using pip for dependencies:

```bash
pip install "mcp[cli]"
```

To run the `mcp` command with uv:

```bash
uv run mcp
```

Let's create a simple MCP server that exposes a calculator tool and some data:
"""
FastMCP quickstart example.
Run from the repository root:
uv run examples/snippets/servers/fastmcp_quickstart.py
"""
from mcp.server.fastmcp import FastMCP
# Create an MCP server
mcp = FastMCP("Demo", json_response=True)
# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
"""Add two numbers"""
return a + b
# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
"""Get a personalized greeting"""
return f"Hello, {name}!"
# Add a prompt
@mcp.prompt()
def greet_user(name: str, style: str = "friendly") -> str:
"""Generate a greeting prompt"""
styles = {
"friendly": "Please write a warm, friendly greeting",
"formal": "Please write a formal, professional greeting",
"casual": "Please write a casual, relaxed greeting",
}
return f"{styles.get(style, styles['friendly'])} for someone named {name}."
# Run with streamable HTTP transport
if __name__ == "__main__":
mcp.run(transport="streamable-http")Full example: examples/snippets/servers/fastmcp_quickstart.py
You can install this server in Claude Code and interact with it right away. First, run the server:
```bash
uv run --with mcp examples/snippets/servers/fastmcp_quickstart.py
```

Then add it to Claude Code:

```bash
claude mcp add --transport http my-server http://localhost:8000/mcp
```

Alternatively, you can test it with the MCP Inspector. Start the server as above, then in a separate terminal:

```bash
npx -y @modelcontextprotocol/inspector
```

In the inspector UI, connect to http://localhost:8000/mcp.
The Model Context Protocol (MCP) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:
- Expose data through Resources (think of these sort of like GET endpoints; they are used to load information into the LLM's context)
- Provide functionality through Tools (sort of like POST endpoints; they are used to execute code or otherwise produce a side effect)
- Define interaction patterns through Prompts (reusable templates for LLM interactions)
- And more!
- Building Servers -- tools, resources, prompts, logging, completions, sampling, elicitation, transports, ASGI mounting
- Writing Clients -- connecting to servers, using tools/resources/prompts, display utilities
- Authorization -- OAuth 2.1, token verification, client authentication
- Low-Level Server -- direct handler registration for advanced use cases
- Protocol Features -- MCP primitives, server capabilities
- Testing -- in-memory transport testing with pytest
- API Reference
- Experimental Features (Tasks)
- Model Context Protocol documentation
- Model Context Protocol specification
- Officially supported servers
We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the contributing guide to get started.
This project is licensed under the MIT License - see the LICENSE file for details.