Chapter 9: Building MCP Servers in Python
Python’s MCP Story
Python and MCP are a natural fit. The Python SDK was one of the first two official SDKs (alongside TypeScript), and the Python AI/ML ecosystem means there’s no shortage of interesting things to wrap in an MCP server.
The Python SDK provides two APIs:
- FastMCP — A high-level, decorator-based API inspired by FastAPI. This is what you’ll use 90% of the time.
- Low-level Server — Direct protocol control for when FastMCP doesn’t fit.
Setting Up
mkdir my-mcp-server
cd my-mcp-server
# Using uv (recommended)
uv init
uv add mcp
# Or using pip
pip install mcp
The Minimal Server
Create server.py:
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("my-first-server")
@mcp.tool()
async def greet(name: str) -> str:
"""Generate a greeting for the given name.
Args:
name: The name of the person to greet
"""
return f"Hello, {name}! Welcome to the world of MCP."
if __name__ == "__main__":
mcp.run()
That’s it: a dozen or so lines. Run it:
python server.py
# Or with uv:
uv run server.py
The server starts, listens on stdio, and is ready to accept MCP connections.
FastMCP uses Python’s type hints and docstrings to generate JSON Schema descriptions automatically. The function signature async def greet(name: str) -> str becomes a tool with a required name parameter of type string. The docstring becomes the tool description. Python’s introspection is doing a lot of heavy lifting here.
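To see how that introspection works, here is a stdlib-only sketch of the idea. `schema_from_signature` is illustrative, not the SDK’s actual implementation, and it only handles a few primitive types:

```python
import inspect
import typing

def schema_from_signature(fn) -> dict:
    """Derive a JSON Schema-style dict from a function signature,
    roughly the way FastMCP does internally (simplified sketch)."""
    type_map = {str: "string", int: "integer", float: "number", bool: "boolean"}
    hints = typing.get_type_hints(fn)
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        hint = hints.get(name, str)
        properties[name] = {"type": type_map.get(hint, "string")}
        # Parameters without a default value are required
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": properties, "required": required}

async def greet(name: str) -> str:
    """Generate a greeting for the given name."""
    return f"Hello, {name}!"

schema = schema_from_signature(greet)
# {'type': 'object', 'properties': {'name': {'type': 'string'}}, 'required': ['name']}
```

The real SDK builds a pydantic model from the signature, which also handles nested models, enums, and validation, but the shape of the output is the same.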
FastMCP Deep Dive
Tools with FastMCP
from mcp.server.fastmcp import FastMCP
from typing import Optional
import httpx
mcp = FastMCP("weather-server")
@mcp.tool()
async def get_weather(
city: str,
units: Optional[str] = "celsius",
) -> str:
"""Get current weather information for a city.
Returns temperature, conditions, humidity, and wind speed.
Args:
city: City name (e.g., 'London', 'New York', 'Tokyo')
units: Temperature units - 'celsius' or 'fahrenheit'
"""
import os
api_key = os.environ.get("WEATHER_API_KEY")
if not api_key:
raise ValueError("WEATHER_API_KEY environment variable not set")
unit_param = "imperial" if units == "fahrenheit" else "metric"
async with httpx.AsyncClient() as client:
response = await client.get(
"https://api.openweathermap.org/data/2.5/weather",
params={
"q": city,
"units": unit_param,
"appid": api_key,
},
)
response.raise_for_status()
data = response.json()
temp_unit = "°F" if units == "fahrenheit" else "°C"
return "\n".join([
f"Weather for {data['name']}, {data['sys']['country']}:",
f"Temperature: {data['main']['temp']}{temp_unit}",
f"Conditions: {data['weather'][0]['description']}",
f"Humidity: {data['main']['humidity']}%",
f"Wind: {data['wind']['speed']} m/s",
])
Notice:
- Type hints become JSON Schema — str becomes {"type": "string"}, Optional[str] becomes a non-required string, etc.
- Docstrings become descriptions — The function docstring is the tool description. Arg descriptions from the Args: section become parameter descriptions.
- Return values are auto-wrapped — Return a string and FastMCP wraps it in a TextContent response. Return a list for multiple content items.
- Exceptions become errors — Unhandled exceptions are caught and returned as tool execution errors with isError: true.
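The formatting at the end of get_weather can be exercised without a network call or API key by extracting it into a helper and feeding it a mock payload. The mock below mirrors only the fields the tool reads; it is a sketch, not the full OpenWeatherMap response:

```python
def format_weather(data: dict, units: str = "celsius") -> str:
    """Format a weather payload the same way get_weather does."""
    temp_unit = "°F" if units == "fahrenheit" else "°C"
    return "\n".join([
        f"Weather for {data['name']}, {data['sys']['country']}:",
        f"Temperature: {data['main']['temp']}{temp_unit}",
        f"Conditions: {data['weather'][0]['description']}",
        f"Humidity: {data['main']['humidity']}%",
        f"Wind: {data['wind']['speed']} m/s",
    ])

# Hypothetical payload containing just the fields the formatter touches
mock = {
    "name": "London",
    "sys": {"country": "GB"},
    "main": {"temp": 12.3, "humidity": 81},
    "weather": [{"description": "light rain"}],
    "wind": {"speed": 4.1},
}
print(format_weather(mock))
```

Separating I/O from formatting like this also makes the tool easier to unit test, a point we return to in the Testing section.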
Tools with Annotations
@mcp.tool(
annotations={
"title": "Web Search",
"readOnlyHint": True,
"openWorldHint": True,
}
)
async def search_web(query: str) -> str:
"""Search the web for information.
Args:
query: The search query
"""
# ... implementation
Resources with FastMCP
@mcp.resource("config://app")
async def get_config() -> str:
"""Application configuration."""
import json
config = {
"version": "1.0.0",
"environment": "production",
"features": ["auth", "logging", "cache"],
}
return json.dumps(config, indent=2)
@mcp.resource("file:///{path}")
async def read_file(path: str) -> str:
"""Read a file from the filesystem.
Args:
path: Absolute file path
"""
with open(f"/{path}", "r") as f:
return f.read()
The first resource has a fixed URI. The second is a template—{path} is a parameter extracted from the URI.
Prompts with FastMCP
from mcp.server.fastmcp.prompts import base
@mcp.prompt()
async def code_review(code: str, language: str = "python") -> list[base.Message]:
"""Perform a thorough code review.
Args:
code: The code to review
language: Programming language
"""
return [
base.UserMessage(
content=f"Please review this {language} code for correctness, "
f"performance, and maintainability:\n\n```{language}\n{code}\n```\n\n"
f"For each issue, provide severity, description, and suggested fix."
)
]
A Complete Example: Database Explorer
Let’s build something more substantial—a server that lets an LLM explore and query a SQLite database:
import os
import sqlite3
from pathlib import Path
from typing import Optional
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sqlite-explorer")

# Database path from the DB_PATH environment variable, or a default next to this script
DB_PATH = Path(os.environ.get("DB_PATH", Path(__file__).parent / "data.db"))
def get_connection() -> sqlite3.Connection:
conn = sqlite3.connect(str(DB_PATH))
conn.row_factory = sqlite3.Row
return conn
@mcp.tool()
async def list_tables() -> str:
"""List all tables in the database with their row counts."""
conn = get_connection()
try:
cursor = conn.execute(
"SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
)
tables = [row["name"] for row in cursor.fetchall()]
results = []
for table in tables:
count = conn.execute(f"SELECT COUNT(*) as c FROM [{table}]").fetchone()["c"]
results.append(f" {table}: {count} rows")
        return "Tables in database:\n" + "\n".join(results)
finally:
conn.close()
@mcp.tool()
async def describe_table(table_name: str) -> str:
"""Get the schema of a specific table.
Args:
table_name: Name of the table to describe
"""
    # Guard against injection through the table name
    if not all(ch.isalnum() or ch == "_" for ch in table_name):
        return f"Invalid table name: '{table_name}'"
    conn = get_connection()
    try:
        cursor = conn.execute(f"PRAGMA table_info([{table_name}])")
columns = cursor.fetchall()
if not columns:
return f"Table '{table_name}' not found."
lines = [f"Schema for '{table_name}':"]
for col in columns:
nullable = "NULL" if not col["notnull"] else "NOT NULL"
pk = " PRIMARY KEY" if col["pk"] else ""
            default = f" DEFAULT {col['dflt_value']}" if col["dflt_value"] is not None else ""
lines.append(f" {col['name']}: {col['type']} {nullable}{pk}{default}")
return "\n".join(lines)
finally:
conn.close()
@mcp.tool()
async def query(
sql: str,
limit: Optional[int] = 100,
) -> str:
"""Execute a read-only SQL query and return results.
Only SELECT queries are allowed. Results are limited by default.
Args:
sql: The SQL SELECT query to execute
limit: Maximum number of rows to return (default 100)
"""
# Safety check: only allow SELECT queries
stripped = sql.strip().upper()
if not stripped.startswith("SELECT"):
return "Error: Only SELECT queries are allowed. Use list_tables and describe_table for schema exploration."
conn = get_connection()
try:
# Add LIMIT if not present
if "LIMIT" not in stripped:
sql = f"{sql.rstrip(';')} LIMIT {limit}"
cursor = conn.execute(sql)
rows = cursor.fetchall()
columns = [description[0] for description in cursor.description]
if not rows:
return "Query returned no results."
# Format as a table
lines = [" | ".join(columns)]
lines.append("-" * len(lines[0]))
for row in rows:
lines.append(" | ".join(str(row[col]) for col in columns))
return f"Query returned {len(rows)} rows:\n\n" + "\n".join(lines)
except sqlite3.Error as e:
return f"SQL Error: {e}"
finally:
conn.close()
@mcp.resource("db://schema")
async def database_schema() -> str:
"""Complete database schema."""
conn = get_connection()
try:
cursor = conn.execute(
"SELECT sql FROM sqlite_master WHERE type='table' ORDER BY name"
)
schemas = [row["sql"] for row in cursor.fetchall() if row["sql"]]
return "\n\n".join(schemas)
finally:
conn.close()
@mcp.prompt()
async def analyze_table(table_name: str) -> str:
"""Create a prompt to analyze a database table.
Args:
table_name: The table to analyze
"""
return (
f"Please analyze the '{table_name}' table in my database. "
f"First use list_tables to see what's available, then describe_table "
f"to understand the schema, and finally run some exploratory queries "
f"to understand the data distribution, identify any data quality issues, "
f"and suggest useful queries for common tasks."
)
if __name__ == "__main__":
mcp.run()
This server exposes:
- Three tools: list_tables, describe_table, and query
- One resource: the complete database schema
- One prompt: a guided data analysis workflow
An LLM connected to this server can explore the database interactively—discovering tables, understanding schemas, and running queries—all through natural conversation.
The Low-Level API
When FastMCP’s magic is too much magic, use the low-level API:
import asyncio
from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import (
Tool,
TextContent,
CallToolResult,
)
server = Server("low-level-server")
@server.list_tools()
async def list_tools() -> list[Tool]:
return [
Tool(
name="echo",
description="Echoes the input back",
inputSchema={
"type": "object",
"properties": {
"message": {
"type": "string",
"description": "The message to echo",
}
},
"required": ["message"],
},
)
]
@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[TextContent]:
if name == "echo":
return [TextContent(type="text", text=f"Echo: {arguments['message']}")]
raise ValueError(f"Unknown tool: {name}")
async def main():
async with stdio_server() as (read_stream, write_stream):
await server.run(read_stream, write_stream, server.create_initialization_options())
if __name__ == "__main__":
asyncio.run(main())
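Whichever API you pick, what actually crosses the transport is JSON-RPC 2.0. A rough sketch of a tools/call exchange for the echo tool above; the field shapes follow the MCP message format, though real messages carry additional metadata:

```python
import json

# A client's tools/call request for the echo tool
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "echo", "arguments": {"message": "hi"}},
}

# The server's reply, wrapping the TextContent the handler returned
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Echo: hi"}], "isError": False},
}

# On the stdio transport, each message is one JSON line on stdin/stdout
wire_line = json.dumps(request)
print(wire_line)
```

Both FastMCP and the low-level Server handle this framing for you; the low-level API just gives you the hooks one layer closer to it.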
Running with uvx
The uvx command runs Python packages in isolated environments without installing them globally. It’s the Python equivalent of npx:
# Run directly
uvx my-mcp-server
# Or in a config file
{
"mcpServers": {
"my-server": {
"command": "uvx",
"args": ["my-mcp-server"]
}
}
}
To make your server compatible with uvx, add a [project.scripts] section to pyproject.toml:
[project.scripts]
my-mcp-server = "my_mcp_server:main"
And in your server module:
from mcp.server.fastmcp import FastMCP
mcp = FastMCP("my-server")
# ... register tools, resources, prompts ...
def main():
mcp.run()
if __name__ == "__main__":
main()
HTTP Transport
To serve your FastMCP server over HTTP:
from mcp.server.fastmcp import FastMCP

# Host and port are FastMCP settings, passed at construction time
mcp = FastMCP("remote-server", host="0.0.0.0", port=8000)

# ... register tools ...

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
Or with more control using Starlette/ASGI:
import uvicorn
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("remote-server")

# ... register tools ...

# streamable_http_app() returns a Starlette ASGI app that serves MCP
# at /mcp; run it directly or mount it inside a larger application
app = mcp.streamable_http_app()

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
Patterns and Best Practices
Async All the Way
FastMCP tools should be async. If you need to call synchronous code:
import asyncio

@mcp.tool()
async def cpu_intensive_task(data: str) -> str:
    """Run a CPU-intensive operation."""
    loop = asyncio.get_running_loop()
    # Offload blocking work to a thread pool so the event loop stays responsive
    # (process_data is a stand-in for your synchronous function)
    result = await loop.run_in_executor(None, process_data, data)
    return result
Context and Dependency Injection
FastMCP provides a context object for accessing MCP features within tools:
from mcp.server.fastmcp import Context
@mcp.tool()
async def smart_tool(query: str, ctx: Context) -> str:
"""A tool that uses MCP context features."""
# Log a message
await ctx.info(f"Processing query: {query}")
# Report progress
await ctx.report_progress(0, 100, "Starting...")
# ... do work ...
await ctx.report_progress(100, 100, "Done!")
return "Result"
Type-Rich Schemas
Use Python’s type system to generate rich schemas:
from enum import Enum
from typing import Optional
from pydantic import BaseModel, Field

class Priority(str, Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

@mcp.tool()
async def create_task(
    title: str,
    description: str = "",
    priority: Priority = Priority.MEDIUM,
    tags: list[str] = Field(default_factory=list),  # avoid a shared mutable default
) -> str:
"""Create a new task.
Args:
title: Task title
description: Detailed description
priority: Task priority level
tags: Tags to categorize the task
"""
# ... implementation
Enums become JSON Schema enums. Lists become arrays. Optional types become non-required fields. Pydantic models become nested objects.
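As a concrete illustration of the enum mapping, here is roughly the schema fragment a Priority parameter turns into. This is a simplified sketch of the generated output, not the SDK's exact schema:

```python
from enum import Enum

class Priority(str, Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

def enum_schema(enum_cls) -> dict:
    """Approximate the JSON Schema fragment an Enum parameter produces."""
    return {"type": "string", "enum": [member.value for member in enum_cls]}

print(enum_schema(Priority))
# {'type': 'string', 'enum': ['low', 'medium', 'high']}
```

The client-side model sees those enum values in the tool's input schema, which is what lets it pick a valid priority without guessing.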
Testing
Test your tools as regular async functions:
import pytest
@pytest.mark.asyncio
async def test_greet():
result = await greet("World")
assert "World" in result
@pytest.mark.asyncio
async def test_list_tables():
result = await list_tables()
assert "Tables in database:" in result
For integration testing with the full MCP protocol, use the SDK’s test utilities:
from mcp import StdioServerParameters
from mcp.client.session import ClientSession
from mcp.client.stdio import stdio_client

async def test_server_integration():
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
async with ClientSession(read, write) as session:
await session.initialize()
# List tools
tools = await session.list_tools()
assert len(tools.tools) > 0
# Call a tool
result = await session.call_tool("greet", {"name": "Test"})
assert "Test" in result.content[0].text
Publishing to PyPI
Package your server and publish it:
# pyproject.toml
[project]
name = "my-mcp-server"
version = "1.0.0"
description = "An MCP server that does awesome things"
requires-python = ">=3.10"
dependencies = [
"mcp>=1.0.0",
"httpx>=0.25.0",
]
[project.scripts]
my-mcp-server = "my_mcp_server.server:main"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
# Build and publish
uv build
uv publish
Users can then:
# Run with uvx
uvx my-mcp-server
# Or install and run
pip install my-mcp-server
my-mcp-server
Summary
The Python MCP SDK provides FastMCP, a high-level decorator-based API that turns Python functions into MCP tools with minimal boilerplate. Type hints become schemas, docstrings become descriptions, and exceptions become errors.
For more control, the low-level Server API offers direct protocol access. Both APIs support stdio and HTTP transports, making it easy to build local development tools or remote production services.
Python’s rich ecosystem of data science, web, and automation libraries makes it an excellent choice for building MCP servers that wrap databases, APIs, ML models, and more.
Next: building the other side of the connection—MCP clients.