How to Build an MCP Server: Step-by-Step with Code Examples

The Model Context Protocol (MCP) is becoming the new standard for connecting AI models to real-world tools and services. Building an MCP server allows you to expose data, actions, and resources to an LLM like Claude through a simple, standardized interface.
In this guide, you will learn step-by-step how to set up a basic MCP server in Python, define resources and tools, and connect it to an MCP client.
Key Takeaways
- MCP servers allow AI models to interact with external systems through standardized resources and tools.
- You can build an MCP server in Python using the official SDK.
- A minimal working server can expose both read-only data (resources) and executable actions (tools).
- Security and error handling are critical for production deployments.
What is an MCP server?
An MCP server acts as a bridge between an LLM and an external system like a database, file storage, or API. It defines resources (readable data), tools (actions), and prompts (instructions) in a way that the LLM can safely use during its tasks.
Instead of writing a custom integration for each model or tool, MCP gives you a single universal standard. This guide targets protocol version 0.1 (current as of April 2025).
What you need before starting
- Python 3.8 or later
- Basic experience with Python scripting
- MCP SDK for Python (available via pip)
- An MCP-compatible client like Claude Desktop or Cursor (optional for testing)
- Git for version control (recommended)
- A text editor or IDE (Visual Studio Code recommended)
Understanding the core structure
In MCP:
- Server: Provides resources and tools to the LLM.
- Client: Connects the LLM to your server.
- Protocol: Manages communication between client and server.
You will define two important primitives:
- Resource: Static or dynamic information the LLM can read.
- Tool: A callable function the LLM can execute.
The communication flow works as follows:
- The LLM (via a client) requests data or actions from your server
- Your server processes these requests and returns standardized responses
- The LLM can then use this information in its reasoning and responses
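Under the hood, MCP messages are JSON-RPC 2.0. The exact method names and fields depend on the protocol revision, so treat the following as an illustrative sketch of one request/response exchange rather than the authoritative wire format:

```python
import json

# Hypothetical client request asking the server to read a resource.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",                    # method name is illustrative
    "params": {"uri": "resource://user_profiles"},
}

# Matching server response: same id, with the resource contents in `result`.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"contents": [{"text": "[...serialized user profiles...]"}]},
}

# Both sides serialize these frames as JSON on the wire.
wire = json.dumps(request)
decoded = json.loads(wire)
print(decoded["method"])  # → resources/read
```

The key property to notice is the `id` field: the client uses it to match each response to the request that produced it.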
1. Set up your Python project
Start by creating a project directory and a Python virtual environment.
```bash
mkdir my_mcp_server
cd my_mcp_server
python -m venv venv
source venv/bin/activate   # Linux/Mac
venv\Scripts\activate      # Windows
```
Create a basic project structure:
```bash
mkdir -p src/resources src/tools tests
touch src/__init__.py src/resources/__init__.py src/tools/__init__.py
touch requirements.txt README.md
```
Add the following to your `requirements.txt`:

```text
mcp-server>=0.1.0
pydantic>=2.0.0
pytest>=7.0.0
```
2. Install the MCP SDK
Install the MCP server SDK for Python and the other dependencies:

```bash
pip install -r requirements.txt
```

If the official SDK is not yet published to PyPI, you may need to install it from a GitHub repository:

```bash
pip install git+https://github.com/anthropic/mcp-server-python.git
```
3. Create a basic MCP server
Create a file called `src/server.py`:

```python
import logging

from mcp_server import MCPServer

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger("mcp_server")


def main() -> None:
    """Initialize and start the MCP server."""
    try:
        server = MCPServer(
            name="MyMCPServer",
            version="0.1.0",
            description="A simple MCP server example"
        )
        # Resources and tools will be added here
        logger.info("Starting MCP server...")
        server.start()
    except Exception as e:
        logger.error(f"Failed to start MCP server: {e}")
        raise


if __name__ == "__main__":
    main()
```
This sets up a basic MCP server with proper logging.
4. Define a resource
Resources expose data the model can read. Let's create a file `src/resources/user_profiles.py`:

```python
import logging
from typing import Any, Dict, List

from pydantic import BaseModel

logger = logging.getLogger("mcp_server.resources")


class UserProfile(BaseModel):
    """Data model for user profiles."""
    name: str
    role: str
    department: str = "General"
    years_experience: int = 0


def fetch_user_profiles() -> List[Dict[str, Any]]:
    """
    Fetch user profiles from the database.

    Returns:
        List[Dict[str, Any]]: A list of user profile dictionaries.
    """
    try:
        # In a real implementation, this would query a database.
        # For this example, we return mock data.
        users = [
            UserProfile(name="Alice", role="Engineer", department="Engineering", years_experience=5),
            UserProfile(name="Bob", role="Product Manager", department="Product", years_experience=3),
            UserProfile(name="Charlie", role="Designer", department="Design", years_experience=7),
        ]
        logger.info(f"Successfully fetched {len(users)} user profiles")
        return [user.model_dump() for user in users]
    except Exception as e:
        logger.error(f"Error fetching user profiles: {e}")
        # In production, you might return an empty list or raise a specific
        # exception, depending on your error handling strategy.
        return []
```
Now update `src/server.py` to include this resource:

```python
import logging

from mcp_server import MCPServer, Resource

from src.resources.user_profiles import fetch_user_profiles

# ... existing code ...


def main() -> None:
    """Initialize and start the MCP server."""
    try:
        server = MCPServer(
            name="MyMCPServer",
            version="0.1.0",
            description="A simple MCP server example"
        )

        # Add the user profiles resource
        user_profiles = Resource(
            name="user_profiles",
            description="List of user profiles from the company database.",
            fetch_fn=fetch_user_profiles
        )
        server.add_resource(user_profiles)

        logger.info("Starting MCP server...")
        server.start()
    except Exception as e:
        logger.error(f"Failed to start MCP server: {e}")
        raise


if __name__ == "__main__":
    main()
```
The LLM can now query `user_profiles` via the MCP client.
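Because resources must return JSON-serializable data (which is why `fetch_user_profiles` calls `model_dump()`), the client-side view of `user_profiles` is just a list of plain dicts. A sketch of what a client might receive after deserialization (the surrounding envelope varies by client):

```python
import json

# Illustrative resource payload as the client sees it, mirroring the mock data.
profiles = [
    {"name": "Alice", "role": "Engineer", "department": "Engineering", "years_experience": 5},
    {"name": "Bob", "role": "Product Manager", "department": "Product", "years_experience": 3},
]

# The round trip through JSON must be lossless, or the client cannot parse it.
assert json.loads(json.dumps(profiles)) == profiles
print(f"{len(profiles)} profiles, first user: {profiles[0]['name']}")
```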
5. Define a tool
Tools allow the LLM to execute an action. Create a file `src/tools/user_management.py`:

```python
import logging
from typing import Any, Dict, Optional

from pydantic import BaseModel, Field, ValidationError

logger = logging.getLogger("mcp_server.tools")


class CreateUserRequest(BaseModel):
    """Validation model for user creation requests."""
    name: str = Field(..., min_length=2, description="User's full name")
    role: str = Field(..., min_length=2, description="User's job role")
    department: Optional[str] = Field("General", description="User's department")
    years_experience: Optional[int] = Field(0, ge=0, description="Years of professional experience")


def create_user_profile(request_data: Dict[str, Any]) -> Dict[str, Any]:
    """
    Create a new user profile in the database.

    Args:
        request_data (Dict[str, Any]): User data containing name, role, etc.

    Returns:
        Dict[str, Any]: Response with status and user info.
    """
    try:
        # Validate the input data
        user_data = CreateUserRequest(**request_data)

        # In a real implementation, this would insert into a database.
        # For this example, we just log the action.
        logger.info(f"Creating new user: {user_data.name} - {user_data.role} in {user_data.department}")

        # Return a success response with the created user data
        return {
            "status": "success",
            "message": f"User {user_data.name} created successfully",
            "user": user_data.model_dump(),
        }
    except ValidationError as e:
        # Handle validation errors
        logger.error(f"Validation error: {e}")
        return {
            "status": "error",
            "message": "Invalid user data provided",
            "details": str(e),
        }
    except Exception as e:
        # Handle other errors
        logger.error(f"Error creating user: {e}")
        return {
            "status": "error",
            "message": "Failed to create user",
            "details": str(e),
        }
```
Now update `src/server.py` to include this tool:

```python
import logging

from mcp_server import MCPServer, Resource, Tool

from src.resources.user_profiles import fetch_user_profiles
from src.tools.user_management import create_user_profile

# ... existing code ...


def main() -> None:
    """Initialize and start the MCP server."""
    try:
        server = MCPServer(
            name="MyMCPServer",
            version="0.1.0",
            description="A simple MCP server example"
        )

        # Add the user profiles resource
        user_profiles = Resource(
            name="user_profiles",
            description="List of user profiles from the company database.",
            fetch_fn=fetch_user_profiles
        )

        # Add the create user tool
        create_user = Tool(
            name="create_user_profile",
            description="Create a new user profile in the database.",
            parameters={
                "name": {"type": "string", "description": "User's full name"},
                "role": {"type": "string", "description": "User's job role"},
                "department": {"type": "string", "description": "User's department (optional)"},
                "years_experience": {"type": "integer", "description": "Years of experience (optional)"}
            },
            execute_fn=create_user_profile
        )

        server.add_resource(user_profiles)
        server.add_tool(create_user)

        logger.info("Starting MCP server...")
        server.start()
    except Exception as e:
        logger.error(f"Failed to start MCP server: {e}")
        raise


if __name__ == "__main__":
    main()
```
6. Error handling and validation
Create a file `src/utils/validation.py` to centralize your validation logic:

```python
import logging
from typing import Any, Dict, Optional, Tuple, Type

from pydantic import BaseModel, ValidationError

logger = logging.getLogger("mcp_server.validation")


def validate_request(
    data: Dict[str, Any],
    model_class: Type[BaseModel],
) -> Tuple[Optional[BaseModel], Optional[Dict[str, Any]]]:
    """
    Validate request data against a Pydantic model.

    Args:
        data: The input data to validate.
        model_class: The Pydantic model class to use for validation.

    Returns:
        tuple: (validated_model, error_dict)
            - If valid: (model instance, None)
            - If invalid: (None, error dictionary)
    """
    try:
        validated_data = model_class(**data)
        return validated_data, None
    except ValidationError as e:
        errors = e.errors()
        error_dict = {
            "status": "error",
            "message": "Validation failed",
            "errors": errors,
        }
        logger.error(f"Validation error: {errors}")
        return None, error_dict
```
This utility function can be used in all your tools to validate input data consistently.
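A tool handler then becomes a thin wrapper around the helper: unpack the `(validated, error)` tuple and short-circuit on error. The sketch below uses a minimal stand-in validator with the same return contract (no Pydantic) so it runs standalone; in the project you would `from src.utils.validation import validate_request` and pass a real model class. The `delete_user_profile` tool is hypothetical:

```python
from typing import Any, Dict, Optional, Tuple


def validate_request(data: Dict[str, Any]) -> Tuple[Optional[Dict[str, Any]], Optional[Dict[str, Any]]]:
    """Stand-in with the same (validated, error) return contract as the helper above."""
    name = data.get("name")
    if not isinstance(name, str) or len(name) < 2:
        return None, {"status": "error", "message": "Validation failed"}
    return data, None


def delete_user_profile(request_data: Dict[str, Any]) -> Dict[str, Any]:
    """Hypothetical tool handler built on the shared validation contract."""
    validated, error = validate_request(request_data)
    if error is not None:
        return error  # propagate the standardized error payload unchanged
    return {"status": "success", "message": f"User {validated['name']} deleted"}


print(delete_user_profile({"name": "Alice"})["status"])  # → success
print(delete_user_profile({})["status"])                 # → error
```

Keeping the error payload identical across all tools means the LLM sees one consistent failure shape, regardless of which tool rejected the input.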
7. Run and test the MCP server
Create a simple test script `test_server.py` to verify your server works:

```python
import json
import subprocess
import sys
import time

import requests


def test_server():
    """Simple test to verify the MCP server is running correctly."""
    # Start the server in a separate process
    server_process = subprocess.Popen([sys.executable, "src/server.py"])
    try:
        # Wait for the server to start
        time.sleep(2)

        # In a real test, you would use the MCP client SDK.
        # For this example, we simulate a client using HTTP requests,
        # assuming the server is listening on localhost:8000.
        base_url = "http://localhost:8000"

        # Test fetching a resource
        response = requests.get(f"{base_url}/resources/user_profiles")
        assert response.status_code == 200
        data = response.json()
        print("Resource response:", json.dumps(data, indent=2))

        # Test executing a tool
        tool_data = {
            "name": "Test User",
            "role": "Tester",
            "department": "QA",
        }
        response = requests.post(
            f"{base_url}/tools/create_user_profile",
            json=tool_data,
        )
        assert response.status_code == 200
        data = response.json()
        print("Tool response:", json.dumps(data, indent=2))

        print("All tests passed!")
    finally:
        # Clean up: terminate the server process
        server_process.terminate()
        server_process.wait()


if __name__ == "__main__":
    test_server()
```
Run your server:

```bash
python src/server.py
```

When the server is running, you should see output like this:

```text
2025-04-28 10:15:23 - mcp_server - INFO - Starting MCP server...
2025-04-28 10:15:23 - mcp_server - INFO - Server listening on 0.0.0.0:8000
2025-04-28 10:15:30 - mcp_server.resources - INFO - Successfully fetched 3 user profiles
2025-04-28 10:15:35 - mcp_server.tools - INFO - Creating new user: Test User - Tester in QA
```
Then configure your MCP client (such as Claude Desktop) to connect to your local MCP server by providing the server URL or the command to start the server.
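For Claude Desktop specifically, servers are registered in its `claude_desktop_config.json` file. The server name and path below are placeholders; point `args` at your own `server.py`:

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "python",
      "args": ["/absolute/path/to/my_mcp_server/src/server.py"]
    }
  }
}
```

After editing the file, restart Claude Desktop so it picks up the new server.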
Security considerations
When deploying an MCP server, consider these security best practices:
- Authentication: Implement API keys or OAuth to authenticate clients:

  ```python
  import os

  def authenticate_request(request):
      api_key = request.headers.get("X-API-Key")
      if not api_key or api_key != os.environ.get("MCP_API_KEY"):
          raise ValueError("Invalid API key")
  ```
- Input Validation: Always validate all inputs using Pydantic models.
- Rate Limiting: Implement rate limiting to prevent abuse.
- HTTPS: Always use HTTPS in production.
- Restricted Actions: Define clear boundaries for what tools can do.
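Of these, rate limiting is the one most often skipped. A minimal in-process sketch (a sliding-window counter keyed by client id; production deployments would typically enforce this at a gateway or with a shared store like Redis instead):

```python
import time
from collections import deque
from typing import Deque, Dict


class SlidingWindowLimiter:
    """Allow at most `max_calls` requests per `window` seconds per client."""

    def __init__(self, max_calls: int = 10, window: float = 60.0) -> None:
        self.max_calls = max_calls
        self.window = window
        self.calls: Dict[str, Deque[float]] = {}

    def allow(self, client_id: str) -> bool:
        now = time.monotonic()
        q = self.calls.setdefault(client_id, deque())
        while q and now - q[0] > self.window:  # evict timestamps outside the window
            q.popleft()
        if len(q) >= self.max_calls:
            return False                       # over the limit: reject
        q.append(now)
        return True


limiter = SlidingWindowLimiter(max_calls=3, window=60.0)
print([limiter.allow("client-a") for _ in range(5)])  # → [True, True, True, False, False]
```

In a tool handler, a `False` return would translate into a standardized error response rather than an exception, so the client gets a parseable refusal.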
Performance optimization
- Caching: Cache expensive resource fetches. Note that `functools.lru_cache` has no TTL support; for time-based expiry, use the third-party `cachetools` package:

  ```python
  from cachetools.func import ttl_cache

  @ttl_cache(maxsize=128, ttl=300)  # cache entries expire after 5 minutes
  def fetch_user_profiles():
      # Expensive database query
      pass
  ```
- Async Processing: Use async for I/O-bound operations:

  ```python
  import aiohttp

  async def fetch_user_profiles():
      async with aiohttp.ClientSession() as session:
          async with session.get("https://api.example.com/users") as response:
              return await response.json()
  ```
- Connection Pooling: Use connection pools for database access.
Deployment
Local development
For local development, run:

```bash
python src/server.py
```
Docker deployment
Create a `Dockerfile`:

```dockerfile
FROM python:3.10-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD ["python", "src/server.py"]
```

Build and run:

```bash
docker build -t mcp-server .
docker run -p 8000:8000 mcp-server
```
Cloud deployment (AWS)
- Create an EC2 instance or use AWS App Runner
- Deploy your Docker container
- Set up an Application Load Balancer
- Configure security groups to restrict access
Testing your MCP server
Create a test file `tests/test_resources.py`:

```python
from src.resources.user_profiles import fetch_user_profiles


def test_fetch_user_profiles():
    """Test that user profiles can be fetched successfully."""
    profiles = fetch_user_profiles()

    # Check structure
    assert isinstance(profiles, list)
    assert len(profiles) > 0

    # Check content
    first_profile = profiles[0]
    assert "name" in first_profile
    assert "role" in first_profile
    assert isinstance(first_profile["name"], str)
```

Run the tests with:

```bash
pytest
```
Common errors and troubleshooting
| Problem | Solution | Example |
| --- | --- | --- |
| Cannot connect to MCP server | Check that your server is running and the port is correct | `netstat -tulpn \| grep 8000` |
| LLM cannot find resources | Verify the `name` and `description` fields are properly set | Check your `Resource` initialization |
| Errors in tool execution | Validate that input parameters match expected types | Use Pydantic for validation |
| Client cannot parse output | Make sure your functions return JSON-serializable data | Use `.model_dump()` instead of custom objects |
| Server crashes on startup | Check your imports and environment variables | Set `DEBUG=True` for verbose logging |
| Tool timeout | Add timeout handling for external API calls | Use `asyncio.wait_for()` with a timeout |
| Authentication failures | Verify API keys and permissions | Check request headers |
| XML/JSON parsing errors | Use proper content type headers | Set `Content-Type: application/json` |
Next steps
After building your basic MCP server, consider these advanced extensions:
- Database Integration: Connect to PostgreSQL, MongoDB, or other databases.
- File Operations: Add tools for file reading, writing, and transformation.
- External APIs: Integrate with popular services like GitHub, Slack, or Google Drive.
- Webhooks: Allow the LLM to trigger events in other systems.
- Streaming Resources: Support streaming large datasets.
- Context-Aware Actions: Add tools that understand the LLM’s current context.
Example: Adding a database connection
```python
import os
from contextlib import contextmanager

import psycopg2


@contextmanager
def get_db_connection():
    """Create a database connection context manager."""
    conn = None
    try:
        conn = psycopg2.connect(
            host=os.environ.get("DB_HOST"),
            database=os.environ.get("DB_NAME"),
            user=os.environ.get("DB_USER"),
            password=os.environ.get("DB_PASSWORD"),
        )
        yield conn
    finally:
        if conn is not None:
            conn.close()


def fetch_user_profiles_from_db():
    """Fetch user profiles from a PostgreSQL database."""
    with get_db_connection() as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT name, role, department FROM users")
            columns = [desc[0] for desc in cur.description]
            return [dict(zip(columns, row)) for row in cur.fetchall()]
```
Conclusion
Building a simple MCP server in Python opens the door to making LLMs much more powerful. By exposing data and actions through a clean, standardized protocol, you make it easier for AI systems to interact safely and meaningfully with external services.
Start small, focus on one resource and one tool, and you can expand over time into more advanced use cases like databases, cloud storage, or internal APIs.
The MCP ecosystem is growing rapidly, and implementing these standards now will position your applications to benefit from improvements in both the protocol and the LLMs that use it.
FAQs
Do I need programming experience to build an MCP server?
Some experience with Python is required. MCP servers are software processes that need correct definitions for resources and tools.

Is MCP available in languages other than Python?
Yes. Anthropic and contributors are releasing SDKs for multiple languages, including Python and TypeScript.

Can I host an MCP server on my own infrastructure?
Yes. You can host your MCP server on cloud platforms or behind firewalls and make it available securely to your LLM clients.