# MCP Configuration Guide: YAML Setup and Deployment
## What is MCP?
The Modular Copilot Platform (MCP) is a framework for deploying and managing AI assistants with modular capabilities. It allows you to create custom AI workflows with specialized tools and integrations.
## Basic YAML Configuration

### Minimal Configuration

```yaml
# mcp-config.yml
version: "1.0"

server:
  host: "0.0.0.0"
  port: 5000

models:
  - name: "default"
    provider: "ollama"
    model: "mistral:7b"
    temperature: 0.7

tools:
  - filesystem
  - web_search
```

### Configuration Breakdown
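Before walking through each section, it helps to confirm the file parses and contains the keys MCP expects. A minimal sanity check using PyYAML (the same library the troubleshooting section later invokes); the required-key list is an assumption based on this guide's examples:

```python
import yaml

CONFIG = """
version: "1.0"
server:
  host: "0.0.0.0"
  port: 5000
models:
  - name: "default"
    provider: "ollama"
    model: "mistral:7b"
tools:
  - filesystem
  - web_search
"""

def validate_config(text: str) -> dict:
    """Parse the YAML and check the top-level keys this guide relies on."""
    config = yaml.safe_load(text)
    for key in ("version", "server", "models", "tools"):  # assumed required keys
        if key not in config:
            raise ValueError(f"missing top-level key: {key}")
    return config

config = validate_config(CONFIG)
print(config["server"]["port"])  # → 5000
```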
**Server Settings:**

```yaml
server:
  host: "0.0.0.0"  # Listen on all interfaces
  port: 5000       # API port
  workers: 4       # Number of worker processes
  timeout: 120     # Request timeout (seconds)
```

**Model Configuration:**
```yaml
models:
  - name: "code-assistant"
    provider: "ollama"    # Options: ollama, openai, anthropic
    model: "codellama:7b"
    temperature: 0.3      # Lower = more deterministic
    max_tokens: 2048
    context_window: 4096
```

**Tools and Capabilities:**
```yaml
tools:
  - name: "filesystem"
    enabled: true
    config:
      allowed_paths:
        - "/home/user/documents"
        - "/home/user/projects"
  - name: "web_search"
    enabled: true
    api_key: "${SEARCH_API_KEY}"
  - name: "code_execution"
    enabled: false  # Disabled for security
```

## Docker Deployment
### docker-compose.yml

```yaml
version: '3.8'

services:
  mcp:
    image: modularcopilot/mcp:latest
    container_name: mcp-server
    restart: unless-stopped
    ports:
      - "5000:5000"
    volumes:
      - ./config:/app/config
      - ./data:/app/data
    environment:
      - MCP_CONFIG_PATH=/app/config/mcp-config.yml
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```

### Deployment Commands
```bash
# Start services
docker-compose up -d

# View logs
docker-compose logs -f mcp

# Test API
curl http://localhost:5000/health

# Pull model
docker exec ollama ollama pull mistral:7b
```

## Advanced Configuration
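The multi-model setup below tags each model with a `use_case`. Whether MCP routes requests on that field automatically isn't documented here, so a safe pattern is to resolve the model name client-side; a sketch:

```python
# Map use_case labels (from the multi-model config below) to model names.
# This routing is a client-side convention, not a documented MCP feature.
MODEL_ROUTES = {
    "general_queries": "general",
    "code_generation": "coding",
    "creative_writing": "creative",
}

def pick_model(use_case: str) -> str:
    """Return the configured model name for a task, falling back to 'general'."""
    return MODEL_ROUTES.get(use_case, "general")

print(pick_model("code_generation"))  # → coding
```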
### Multi-Model Setup

```yaml
models:
  - name: "general"
    provider: "ollama"
    model: "mistral:7b"
    use_case: "general_queries"

  - name: "coding"
    provider: "ollama"
    model: "codellama:13b"
    use_case: "code_generation"

  - name: "creative"
    provider: "openai"
    model: "gpt-4"
    api_key: "${OPENAI_API_KEY}"
    use_case: "creative_writing"
```

### Custom Tool Configuration
```yaml
tools:
  - name: "obsidian_integration"
    type: "custom"
    script: "/app/tools/obsidian_tool.py"
    config:
      vault_path: "/data/obsidian"
      api_endpoint: "http://localhost:27124"

  - name: "n8n_workflows"
    type: "webhook"
    url: "http://n8n:5678/webhook/mcp"
    headers:
      Authorization: "Bearer ${N8N_TOKEN}"
```

## Environment Variables
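Placeholders such as `${SEARCH_API_KEY}` and `${N8N_TOKEN}` in the configs above are resolved from environment variables. MCP's exact substitution rules aren't specified here; a sketch of the common shell-style behavior using Python's `string.Template`:

```python
import os
from string import Template

def expand_placeholders(text: str) -> str:
    """Substitute ${VAR} placeholders from the environment.
    safe_substitute leaves unknown variables untouched rather than raising."""
    return Template(text).safe_substitute(os.environ)

os.environ["SEARCH_API_KEY"] = "demo-key"  # stand-in value for illustration
line = 'api_key: "${SEARCH_API_KEY}"'
print(expand_placeholders(line))  # → api_key: "demo-key"
```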
```bash
# .env file
MCP_LOG_LEVEL=INFO
OLLAMA_HOST=http://localhost:11434
OPENAI_API_KEY=sk-...
SEARCH_API_KEY=...
N8N_TOKEN=...
```

## Integration Examples
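Any HTTP client can call the same `/api/chat` endpoint used in the n8n and Obsidian examples below. A Python sketch using only the standard library (payload shape taken from those examples; it assumes the server from the minimal config is listening on port 5000):

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to MCP's chat endpoint with the payload shape used in this guide."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:5000", "general", "Summarize my Obsidian notes")
# With a running server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```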
### With n8n

```jsonc
// n8n HTTP Request Node
{
  "method": "POST",
  "url": "http://mcp:5000/api/chat",
  "body": {
    "model": "general",
    "messages": [
      {
        "role": "user",
        "content": "Summarize my Obsidian notes"
      }
    ]
  }
}
```

### With Obsidian
```typescript
// Obsidian plugin integration
async function queryMCP(prompt: string) {
  const response = await fetch('http://localhost:5000/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'general',
      messages: [{ role: 'user', content: prompt }]
    })
  });
  if (!response.ok) {
    throw new Error(`MCP request failed: ${response.status}`);
  }
  return await response.json();
}
```

## Security Best Practices
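Once bearer authentication is enabled (first item below), every client call must carry an `Authorization` header. A sketch extending the request-building pattern from the integration examples; the token value here is a placeholder, and in practice it would come from the same `MCP_API_TOKEN` environment variable the config references:

```python
import json
import urllib.request

def build_authed_request(base_url: str, token: str, payload: dict) -> urllib.request.Request:
    """POST to the chat endpoint with a bearer token, as required once auth is on."""
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # matches the bearer config below
        },
        method="POST",
    )

req = build_authed_request(
    "http://localhost:5000",
    "example-token",  # placeholder; read from MCP_API_TOKEN in real code
    {"model": "general", "messages": [{"role": "user", "content": "ping"}]},
)
```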
### 1. Authentication

```yaml
server:
  auth:
    enabled: true
    type: "bearer"
    tokens:
      - "${MCP_API_TOKEN}"
```

### 2. Rate Limiting
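Clients that burst past the limits configured below should back off and retry rather than hammer the server. A sketch of an exponential backoff schedule; the assumption that MCP signals throttling with HTTP 429 is mine, not documented here:

```python
def backoff_delays(retries: int, base: float = 1.0, cap: float = 30.0) -> list[float]:
    """Exponential backoff schedule: base, 2*base, 4*base, ... capped at `cap` seconds."""
    return [min(base * (2 ** i), cap) for i in range(retries)]

# On an HTTP 429 response (assumed status code), sleep backoff_delays(n)[attempt]
# seconds before retry number `attempt`.
print(backoff_delays(5))  # → [1.0, 2.0, 4.0, 8.0, 16.0]
```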
```yaml
server:
  rate_limit:
    enabled: true
    requests_per_minute: 60
    burst: 10
```

### 3. Tool Restrictions
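The server-side restrictions below can also be enforced defensively inside custom tool scripts. A sketch mirroring the `allowed_extensions` and `max_file_size` settings (the byte conversion of "10MB" is illustrative):

```python
from pathlib import Path

ALLOWED_EXTENSIONS = {".md", ".txt"}   # mirrors allowed_extensions below
MAX_FILE_SIZE = 10 * 1024 * 1024       # mirrors max_file_size: "10MB" (assumed MiB)

def is_readable(path: str, size: int) -> bool:
    """Return True only for files matching the filesystem tool's restrictions."""
    return Path(path).suffix in ALLOWED_EXTENSIONS and size <= MAX_FILE_SIZE

print(is_readable("notes/todo.md", 2048))   # → True
print(is_readable("secrets/key.pem", 512))  # → False
```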
```yaml
tools:
  - name: "filesystem"
    enabled: true
    security:
      read_only: true
      allowed_extensions: [".md", ".txt"]
      max_file_size: "10MB"
```

## Troubleshooting
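The manual checks in this section can be scripted. A small probe for the two endpoints used below (ports taken from the docker-compose file above); it only reports reachability and doesn't fix anything:

```python
import urllib.error
import urllib.request

def is_up(url: str, timeout: float = 3.0) -> bool:
    """Return True if the endpoint answers with a successful HTTP response."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False

for name, url in [
    ("MCP", "http://localhost:5000/health"),
    ("Ollama", "http://localhost:11434/api/version"),
]:
    print(f"{name}: {'OK' if is_up(url) else 'unreachable'}")
```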
### Common Issues

**MCP won’t start:**

```bash
# Check logs
docker logs mcp-server

# Validate config
docker exec mcp-server python -c "import yaml; yaml.safe_load(open('/app/config/mcp-config.yml'))"
```

**Model not loading:**
```bash
# Pull model manually
docker exec ollama ollama pull mistral:7b

# Check available models
docker exec ollama ollama list
```

**Connection refused:**
```bash
# Check if services are running
docker ps

# Test connectivity
curl http://localhost:5000/health
curl http://localhost:11434/api/version
```

## Conclusion
MCP provides a flexible framework for AI integration. Proper YAML configuration is key to reliable deployment. Start with basic setup, then gradually add tools and models as needed.
**Key Takeaways:**

- Use environment variables for secrets
- Start with minimal configuration
- Test each tool before enabling
- Monitor resource usage
- Keep configurations in version control
For more information: MCP GitHub