Computer Use File Server + MCP

Recommended: LiteLLM MCP Proxy

Endpoint: http://localhost:8081/mcp/docker_ai

Use your regular LiteLLM API key (NOT the docker-ai internal key).

Available Tools

- bash_tool — run a shell command inside the chat's Docker container
- view — view a file or directory
- create_file — create a new file
- str_replace — edit a file via string replacement
- sub_agent — launch a Claude Code sub-agent inside the container (optionally with external MCP servers, see below)

Usage Examples

Step 1: Initialize (obtain session ID)

curl -sD - -X POST "http://localhost:8081/mcp/docker_ai" \
  -H "Authorization: Bearer <LITELLM_API_KEY>" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "X-Chat-Id: my-unique-session" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0"}}}'

# Response contains header: mcp-session-id: <SESSION_ID>
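The session ID can be parsed out of the header dump that `curl -sD -` prints. A minimal Python sketch (the header text below is simulated; in practice, read it from the curl output):

```python
# Parse the session ID out of a raw header dump, matching the header
# name case-insensitively. Simulated response headers:
raw_headers = """HTTP/1.1 200 OK
mcp-session-id: abc123-session
content-type: text/event-stream"""

session_id = None
for line in raw_headers.splitlines():
    name, _, value = line.partition(":")
    if name.strip().lower() == "mcp-session-id":
        session_id = value.strip()

print(session_id)
```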

Step 2: Call a tool

curl -s -X POST "http://localhost:8081/mcp/docker_ai" \
  -H "Authorization: Bearer <LITELLM_API_KEY>" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <SESSION_ID>" \
  -H "X-Chat-Id: my-unique-session" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"bash_tool","arguments":{"command":"echo Hello","description":"test"}}}'

Required Headers

| Header | Description |
|---|---|
| Authorization | Bearer token from LiteLLM: `Bearer <LITELLM_API_KEY>` |
| Accept | `application/json, text/event-stream` |
| X-Chat-Id | Unique session ID (creates a separate Docker container) |
| Mcp-Session-Id | Session ID from the initialize response |
| X-MCP-Servers | (optional) Comma-separated list of MCP servers for sub_agent (see below) |
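For client code, these headers are straightforward to assemble into a dict. A sketch (the helper name and placeholder values are illustrative):

```python
def mcp_headers(api_key, chat_id, session_id=None, mcp_servers=None):
    """Assemble the required headers for a request to /mcp/docker_ai."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "X-Chat-Id": chat_id,
    }
    if session_id:      # omitted on the first (initialize) request
        headers["Mcp-Session-Id"] = session_id
    if mcp_servers:     # optional, only needed for sub_agent
        headers["X-MCP-Servers"] = ",".join(mcp_servers)
    return headers

h = mcp_headers("<LITELLM_API_KEY>", "my-unique-session", "abc123", ["confluence", "jira"])
print(h["X-MCP-Servers"])
```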

Usage via chat/completions

curl -X POST http://localhost:8081/v1/chat/completions \
  -H "Authorization: Bearer <LITELLM_API_KEY>" \
  -H "X-OpenWebUI-Chat-Id: my-chat-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude/claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Run ls -la"}],
    "tools": [{"type": "mcp", "server_label": "docker_ai"}]
  }'

MCP Servers for sub_agent

To allow sub_agent to use external MCP servers (Confluence, Jira, Grafana, etc.), pass their names via the X-MCP-Servers header.

How it works:

- Available servers: check your LiteLLM instance for the list of registered MCP servers.
- Format: comma-separated server names, as registered in LiteLLM:

  X-MCP-Servers: confluence,jira

Example: sub_agent with Confluence and Jira MCP servers:

curl -s -X POST "http://localhost:8081/mcp/docker_ai" \
  -H "Authorization: Bearer <LITELLM_API_KEY>" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <SESSION_ID>" \
  -H "X-Chat-Id: my-session" \
  -H "X-MCP-Servers: confluence,jira" \
  -H "X-User-Email: user@example.com" \
  -d '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"sub_agent","arguments":{"task":"Find the onboarding page in Confluence","description":"search confluence"}}}'

Result: before launching Claude Code inside the container, a ~/.mcp.json file is created:

{
  "mcpServers": {
    "confluence": {
      "type": "http",
      "url": "http://localhost:8081/mcp/confluence",
      "headers": {
        "Authorization": "Bearer <ANTHROPIC_AUTH_TOKEN from env>",
        "x-openwebui-user-email": "user@example.com"
      }
    },
    "jira": { ... }
  }
}
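The mapping from the `X-MCP-Servers` header value to this config is mechanical. A sketch of the assumed behavior (function name, defaults, and token/email values are illustrative, mirroring the structure shown above):

```python
import json

def build_mcp_config(servers_header, auth_token, user_email,
                     base_url="http://localhost:8081/mcp"):
    """Build an ~/.mcp.json-style config from the X-MCP-Servers header value."""
    names = [s.strip() for s in servers_header.split(",") if s.strip()]
    return {
        "mcpServers": {
            name: {
                "type": "http",
                "url": f"{base_url}/{name}",
                "headers": {
                    "Authorization": f"Bearer {auth_token}",
                    "x-openwebui-user-email": user_email,
                },
            }
            for name in names
        }
    }

config = build_mcp_config("confluence,jira", "<ANTHROPIC_AUTH_TOKEN>", "user@example.com")
print(json.dumps(config, indent=2))
```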

If the X-MCP-Servers header is not provided, sub_agent works without MCP servers (as before).

System Prompt for Integrations

A ready-made system prompt that teaches the AI how to work with the Computer Use VM (files, skills, tools). Use it in n8n, your own agents, or any other integration.

Get the Prompt

# Template with placeholders {file_base_url} and {archive_url}
curl http://localhost:8081/system-prompt

# With substituted URLs (for a specific chat)
curl "http://localhost:8081/system-prompt?file_base_url=http://localhost:8081/files/MY_CHAT_ID&archive_url=http://localhost:8081/files/MY_CHAT_ID/archive"

Usage in n8n

Add an HTTP Request node with GET /system-prompt and insert the result into the system message of your AI agent. Substitute file_base_url and archive_url via query parameters, or replace the placeholders directly in n8n.

File API

GET /files/{chat_id}/{filename}

Download a file from the container's outputs directory

GET /files/{chat_id}/archive

Download all files as a ZIP archive

POST /api/uploads/{chat_id}/{filename}

Upload a file to the container's uploads directory

GET /api/uploads/{chat_id}/manifest

Get the manifest of uploaded files (filename to MD5 mapping)
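The manifest is a filename-to-MD5 mapping, so a client can verify its uploads against it. A sketch of that format (file contents are inlined here; the real endpoint reads the container's uploads directory):

```python
import hashlib

# Example upload contents (illustrative):
files = {"report.txt": b"hello world", "data.csv": b"a,b\n1,2\n"}

# A manifest maps each filename to the MD5 hex digest of its contents.
manifest = {name: hashlib.md5(data).hexdigest() for name, data in files.items()}
print(manifest["report.txt"])
```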

Direct Access to /mcp (for developers)

Warning: Direct access to localhost:8081/mcp requires a special API key. For normal usage, use the LiteLLM endpoint above.

Example (internal use only)

curl -X POST http://localhost:8081/mcp \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <INTERNAL_MCP_API_KEY>" \
  -H "X-Chat-Id: my-session-123" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"bash_tool","arguments":{"command":"echo Hello","description":"test"}}}'

Architecture

┌─────────────────────────────────────────────────────────────────┐
│                        Docker Host                              │
│                                                                 │
│  ┌──────────────────┐      ┌─────────────────────────────────┐ │
│  │  File Server     │      │  Docker Containers              │ │
│  │  (this service)  │      │                                 │ │
│  │                  │      │  ┌─────────────────────────┐   │ │
│  │  POST /mcp ──────┼──────┼──│ owui-chat-{chat_id}     │   │ │
│  │                  │      │  │  - bash_tool            │   │ │
│  │  GET /files/* ───┼──────┼──│  - view                 │   │ │
│  │                  │      │  │  - create_file          │   │ │
│  │                  │      │  │  - str_replace          │   │ │
│  └──────────────────┘      │  └─────────────────────────┘   │ │
│           │                │                                 │ │
│           │                │  ┌─────────────────────────┐   │ │
│           ▼                │  │ owui-chat-{other_id}    │   │ │
│  /var/run/docker.sock      │  └─────────────────────────┘   │ │
│                            └─────────────────────────────────┘ │
│                                                                 │
│  /data/{chat_id}/                                              │
│    ├── uploads/   (files from user)                            │
│    └── outputs/   (files created by AI)                        │
└─────────────────────────────────────────────────────────────────┘