MCP Servers — Timmy's Perception & Action Layer

Two off-the-shelf MCP servers replace all custom perception and action code. Zero lines of infrastructure to maintain: a pip install and a config file.

Architecture

Ollama (DPO model)
    ↓ tool_calls (Hermes protocol)
    ↓
MCP Client (heartbeat loop)
    ├── steam-info-mcp     → game perception (playtime, achievements, friends)
    └── mcp-pyautogui      → desktop action (screenshot, keypress, mouse)

The heartbeat loop is the MCP client. It:

  1. Calls tools/list on each MCP server at startup to discover available tools
  2. Passes tool schemas to Ollama via the tools parameter
  3. When the model returns tool_calls, executes them via tools/call on the right server
  4. Feeds results back to the model as tool role messages
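Step 4 can be sketched in isolation. This is a hedged sketch, not the real loop: execute_tool() is a hypothetical stand-in for the MCP tools/call round-trip, and the hard-coded tool_calls list stands in for what the model would return.

```python
import json

def execute_tool(name, arguments):
    # Hypothetical stand-in for the MCP tools/call round-trip
    return json.dumps({"ok": True, "tool": name, "arguments": arguments})

messages = [{"role": "user", "content": "What did I play recently?"}]

# Suppose the model returned one tool call on the last turn:
tool_calls = [{"function": {"name": "steam-recently-played", "arguments": {}}}]

for call in tool_calls:
    result = execute_tool(call["function"]["name"], call["function"]["arguments"])
    # Each MCP result goes back to the model as a tool-role message,
    # then the next ollama.chat() call continues the conversation
    messages.append({"role": "tool", "content": result})
```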

Servers

steam-info-mcp (#545)

What: Steam Web API exposed as MCP tools. Timmy can see what games are installed, what's been played recently, achievements, friends, news.

Package: steam-info-mcp

Tools available:

Tool                        Description
steam-owned-games           List all owned games
steam-recently-played       Recently played games + hours
steam-player-achievements   Achievements for a game
steam-user-stats            Player stats for a game
steam-current-players       Live player count for a game
steam-news                  Latest news for a game
steam-player-summaries      Player profile info
steam-friend-list           Friends list
steam-level                 Steam level
steam-badges                Badge collection

Requires: STEAM_API_KEY env var. Get one at https://steamcommunity.com/dev/apikey

Run: steam-info-mcp (stdio transport)

mcp-pyautogui (#546)

What: Desktop control via PyAutoGUI exposed as MCP tools. This IS the execute_action() implementation — no wrapper needed.

Package: mcp-pyautogui

Tools available:

Tool                 Description
take_screenshot      Capture screen to file
click                Left-click at (x, y)
right_click          Right-click at (x, y)
double_click         Double-click at (x, y)
move_to              Move mouse to (x, y)
drag_to              Drag mouse to (x, y)
type_text            Type a string
press_key            Press a single key
hotkey               Key combo (e.g., "ctrl c")
scroll               Scroll up/down
get_mouse_position   Current mouse (x, y)
get_screen_size      Screen resolution
pixel_color          RGB at pixel (x, y)
get_os               Current OS name

Requires: macOS Accessibility permissions for Terminal / Python process. System Settings → Privacy & Security → Accessibility.

Run: mcp-pyautogui (stdio transport)

Setup

cd ~/.timmy/timmy-config/mcp
bash setup.sh

How Ollama Connects

Both servers communicate over stdio — they read JSON-RPC from stdin and write to stdout. The heartbeat loop spawns each server as a subprocess and talks to it over pipes.
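One detail the pseudocode below skips: a conforming MCP client sends an initialize request and an initialized notification before calling tools/list. A minimal sketch of those two handshake messages (the protocol version string is an assumption; pin it to whatever revision your servers support):

```python
import json

# MCP handshake, sent as newline-delimited JSON-RPC over the server's stdin
initialize = {
    "jsonrpc": "2.0", "id": 0, "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # assumption: match your servers' spec revision
        "capabilities": {},
        "clientInfo": {"name": "timmy-heartbeat", "version": "0.1"},
    },
}
# A notification has no "id" and gets no response
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

wire = json.dumps(initialize).encode() + b"\n"
```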

Ollama's native tool-calling works like this:

import json
import subprocess

import ollama

# 1. Spawn MCP server (stdio transport)
proc = subprocess.Popen(
    ["mcp-pyautogui"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE
)

# 2. Discover tools (JSON-RPC over stdio). A real client sends the MCP
#    initialize handshake before tools/list; omitted here.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
proc.stdin.write(json.dumps(request).encode() + b"\n")
proc.stdin.flush()
# The response wraps the tool list: {"jsonrpc": ..., "result": {"tools": [...]}}
tools = json.loads(proc.stdout.readline())["result"]["tools"]

# 3. Pass tool schemas to Ollama
response = ollama.chat(
    model="timmy:v0.2-dpo",
    messages=[{"role": "user", "content": "Take a screenshot"}],
    tools=[...convert MCP tools to Ollama format...]
)

# 4. Execute tool calls via MCP (the model may return none)
for call in response["message"].get("tool_calls") or []:
    mcp_request = {
        "jsonrpc": "2.0", "id": 2,
        "method": "tools/call",
        "params": {"name": call["function"]["name"],
                   "arguments": call["function"]["arguments"]}
    }
    proc.stdin.write(json.dumps(mcp_request).encode() + b"\n")
    proc.stdin.flush()
    result = json.loads(proc.stdout.readline())["result"]

This is pseudocode. The actual heartbeat loop (#547) will be ~30 lines of glue connecting Ollama's tool-calling API to MCP's stdio protocol. No custom infrastructure.
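The conversion step elided in the example above is mechanical: MCP's inputSchema is already JSON Schema, which is the shape Ollama's tools parameter expects. A sketch, using the field names from MCP's tools/list result and Ollama's function-calling format:

```python
def mcp_tools_to_ollama(mcp_tools):
    """Map MCP tool descriptors to Ollama's tools= format."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP inputSchema is JSON Schema, which Ollama accepts as-is
                "parameters": t.get("inputSchema",
                                    {"type": "object", "properties": {}}),
            },
        }
        for t in mcp_tools
    ]

ollama_tools = mcp_tools_to_ollama([
    {"name": "take_screenshot", "description": "Capture screen to file",
     "inputSchema": {"type": "object", "properties": {}}},
])
```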

What We Don't Own

  • Steam API integration → steam-info-mcp (beta/steam-info-mcp on GitHub)
  • Desktop automation → mcp-pyautogui (PyAutoGUI wrapper)
  • MCP protocol → JSON-RPC 2.0 over stdio (industry standard)
  • Tool calling → Ollama native (Hermes protocol)
  • Model serving → Ollama

What We Own

  • servers.json — which servers to run and their env vars
  • setup.sh — one-command install
  • This README
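For illustration, one plausible shape for servers.json. Every field name here is an assumption; the actual schema is whatever the heartbeat loop (#547) ends up reading.

```json
{
  "servers": {
    "steam": {
      "command": "steam-info-mcp",
      "env": {"STEAM_API_KEY": "${STEAM_API_KEY}"}
    },
    "desktop": {
      "command": "mcp-pyautogui",
      "env": {}
    }
  }
}
```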