Compare commits: fix/518 ... step35/466 (1 commit, becdb7312b)

docs/LOCAL_HARDWARE_MCP.md (new file, 144 lines)

@@ -0,0 +1,144 @@
# Local Hardware MCP Integration

Integrate the Model Context Protocol (MCP) to allow Timmy agents to control local hardware securely: file system, smart home (Hue lights), and system information.

## Components

- **MCP Server**: `scripts/hardware_mcp_server.py` — stdio-based MCP server exposing 8 tools
- **Config Template**: `timmy-local/hardware_mcp_config.yaml` — runtime tuning
- **Smoke Tests**: `tests/test_hardware_mcp_server.py`

## Prerequisites

```bash
# MCP SDK
pip install mcp

# OpenHue CLI (for smart home control)
brew install openhue/cli/openhue  # macOS
# or see: https://github.com/openhue/openhue-cli

# Optional: psutil for detailed system_info
pip install psutil
```

## Quick Start

### 1. Start the MCP server

The server runs as a subprocess launched by Hermes Agent via the native-MCP integration.

Add to `~/.hermes/config.yaml`:

```yaml
mcp_servers:
  hardware:
    command: "python"
    args: ["/full/path/to/timmy-home/scripts/hardware_mcp_server.py"]
    # Optional: add env vars if needed
    # env:
    #   OPENHUE_BRIDGE_IP: "192.168.1.100"
```

### 2. Restart Hermes

On startup, Hermes will:

1. Launch the hardware MCP server
2. Discover all 8 tools
3. Register them with `hardware_*` prefixes (e.g., `hardware_file_read`, `hardware_light_control`)
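The prefixing in step 3 can be sketched as follows. This is an illustrative helper, not Hermes source code; `register_with_prefix` is a hypothetical name:

```python
# Illustrative sketch (not Hermes source): how discovered MCP tool names
# might be namespaced with the server name from the config ("hardware").
def register_with_prefix(server_name: str, tool_names: list[str]) -> dict[str, str]:
    """Map prefixed agent-facing names back to the bare MCP tool names."""
    return {f"{server_name}_{name}": name for name in tool_names}

registry = register_with_prefix("hardware", ["file_read", "light_control", "system_info"])
print(registry["hardware_light_control"])  # light_control
```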

### 3. Use in conversation

```
User: Read my Timmy report file.
Agent: [calls hardware_file_read with path="~/LOCAL_Timmy_REPORT.md"]

User: Turn off the bedroom lights.
Agent: [calls hardware_light_control with name="Bedroom Lamp", on=false]

User: List files in my downloads folder.
Agent: [calls hardware_file_list with directory="~/Downloads"]

User: What's my system status?
Agent: [calls hardware_system_info]
```

## Tool Reference

| Tool | Purpose | Parameters |
|------|---------|------------|
| `hardware_file_read` | Read file (≤10 MB) from home/tmp | `path` (string) |
| `hardware_file_write` | Write text file | `path`, `content` |
| `hardware_file_list` | List directory contents | `directory` (default: `~`) |
| `hardware_light_list` | List all Hue lights/rooms/scenes | none |
| `hardware_light_control` | Control individual light | `name`, `on`, `brightness`, `color`, `temperature` |
| `hardware_room_control` | Control all lights in a room | `name`, `on`, `brightness` |
| `hardware_scene_set` | Activate Hue scene | `scene`, `room` |
| `hardware_system_info` | System info (OS, CPU, memory, disk) | none |
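On the wire, each tool call travels over stdio as a JSON-RPC 2.0 `tools/call` request, per the MCP specification. A minimal client-side sketch of the envelope (the `req_id` is arbitrary; note the server sees the bare tool name, since the `hardware_` prefix exists only on the Hermes side):

```python
import json

# Hypothetical sketch of the JSON-RPC 2.0 envelope MCP uses for a tool
# invocation: method "tools/call" with the tool name and its arguments.
def tool_call_request(req_id: int, tool: str, arguments: dict) -> dict:
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

req = tool_call_request(1, "light_control", {"name": "Bedroom Lamp", "on": False})
print(json.dumps(req, indent=2))
```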

## Security Model

- **File path allowlist**: Only paths under `~` (home), `/tmp`, and `/private/tmp` are permitted.
- **File size cap**: 10 MB max per read.
- **No arbitrary commands**: Only explicit tool operations; no shell execution.
- **Smart home requires OpenHue CLI**: Light control goes through the official Hue CLI, which handles bridge authentication.
- **Graceful degradation**: If `psutil` is missing, `system_info` returns basic platform data; if `openhue` is missing, light tools return install instructions.
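The allowlist works by resolving the candidate path (collapsing `..` segments and symlinks) and checking that it sits under an allowed root. A sketch of that check, along the lines of the server's own `is_path_allowed` (requires Python 3.9+ for `Path.is_relative_to`):

```python
from pathlib import Path

ALLOWED_DIRS = [str(Path.home()), "/tmp", "/private/tmp"]

def is_path_allowed(path: Path) -> bool:
    # resolve() collapses ".." and follows symlinks, so traversal
    # tricks like /tmp/../etc/passwd are caught.
    try:
        resolved = path.resolve()
        return any(resolved.is_relative_to(Path(d).resolve()) for d in ALLOWED_DIRS)
    except (ValueError, OSError):
        return False

print(is_path_allowed(Path("/etc/passwd")))        # False
print(is_path_allowed(Path.home() / "notes.txt"))  # True
```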

## Runtime Configuration

Edit `~/.timmy/hardware/hardware_mcp_config.yaml` (copy from `timmy-local/hardware_mcp_config.yaml`) to adjust:

```yaml
guards:
  max_consecutive_errors: 3
  max_mcp_calls_per_session: 0  # 0 = unlimited
allowed_dirs:
  - "~"
  - "/tmp"
  - "/private/tmp"
max_file_size_bytes: 10485760  # 10 MB
```
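Once parsed (e.g. with `yaml.safe_load`), the guard values can be enforced with plain dictionary lookups. A hypothetical sketch; the dict literal stands in for the loaded YAML, and the helper names are not part of the server:

```python
# Hypothetical guard checks against a parsed config dict mirroring the
# settings above (flattened here for illustration).
config = {
    "max_consecutive_errors": 3,
    "max_mcp_calls_per_session": 0,     # 0 = unlimited
    "max_file_size_bytes": 10_485_760,  # 10 MB
}

def file_size_ok(size_bytes: int, cfg: dict) -> bool:
    limit = cfg.get("max_file_size_bytes", 10 * 1024 * 1024)
    return size_bytes <= limit

def call_budget_ok(calls_made: int, cfg: dict) -> bool:
    limit = cfg.get("max_mcp_calls_per_session", 0)
    return limit == 0 or calls_made < limit  # 0 means unlimited

print(file_size_ok(1024, config))        # True
print(file_size_ok(20_000_000, config))  # False
```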

## Testing

```bash
# Validate Python syntax
python3 -m py_compile scripts/hardware_mcp_server.py

# Run smoke tests
pytest tests/test_hardware_mcp_server.py -v
```

## Troubleshooting

**MCP tools not appearing in Hermes**

- Verify the `mcp` Python package is installed: `pip show mcp`
- Check `~/.hermes/config.yaml` syntax (YAML parse)
- Restart Hermes (MCP connects at startup only)
- Check Hermes logs in `~/.hermes/logs/` for MCP connection errors

**"openhue CLI not found"**

- Install OpenHue: `brew install openhue/cli/openhue`
- First run requires pressing the Hue Bridge button to pair
- Ensure the bridge is on the same local network

**"Path not allowed"**

- Only home (`~`), `/tmp`, and `/private/tmp` are accessible
- Use absolute paths or `~/` expansion; relative paths are resolved from home

**File too large**

- Max read size is 10 MB. Split or compress large files.
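When a file exceeds the cap, it can be split into chunks before reading. A minimal sketch, staying safely under the server's 10 MB limit; `split_file` is a hypothetical helper, not part of the server:

```python
from pathlib import Path

CHUNK = 9 * 1024 * 1024  # stay safely under the 10 MB read cap

def split_file(src: Path, dest_dir: Path) -> list[Path]:
    """Split src into numbered chunk files, each at most CHUNK bytes."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    parts = []
    with src.open("rb") as f:
        i = 0
        # Walrus loop: read until f.read returns empty bytes
        while data := f.read(CHUNK):
            part = dest_dir / f"{src.name}.part{i:03d}"
            part.write_bytes(data)
            parts.append(part)
            i += 1
    return parts
```

Each resulting part can then be read individually with `hardware_file_read`.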

## Dependencies

| Package | Purpose | Install |
|---------|---------|---------|
| `mcp` | MCP SDK (server framework) | `pip install mcp` |
| `openhue` | Hue light control CLI | `brew install openhue/cli/openhue` |
| `psutil` (optional) | Detailed memory/disk metrics | `pip install psutil` |

## Closes #466

scripts/cross_agent_quality_audit.py (deleted file, 313 lines)

@@ -1,313 +0,0 @@
#!/usr/bin/env python3
"""
Cross-agent quality audit — #518

Fetches all PRs across Timmy_Foundation repos, classifies by agent,
and produces a merge-rate scorecard.

Usage:
    python scripts/cross_agent_quality_audit.py
    python scripts/cross_agent_quality_audit.py --scorecard timmy-config/agent-quality-scorecard.md
"""

import argparse
import json
import os
import re
import sys
from collections import defaultdict
from datetime import datetime, timezone
from pathlib import Path
from typing import Any, Dict, List, Optional

import requests

GITEA_BASE = "https://forge.alexanderwhitestone.com/api/v1"
ORG = "Timmy_Foundation"
TOKEN = os.environ.get("GITEA_TOKEN") or (
    Path.home() / ".config" / "gitea" / "token"
).read_text().strip()

HEADERS = {"Authorization": f"token {TOKEN}"}

# Repos to audit (active code repos)
DEFAULT_REPOS = [
    "timmy-home",
    "hermes-agent",
    "the-nexus",
    "the-door",
    "fleet-ops",
    "burn-fleet",
    "the-playground",
    "compounding-intelligence",
    "the-beacon",
    "second-son-of-timmy",
    "timmy-academy",
    "timmy-config",
]


class AgentClassifier:
    """Classify PRs by agent identity."""

    # PR title prefixes that explicitly name an agent
    AGENT_TITLE_RE = re.compile(
        r"^\[(?P<agent>Claude|Ezra|Allegro|Bezalel|Timmy|Gemini|Kimi|Manus|Codex)\]",
        re.IGNORECASE,
    )

    # Branch patterns that embed agent names
    AGENT_BRANCH_RE = re.compile(
        r"(?P<agent>claude|ezra|allegro|bezalel|timmy|gemini|kimi|manus|codex)",
        re.IGNORECASE,
    )

    @classmethod
    def classify(cls, pr: Dict[str, Any]) -> str:
        title = pr.get("title", "")
        branch = pr.get("head", {}).get("ref", "")
        user = pr.get("user", {}).get("login", "")

        # 1. Explicit title tag like [Claude] or [Ezra]
        m = cls.AGENT_TITLE_RE.match(title)
        if m:
            return m.group("agent").lower()

        # 2. Branch contains agent name (e.g. claude/issue-123)
        m = cls.AGENT_BRANCH_RE.search(branch)
        if m:
            return m.group("agent").lower()

        # 3. Git user mapping
        if user.lower() == "claude":
            return "claude"
        if user.lower() == "rockachopa":
            # Rockachopa is the human / orchestrator — map to "burn-loop"
            return "burn-loop"

        return "unknown"


def fetch_prs(repo: str, state: str = "all", per_page: int = 50) -> List[Dict[str, Any]]:
    """Paginate through all PRs for a repo."""
    prs: List[Dict[str, Any]] = []
    page = 1
    while True:
        url = f"{GITEA_BASE}/repos/{ORG}/{repo}/pulls?state={state}&limit={per_page}&page={page}"
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        prs.extend(batch)
        if len(batch) < per_page:
            break
        page += 1
    return prs


def parse_datetime(dt_str: Optional[str]) -> Optional[datetime]:
    if not dt_str:
        return None
    try:
        return datetime.fromisoformat(dt_str.replace("Z", "+00:00"))
    except ValueError:
        return None


def hours_between(start: Optional[str], end: Optional[str]) -> Optional[float]:
    s = parse_datetime(start)
    e = parse_datetime(end)
    if s and e:
        return (e - s).total_seconds() / 3600
    return None


def audit_repos(repos: List[str]) -> Dict[str, Any]:
    """Run the audit and return aggregated stats."""
    agent_stats: Dict[str, Dict[str, Any]] = defaultdict(
        lambda: {
            "total": 0,
            "merged": 0,
            "closed_unmerged": 0,
            "open": 0,
            "hours_to_merge": [],
            "hours_to_close": [],
            "repos": set(),
            "prs": [],
        }
    )

    repo_stats: Dict[str, Dict[str, Any]] = {}

    for repo in repos:
        print(f"Fetching PRs for {repo} ...", file=sys.stderr)
        try:
            prs = fetch_prs(repo)
        except requests.HTTPError as exc:
            print(f"  SKIP {repo}: {exc}", file=sys.stderr)
            continue

        repo_merged = 0
        repo_total = len(prs)
        for pr in prs:
            agent = AgentClassifier.classify(pr)
            s = agent_stats[agent]
            s["total"] += 1
            s["repos"].add(repo)
            s["prs"].append(
                {
                    "repo": repo,
                    "number": pr["number"],
                    "title": pr["title"],
                    "state": pr["state"],
                    "merged": pr.get("merged", False),
                    "created_at": pr.get("created_at"),
                    "merged_at": pr.get("merged_at"),
                    "closed_at": pr.get("closed_at"),
                }
            )

            if pr.get("merged"):
                s["merged"] += 1
                repo_merged += 1
                h = hours_between(pr.get("created_at"), pr.get("merged_at"))
                if h is not None:
                    s["hours_to_merge"].append(h)
            elif pr["state"] == "closed":
                s["closed_unmerged"] += 1
                h = hours_between(pr.get("created_at"), pr.get("closed_at"))
                if h is not None:
                    s["hours_to_close"].append(h)
            else:
                s["open"] += 1

        repo_stats[repo] = {
            "total": repo_total,
            "merged": repo_merged,
            "merge_rate": round(repo_merged / repo_total, 2) if repo_total else 0,
        }

    # Compute derived metrics
    summary = {}
    for agent, s in sorted(agent_stats.items(), key=lambda x: -x[1]["total"]):
        total = s["total"]
        merged = s["merged"]
        closed = s["closed_unmerged"]
        resolved = merged + closed
        merge_rate = round(merged / resolved, 3) if resolved else 0
        avg_merge_hours = (
            round(sum(s["hours_to_merge"]) / len(s["hours_to_merge"]), 1)
            if s["hours_to_merge"]
            else None
        )
        avg_close_hours = (
            round(sum(s["hours_to_close"]) / len(s["hours_to_close"]), 1)
            if s["hours_to_close"]
            else None
        )
        summary[agent] = {
            "total_prs": total,
            "merged": merged,
            "closed_unmerged": closed,
            "open": s["open"],
            "merge_rate": merge_rate,
            "rejection_rate": round(closed / resolved, 3) if resolved else 0,
            "avg_hours_to_merge": avg_merge_hours,
            "avg_hours_to_close": avg_close_hours,
            "repos": sorted(s["repos"]),
        }

    return {
        "audited_at": datetime.now(timezone.utc).isoformat(),
        "repos_audited": repos,
        "repo_stats": repo_stats,
        "agent_summary": summary,
        "raw_prs": {a: s["prs"] for a, s in agent_stats.items()},
    }


def render_scorecard(data: Dict[str, Any]) -> str:
    """Render a markdown scorecard."""
    lines = [
        "# Cross-Agent Quality Scorecard",
        "",
        f"**Audited at:** {data['audited_at']}",
        f"**Repos audited:** {', '.join(data['repos_audited'])}",
        "",
        "## Per-Agent Summary",
        "",
        "| Agent | Total PRs | Merged | Closed (unmerged) | Open | Merge Rate | Rejection Rate | Avg Hours to Merge | Avg Hours to Close |",
        "|---|---|---:|---:|---:|---:|---:|---:|---:|",
    ]

    for agent, s in data["agent_summary"].items():
        merge_hours = f"{s['avg_hours_to_merge']:.1f}" if s["avg_hours_to_merge"] is not None else "—"
        close_hours = f"{s['avg_hours_to_close']:.1f}" if s["avg_hours_to_close"] is not None else "—"
        lines.append(
            f"| {agent} | {s['total_prs']} | {s['merged']} | {s['closed_unmerged']} | "
            f"{s['open']} | {s['merge_rate']:.1%} | {s['rejection_rate']:.1%} | "
            f"{merge_hours} | {close_hours} |"
        )

    lines.extend([
        "",
        "## Per-Repo Merge Rate",
        "",
        "| Repo | Total PRs | Merged | Merge Rate |",
        "|---|---|---:|---:|",
    ])

    for repo, s in sorted(data["repo_stats"].items(), key=lambda x: -x[1]["total"]):
        lines.append(
            f"| {repo} | {s['total']} | {s['merged']} | {s['merge_rate']:.1%} |"
        )

    lines.extend([
        "",
        "## Methodology",
        "",
        "- **Agent classification** uses three signals in priority order:",
        "  1. Explicit title tag (e.g. `[Claude]`, `[Ezra]`)",
        "  2. Branch name containing agent name (e.g. `claude/issue-123`)",
        "  3. Git user (`claude` → claude, `Rockachopa` → burn-loop)",
        "- **Merge rate** = merged / (merged + closed_unmerged). Open PRs are excluded.",
        "- **Rejection rate** = closed_unmerged / (merged + closed_unmerged).",
        "- **Time metrics** are computed from created_at to merged_at / closed_at.",
        "",
        "## Raw Data",
        "",
        "```json",
        json.dumps(data["agent_summary"], indent=2),
        "```",
        "",
    ])

    return "\n".join(lines) + "\n"


def main() -> int:
    parser = argparse.ArgumentParser(description="Cross-agent quality audit")
    parser.add_argument("--repos", nargs="+", default=DEFAULT_REPOS, help="Repos to audit")
    parser.add_argument("--scorecard", default="timmy-config/agent-quality-scorecard.md", help="Output path")
    parser.add_argument("--json", default=None, help="Also write raw JSON to path")
    args = parser.parse_args()

    data = audit_repos(args.repos)

    scorecard_path = Path(args.scorecard)
    scorecard_path.parent.mkdir(parents=True, exist_ok=True)
    scorecard_path.write_text(render_scorecard(data))
    print(f"Scorecard written to {scorecard_path}", file=sys.stderr)

    if args.json:
        json_path = Path(args.json)
        json_path.parent.mkdir(parents=True, exist_ok=True)
        json_path.write_text(json.dumps(data, indent=2, default=str))
        print(f"Raw JSON written to {json_path}", file=sys.stderr)

    return 0


if __name__ == "__main__":
    raise SystemExit(main())
scripts/hardware_mcp_integration.py (new file, 56 lines)

@@ -0,0 +1,56 @@

#!/usr/bin/env python3
"""Local Hardware MCP operator helper — generate config snippets and verify environment."""

import sys
from pathlib import Path

REPO_ROOT = Path(__file__).resolve().parents[1]
HERMES_CONFIG = Path.home() / ".hermes" / "config.yaml"
HARDWARE_MCP_CONFIG = Path.home() / ".timmy" / "hardware" / "hardware_mcp_config.yaml"
HARDWARE_SERVER = REPO_ROOT / "scripts" / "hardware_mcp_server.py"


def build_mcp_config_snippet() -> str:
    """Return the mcp_servers YAML snippet for ~/.hermes/config.yaml."""
    return f"""mcp_servers:
  hardware:
    command: "python"
    args: ["{HARDWARE_SERVER}"]
"""


def build_wakeup_hook() -> str:
    """Return a bash snippet that can be sourced before Hermes starts (optional)."""
    return """#!/usr/bin/env bash
# Hardware MCP environment check
if command -v openhue >/dev/null 2>&1; then
    echo "[Hardware MCP] OpenHue found: $(openhue version)"
else
    echo "[Hardware MCP] Warning: openhue CLI not installed — light control disabled"
fi
"""


def main():
    import argparse
    p = argparse.ArgumentParser(description="Hardware MCP integration helper")
    p.add_argument("--print-config", action="store_true", help="Print mcp_servers YAML snippet")
    p.add_argument("--print-hook", action="store_true", help="Print optional session-start hook")
    p.add_argument("--verify", action="store_true", help="Verify server script exists and is executable")
    args = p.parse_args()

    if args.print_config:
        print(build_mcp_config_snippet())
    elif args.print_hook:
        print(build_wakeup_hook())
    elif args.verify:
        ok = HARDWARE_SERVER.exists()
        print(f"Server script: {'OK' if ok else 'MISSING'} at {HARDWARE_SERVER}")
        sys.exit(0 if ok else 1)
    else:
        p.print_help()


if __name__ == "__main__":
    main()
scripts/hardware_mcp_server.py (new file, 206 lines)

@@ -0,0 +1,206 @@

#!/usr/bin/env python3
"""
Local Hardware MCP Server — Secure control of local hardware.

Exposes tools for:
- File system operations (read, write, list) within allowed directories
- Smart home control via OpenHue (Philips Hue lights)
- System information (safe, read-only)

Security: Enforces directory allowlist for file access.
"""

import json
import os
import subprocess
import tempfile
from pathlib import Path
from typing import Any

from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import Tool, TextContent

ALLOWED_DIRS = [
    str(Path.home()),                  # user home directory
    "/tmp",                            # macOS symlink to /private/tmp
    "/private/tmp",                    # real tmp path
    str(Path(tempfile.gettempdir())),  # actual system temp dir
]
OPENHUE_CMD = "openhue"
MAX_FILE_SIZE = 10 * 1024 * 1024
app = Server("hardware")


def is_path_allowed(path: Path) -> bool:
    try:
        resolved = path.resolve()
        return any(resolved.is_relative_to(Path(d).resolve()) for d in ALLOWED_DIRS)
    except (ValueError, OSError):
        return False


def run_openhue(args: list[str]) -> dict[str, Any]:
    try:
        result = subprocess.run([OPENHUE_CMD] + args, capture_output=True, text=True, timeout=30)
        return {
            "success": result.returncode == 0,
            "stdout": result.stdout.strip(),
            "stderr": result.stderr.strip(),
            "returncode": result.returncode,
        }
    except FileNotFoundError:
        return {"success": False,
                "error": "openhue CLI not found. Install: brew install openhue/cli/openhue"}
    except Exception as e:
        return {"success": False, "error": str(e)}


@app.list_tools()
async def list_tools():
    return [
        Tool(name="file_read",
             description="Read a file from allowed directories (home, /tmp) up to 10 MB.",
             inputSchema={"type": "object",
                          "properties": {"path": {"type": "string",
                                                  "description": "File path to read (e.g., ~/notes.txt)"}},
                          "required": ["path"]}),
        Tool(name="file_write",
             description="Write text content to a file within allowed directories.",
             inputSchema={"type": "object",
                          "properties": {"path": {"type": "string"},
                                         "content": {"type": "string"}},
                          "required": ["path", "content"]}),
        Tool(name="file_list",
             description="List files and directories in a given folder.",
             inputSchema={"type": "object",
                          "properties": {"directory": {"type": "string", "default": "~"}},
                          "required": []}),
        Tool(name="light_list",
             description="List all Hue lights, rooms, and scenes.",
             inputSchema={"type": "object", "properties": {}, "required": []}),
        Tool(name="light_control",
             description="Control a Hue light: on/off, brightness 0-100, color name/hex, temperature 153-500 mirek.",
             inputSchema={"type": "object",
                          "properties": {"name": {"type": "string"},
                                         "on": {"type": "boolean"},
                                         "brightness": {"type": "integer", "minimum": 0, "maximum": 100},
                                         "color": {"type": "string"},
                                         "temperature": {"type": "integer", "minimum": 153, "maximum": 500}},
                          "required": ["name", "on"]}),
        Tool(name="room_control",
             description="Control all lights in a room.",
             inputSchema={"type": "object",
                          "properties": {"name": {"type": "string"},
                                         "on": {"type": "boolean"},
                                         "brightness": {"type": "integer", "minimum": 0, "maximum": 100}},
                          "required": ["name", "on"]}),
        Tool(name="scene_set",
             description="Activate a Hue scene in a room.",
             inputSchema={"type": "object",
                          "properties": {"scene": {"type": "string"}, "room": {"type": "string"}},
                          "required": ["scene", "room"]}),
        Tool(name="system_info",
             description="Get safe system info: OS, CPU count, memory, disk usage.",
             inputSchema={"type": "object", "properties": {}, "required": []}),
    ]


@app.call_tool()
async def call_tool(name: str, arguments: dict):
    if name == "file_read":
        path = Path(arguments["path"].strip()).expanduser()
        if not is_path_allowed(path):
            return [TextContent(type="text", text=json.dumps({"error": f"Path not allowed: {path}"}))]
        if not path.is_file():
            return [TextContent(type="text", text=json.dumps({"error": f"File not found: {path}"}))]
        try:
            size = path.stat().st_size
            if size > MAX_FILE_SIZE:
                return [TextContent(type="text", text=json.dumps({"error": f"File too large: {size} bytes"}))]
            content = path.read_text()
            return [TextContent(type="text", text=json.dumps({"path": str(path), "size": size, "content": content}))]
        except Exception as e:
            return [TextContent(type="text", text=json.dumps({"error": str(e)}))]

    elif name == "file_write":
        path = Path(arguments["path"].strip()).expanduser()
        if not is_path_allowed(path):
            return [TextContent(type="text", text=json.dumps({"error": f"Path not allowed: {path}"}))]
        try:
            path.parent.mkdir(parents=True, exist_ok=True)
            path.write_text(arguments["content"])
            return [TextContent(type="text", text=json.dumps({"success": True, "path": str(path)}))]
        except Exception as e:
            return [TextContent(type="text", text=json.dumps({"error": str(e)}))]

    elif name == "file_list":
        directory = Path(arguments.get("directory", "~").strip()).expanduser()
        if not is_path_allowed(directory):
            return [TextContent(type="text", text=json.dumps({"error": f"Directory not allowed: {directory}"}))]
        if not directory.is_dir():
            return [TextContent(type="text", text=json.dumps({"error": f"Not a directory: {directory}"}))]
        try:
            entries = []
            for entry in sorted(directory.iterdir()):
                try:
                    stat = entry.stat()
                    entries.append({"name": entry.name, "is_dir": entry.is_dir(),
                                    "size": stat.st_size if entry.is_file() else None})
                except (OSError, PermissionError):
                    pass
            return [TextContent(type="text", text=json.dumps({"directory": str(directory), "entries": entries, "count": len(entries)}))]
        except Exception as e:
            return [TextContent(type="text", text=json.dumps({"error": str(e)}))]

    elif name == "light_list":
        r = run_openhue(["get", "light"])
        return [TextContent(type="text", text=json.dumps(r))]

    elif name == "light_control":
        # Pass the light name as its own argv element; subprocess does not
        # use a shell, so wrapping it in extra quotes would break matching.
        args = ["set", "light", arguments["name"]]
        if arguments.get("on") is not None:
            args.append("--on" if arguments["on"] else "--off")
        # Check "is not None" so a value of 0 is not silently dropped, and
        # pass each flag and its value as separate argv elements.
        if (brightness := arguments.get("brightness")) is not None:
            args.extend(["--brightness", str(brightness)])
        if color := arguments.get("color"):
            args.extend(["--color", color])
        if (temperature := arguments.get("temperature")) is not None:
            args.extend(["--temperature", str(temperature)])
        return [TextContent(type="text", text=json.dumps(run_openhue(args)))]

    elif name == "room_control":
        args = ["set", "room", arguments["name"]]
        if arguments.get("on") is not None:
            args.append("--on" if arguments["on"] else "--off")
        if (brightness := arguments.get("brightness")) is not None:
            args.extend(["--brightness", str(brightness)])
        return [TextContent(type="text", text=json.dumps(run_openhue(args)))]

    elif name == "scene_set":
        args = ["set", "scene", arguments["scene"], "--room", arguments["room"]]
        return [TextContent(type="text", text=json.dumps(run_openhue(args)))]

    elif name == "system_info":
        try:
            import platform
            info = {"platform": platform.system(), "release": platform.release(),
                    "arch": platform.machine(), "hostname": platform.node(),
                    "cpu_count": os.cpu_count()}
            try:
                import psutil
                mem = psutil.virtual_memory()
                info["memory_gb"] = round(mem.total / (1024**3), 2)
                disk = psutil.disk_usage(str(Path.home()))
                info["disk_home_gb"] = round(disk.total / (1024**3), 2)
            except ImportError:
                info["memory_gb"] = "psutil not installed"
                info["disk_home_gb"] = "psutil not installed"
            return [TextContent(type="text", text=json.dumps(info, indent=2))]
        except Exception as e:
            return [TextContent(type="text", text=json.dumps({"error": str(e)}))]

    else:
        return [TextContent(type="text", text=json.dumps({
            "error": f"Unknown tool: {name}",
            "available": ["file_read", "file_write", "file_list", "light_list",
                          "light_control", "room_control", "scene_set", "system_info"],
        }))]


async def main():
    async with stdio_server() as (rs, ws):
        await app.run(rs, ws, app.create_initialization_options())


if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
@@ -1,45 +0,0 @@

"""Tests for cross_agent_quality_audit.py — #518."""

import pytest
import sys
from pathlib import Path

sys.path.insert(0, str(Path(__file__).parent.parent / "scripts"))

from cross_agent_quality_audit import AgentClassifier, hours_between


class TestAgentClassifier:
    def test_title_tag_claude(self):
        pr = {"title": "[Claude] fix auth middleware", "head": {"ref": "fix/123"}, "user": {"login": "rockachopa"}}
        assert AgentClassifier.classify(pr) == "claude"

    def test_title_tag_ezra(self):
        pr = {"title": "[Ezra] tmux fleet launcher", "head": {"ref": "burn/10"}, "user": {"login": "rockachopa"}}
        assert AgentClassifier.classify(pr) == "ezra"

    def test_branch_name_claude(self):
        pr = {"title": "fix auth", "head": {"ref": "claude/issue-1695"}, "user": {"login": "rockachopa"}}
        assert AgentClassifier.classify(pr) == "claude"

    def test_user_mapping(self):
        pr = {"title": "some fix", "head": {"ref": "fix/1"}, "user": {"login": "claude"}}
        assert AgentClassifier.classify(pr) == "claude"

    def test_rockachopa_maps_to_burn_loop(self):
        pr = {"title": "some fix", "head": {"ref": "fix/1"}, "user": {"login": "Rockachopa"}}
        assert AgentClassifier.classify(pr) == "burn-loop"

    def test_unknown_fallback(self):
        pr = {"title": "some fix", "head": {"ref": "fix/1"}, "user": {"login": "random"}}
        assert AgentClassifier.classify(pr) == "unknown"


class TestHoursBetween:
    def test_same_day(self):
        h = hours_between("2026-04-22T10:00:00Z", "2026-04-22T12:00:00Z")
        assert h == 2.0

    def test_none_returns_none(self):
        assert hours_between(None, "2026-04-22T12:00:00Z") is None
        assert hours_between("2026-04-22T10:00:00Z", None) is None
tests/test_hardware_mcp_functional.py (new file, 51 lines)

@@ -0,0 +1,51 @@

#!/usr/bin/env python3
"""Functional test for hardware_mcp_server — falls back to get_event_loop for restricted envs."""

import asyncio
import json
import sys
import tempfile
from pathlib import Path

sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from scripts.hardware_mcp_server import call_tool, is_path_allowed


async def run_tests():
    # Path allowlist
    assert is_path_allowed(Path.home() / "any.txt")
    assert is_path_allowed(Path("/tmp/foo"))
    assert not is_path_allowed(Path("/etc/passwd"))
    print("✓ Path allowlist")

    # file_list on home
    res = await call_tool("file_list", {"directory": "~"})
    data = json.loads(res[0].text)
    assert "entries" in data and data["count"] >= 0
    print(f"✓ file_list works, entries: {data['count']}")

    # file_write + file_read round-trip in temp dir
    with tempfile.TemporaryDirectory() as td:
        fp = Path(td) / "hmcp_test.txt"
        content = "Hardware MCP round-trip OK"
        w = await call_tool("file_write", {"path": str(fp), "content": content})
        assert json.loads(w[0].text).get("success")
        r = await call_tool("file_read", {"path": str(fp)})
        assert json.loads(r[0].text)["content"] == content
        print("✓ file write/read round-trip")

    # file_read error: missing file
    err = await call_tool("file_read", {"path": str(Path.home() / "no_such_file_xyz")})
    assert "error" in json.loads(err[0].text)
    print("✓ file_read reports missing file")

    # Security: path traversal blocked
    block = await call_tool("file_read", {"path": "/etc/passwd"})
    bd = json.loads(block[0].text)
    assert "not allowed" in bd.get("error", "").lower()
    print("✓ Path traversal blocked")

    print("\nAll functional checks passed!")


if __name__ == "__main__":
    # Prefer asyncio.run; fall back to get_event_loop where it is unavailable
    try:
        asyncio.run(run_tests())
    except RuntimeError:
        loop = asyncio.get_event_loop()
        loop.run_until_complete(run_tests())
126
tests/test_hardware_mcp_server.py
Normal file
126
tests/test_hardware_mcp_server.py
Normal file
@@ -0,0 +1,126 @@
|
||||
#!/usr/bin/env python3
"""Smoke tests for hardware_mcp_server."""

import json
import os
import subprocess
import sys
import tempfile
from pathlib import Path
from unittest import TestCase

# Add repo root to path
ROOT = Path(__file__).resolve().parent.parent
sys.path.insert(0, str(ROOT))


class TestHardwareMCPToolDefinitions(TestCase):
    """Verify the MCP server is well-formed and tools have required schemas."""

    def test_server_imports(self):
        """Server module must import cleanly."""
        import importlib.util
        spec = importlib.util.spec_from_file_location(
            "hardware_mcp_server",
            ROOT / "scripts" / "hardware_mcp_server.py",
        )
        self.assertIsNotNone(spec)
        mod = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(mod)
        self.assertTrue(hasattr(mod, "app"))
        self.assertTrue(hasattr(mod, "list_tools"))
        self.assertTrue(hasattr(mod, "call_tool"))

    def test_list_tools_returns_all_eight_tools(self):
        """list_tools() must return the full tool set covering file ops, lights, and system info."""
        import asyncio
        from scripts.hardware_mcp_server import list_tools
        tools = asyncio.run(list_tools())
        tool_names = [t.name for t in tools]
        # Core capabilities
        self.assertIn("file_read", tool_names)
        self.assertIn("file_write", tool_names)
        self.assertIn("file_list", tool_names)
        self.assertIn("light_list", tool_names)
        self.assertIn("light_control", tool_names)
        self.assertIn("room_control", tool_names)
        self.assertIn("scene_set", tool_names)
        self.assertIn("system_info", tool_names)
        self.assertGreaterEqual(len(tools), 8)

    def test_file_read_schema_requires_path(self):
        """file_read tool must require the 'path' parameter."""
        import asyncio
        from scripts.hardware_mcp_server import list_tools
        tools = asyncio.run(list_tools())
        ft = next(t for t in tools if t.name == "file_read")
        self.assertIn("path", ft.inputSchema["properties"])
        self.assertIn("path", ft.inputSchema["required"])

    def test_light_control_schema_requires_name_and_on(self):
        """light_control must require 'name' and 'on'."""
        import asyncio
        from scripts.hardware_mcp_server import list_tools
        tools = asyncio.run(list_tools())
        ft = next(t for t in tools if t.name == "light_control")
        self.assertIn("name", ft.inputSchema["required"])
        self.assertIn("on", ft.inputSchema["required"])

    def test_system_info_is_readonly(self):
        """system_info tool takes no arguments."""
        import asyncio
        from scripts.hardware_mcp_server import list_tools
        tools = asyncio.run(list_tools())
        ft = next(t for t in tools if t.name == "system_info")
        self.assertEqual(ft.inputSchema.get("required", []), [])
        self.assertEqual(len(ft.inputSchema.get("properties", {})), 0)

    def test_file_write_path_allowed_check(self):
        """File writes must enforce the path allowlist (regression guard)."""
        from scripts.hardware_mcp_server import is_path_allowed
        self.assertTrue(is_path_allowed(Path.home() / "test.txt"))
        self.assertTrue(is_path_allowed(Path("/tmp/test.txt")))
        # Paths outside the allowed directories must be rejected
        self.assertFalse(is_path_allowed(Path("/etc/passwd")))

    def test_run_openhue_error_handling(self):
        """The openhue runner returns a structured error when the CLI is missing."""
        from scripts.hardware_mcp_server import run_openhue
        result = run_openhue(["get", "light"])
        # On a system without openhue, this must return success=False with a helpful error
        self.assertIn("success", result)
        if not result.get("success"):
            self.assertIn("error", result)
            self.assertIn("openhue", result.get("error", "").lower())


class TestHardwareMCPConfigCompleteness(TestCase):
    """Validate that the config template matches the tool set."""

    def test_config_template_exists(self):
        self.assertTrue((ROOT / "timmy-local" / "hardware_mcp_config.yaml").exists())

    def test_config_lists_all_tools(self):
        with open(ROOT / "timmy-local" / "hardware_mcp_config.yaml") as f:
            content = f.read()
        # All tool names should appear in the tools: section
        for tool in ["file_read", "file_write", "file_list", "light_list",
                     "light_control", "room_control", "scene_set", "system_info"]:
            self.assertIn(tool, content, f"Tool {tool} missing from config tools list")

    def test_config_has_security_guards(self):
        with open(ROOT / "timmy-local" / "hardware_mcp_config.yaml") as f:
            content = f.read()
        self.assertIn("max_consecutive_errors", content)
        self.assertIn("allowed_dirs", content)
        self.assertIn("max_file_size_bytes", content)

    def test_config_has_server_key(self):
        with open(ROOT / "timmy-local" / "hardware_mcp_config.yaml") as f:
            content = f.read()
        self.assertIn("server_key: hardware", content)


if __name__ == "__main__":
    import unittest
    unittest.main()
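The structured-error contract exercised by `test_run_openhue_error_handling` can be sketched as follows. This is a minimal illustration, not the server's actual implementation: the function body and the install-hint wording are assumptions; only the `{"success": ..., "error": ...}` shape is taken from the test above.

```python
import shutil
import subprocess


def run_openhue(args: list[str]) -> dict:
    """Sketch of a structured CLI runner; the real one lives in scripts/hardware_mcp_server.py."""
    # Resolve the binary first so a missing CLI yields a structured error
    # instead of raising FileNotFoundError.
    exe = shutil.which("openhue")
    if exe is None:
        return {
            "success": False,
            "error": "openhue CLI not found in PATH (brew install openhue/cli/openhue)",
        }
    proc = subprocess.run([exe, *args], capture_output=True, text=True)
    if proc.returncode != 0:
        return {"success": False, "error": proc.stderr.strip() or "openhue command failed"}
    return {"success": True, "output": proc.stdout}
```

Returning a dict rather than raising keeps every tool response JSON-serializable, which is what lets the test assert on `"success"` and `"error"` keys uniformly.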
@@ -1,244 +0,0 @@
# Cross-Agent Quality Scorecard

**Audited at:** 2026-04-22T06:17:43.574309+00:00
**Repos audited:** timmy-home, hermes-agent, the-nexus, the-door, fleet-ops, burn-fleet, the-playground, compounding-intelligence, the-beacon, second-son-of-timmy, timmy-academy, timmy-config

## Per-Agent Summary

| Agent | Total PRs | Merged | Closed (unmerged) | Open | Merge Rate | Rejection Rate | Avg Hours to Merge | Avg Hours to Close |
|---|---|---:|---:|---:|---:|---:|---:|---:|
| burn-loop | 1733 | 346 | 1239 | 148 | 21.8% | 78.2% | 18.9 | 20.6 |
| unknown | 843 | 598 | 214 | 31 | 73.6% | 26.4% | 2.3 | 11.3 |
| claude | 264 | 138 | 121 | 5 | 53.3% | 46.7% | 3.3 | 6.2 |
| gemini | 95 | 24 | 70 | 1 | 25.5% | 74.5% | 0.5 | 11.3 |
| timmy | 28 | 15 | 11 | 2 | 57.7% | 42.3% | 9.8 | 20.2 |
| bezalel | 21 | 11 | 9 | 1 | 55.0% | 45.0% | 2.7 | 8.0 |
| allegro | 21 | 7 | 11 | 3 | 38.9% | 61.1% | 31.1 | 20.2 |
| ezra | 8 | 2 | 3 | 3 | 40.0% | 60.0% | 4.4 | 16.8 |
| kimi | 6 | 3 | 3 | 0 | 50.0% | 50.0% | 39.5 | 0.5 |
| manus | 6 | 5 | 1 | 0 | 83.3% | 16.7% | 0.0 | 18.8 |
| codex | 2 | 2 | 0 | 0 | 100.0% | 0.0% | 2.3 | — |

## Per-Repo Merge Rate

| Repo | Total PRs | Merged | Merge Rate |
|---|---|---:|---:|
| the-nexus | 985 | 501 | 51.0% |
| hermes-agent | 519 | 128 | 25.0% |
| timmy-config | 404 | 140 | 35.0% |
| timmy-home | 270 | 104 | 39.0% |
| fleet-ops | 266 | 84 | 32.0% |
| the-beacon | 175 | 62 | 35.0% |
| the-door | 153 | 31 | 20.0% |
| second-son-of-timmy | 111 | 82 | 74.0% |
| compounding-intelligence | 50 | 9 | 18.0% |
| the-playground | 44 | 2 | 5.0% |
| burn-fleet | 38 | 2 | 5.0% |
| timmy-academy | 12 | 6 | 50.0% |

## Methodology

- **Agent classification** uses three signals in priority order:
  1. Explicit title tag (e.g. `[Claude]`, `[Ezra]`)
  2. Branch name containing the agent name (e.g. `claude/issue-123`)
  3. Git user (`claude` → claude, `Rockachopa` → burn-loop)
- **Merge rate** = merged / (merged + closed_unmerged). Open PRs are excluded.
- **Rejection rate** = closed_unmerged / (merged + closed_unmerged).
- **Time metrics** are computed from `created_at` to `merged_at` / `closed_at`.
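The rate definitions in the methodology can be written down directly. A small sketch; the function name is illustrative, and the sample values come from the burn-loop row above.

```python
def rates(merged: int, closed_unmerged: int) -> tuple[float, float]:
    """Merge and rejection rates per the methodology: open PRs are excluded."""
    resolved = merged + closed_unmerged
    return merged / resolved, closed_unmerged / resolved


# burn-loop row: 346 merged, 1239 closed unmerged
merge_rate, rejection_rate = rates(346, 1239)
print(round(merge_rate, 3), round(rejection_rate, 3))  # 0.218 0.782
```

Note that the two rates always sum to 1 by construction, which is why the table only needs one of them to be audited.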
## Raw Data

```json
{
  "burn-loop": {
    "total_prs": 1733,
    "merged": 346,
    "closed_unmerged": 1239,
    "open": 148,
    "merge_rate": 0.218,
    "rejection_rate": 0.782,
    "avg_hours_to_merge": 18.9,
    "avg_hours_to_close": 20.6,
    "repos": [
      "burn-fleet",
      "compounding-intelligence",
      "fleet-ops",
      "hermes-agent",
      "second-son-of-timmy",
      "the-beacon",
      "the-door",
      "the-nexus",
      "the-playground",
      "timmy-academy",
      "timmy-config",
      "timmy-home"
    ]
  },
  "unknown": {
    "total_prs": 843,
    "merged": 598,
    "closed_unmerged": 214,
    "open": 31,
    "merge_rate": 0.736,
    "rejection_rate": 0.264,
    "avg_hours_to_merge": 2.3,
    "avg_hours_to_close": 11.3,
    "repos": [
      "fleet-ops",
      "hermes-agent",
      "second-son-of-timmy",
      "the-beacon",
      "the-door",
      "the-nexus",
      "timmy-academy",
      "timmy-config",
      "timmy-home"
    ]
  },
  "claude": {
    "total_prs": 264,
    "merged": 138,
    "closed_unmerged": 121,
    "open": 5,
    "merge_rate": 0.533,
    "rejection_rate": 0.467,
    "avg_hours_to_merge": 3.3,
    "avg_hours_to_close": 6.2,
    "repos": [
      "hermes-agent",
      "the-nexus",
      "timmy-config",
      "timmy-home"
    ]
  },
  "gemini": {
    "total_prs": 95,
    "merged": 24,
    "closed_unmerged": 70,
    "open": 1,
    "merge_rate": 0.255,
    "rejection_rate": 0.745,
    "avg_hours_to_merge": 0.5,
    "avg_hours_to_close": 11.3,
    "repos": [
      "hermes-agent",
      "the-nexus",
      "timmy-config",
      "timmy-home"
    ]
  },
  "timmy": {
    "total_prs": 28,
    "merged": 15,
    "closed_unmerged": 11,
    "open": 2,
    "merge_rate": 0.577,
    "rejection_rate": 0.423,
    "avg_hours_to_merge": 9.8,
    "avg_hours_to_close": 20.2,
    "repos": [
      "burn-fleet",
      "hermes-agent",
      "the-nexus",
      "timmy-config",
      "timmy-home"
    ]
  },
  "bezalel": {
    "total_prs": 21,
    "merged": 11,
    "closed_unmerged": 9,
    "open": 1,
    "merge_rate": 0.55,
    "rejection_rate": 0.45,
    "avg_hours_to_merge": 2.7,
    "avg_hours_to_close": 8.0,
    "repos": [
      "burn-fleet",
      "hermes-agent",
      "the-beacon",
      "the-nexus",
      "timmy-config",
      "timmy-home"
    ]
  },
  "allegro": {
    "total_prs": 21,
    "merged": 7,
    "closed_unmerged": 11,
    "open": 3,
    "merge_rate": 0.389,
    "rejection_rate": 0.611,
    "avg_hours_to_merge": 31.1,
    "avg_hours_to_close": 20.2,
    "repos": [
      "burn-fleet",
      "hermes-agent",
      "the-beacon",
      "the-nexus",
      "timmy-config",
      "timmy-home"
    ]
  },
  "ezra": {
    "total_prs": 8,
    "merged": 2,
    "closed_unmerged": 3,
    "open": 3,
    "merge_rate": 0.4,
    "rejection_rate": 0.6,
    "avg_hours_to_merge": 4.4,
    "avg_hours_to_close": 16.8,
    "repos": [
      "burn-fleet",
      "fleet-ops",
      "timmy-config",
      "timmy-home"
    ]
  },
  "kimi": {
    "total_prs": 6,
    "merged": 3,
    "closed_unmerged": 3,
    "open": 0,
    "merge_rate": 0.5,
    "rejection_rate": 0.5,
    "avg_hours_to_merge": 39.5,
    "avg_hours_to_close": 0.5,
    "repos": [
      "hermes-agent",
      "the-nexus",
      "timmy-home"
    ]
  },
  "manus": {
    "total_prs": 6,
    "merged": 5,
    "closed_unmerged": 1,
    "open": 0,
    "merge_rate": 0.833,
    "rejection_rate": 0.167,
    "avg_hours_to_merge": 0.0,
    "avg_hours_to_close": 18.8,
    "repos": [
      "the-nexus",
      "timmy-config"
    ]
  },
  "codex": {
    "total_prs": 2,
    "merged": 2,
    "closed_unmerged": 0,
    "open": 0,
    "merge_rate": 1.0,
    "rejection_rate": 0.0,
    "avg_hours_to_merge": 2.3,
    "avg_hours_to_close": null,
    "repos": [
      "timmy-config",
      "timmy-home"
    ]
  }
}
```
3
timmy-local/hardware/README.md
Normal file
@@ -0,0 +1,3 @@
# Hardware MCP config

Copy `hardware_mcp_config.yaml` to `~/.timmy/hardware/hardware_mcp_config.yaml` to enable runtime tuning.
67
timmy-local/hardware_mcp_config.yaml
Normal file
@@ -0,0 +1,67 @@
# ═══════════════════════════════════════════════════════════════════════
# Local Hardware MCP — Runtime Configuration
# ═══════════════════════════════════════════════════════════════════════
# Edit this file to tune hardware control settings.
# Hermes loads this at session start when the hardware MCP server is enabled.
#
# Location: ~/.timmy/hardware/hardware_mcp_config.yaml
# ═══════════════════════════════════════════════════════════════════════

# ── Server Identity ───────────────────────────────────────────────────
server_key: hardware

# ── Tool Names ────────────────────────────────────────────────────────
# Exact tool names Hermes registers. Update if you rename tools in
# hardware_mcp_server.py.
tools:
  - name: file_read
    hint: "Read a file from an allowed directory (home, /tmp). Max 10 MB."
  - name: file_write
    hint: "Write text content to a file within allowed directories."
  - name: file_list
    hint: "List files and directories in a given folder."
  - name: light_list
    hint: "List all Hue lights, rooms, and scenes from OpenHue."
  - name: light_control
    hint: "Control a specific Hue light: on/off, brightness, color, temperature."
  - name: room_control
    hint: "Control all lights in a room: on/off, brightness."
  - name: scene_set
    hint: "Activate a Hue scene in a room."
  - name: system_info
    hint: "Get safe system information: OS, CPU count, memory usage, disk space."

# ── Security Guards ───────────────────────────────────────────────────
guards:
  # Maximum consecutive tool errors before stopping.
  max_consecutive_errors: 3

  # Max total hardware MCP calls per session (0 = unlimited).
  max_mcp_calls_per_session: 0

  # Allowed directories for file operations (expanded paths).
  allowed_dirs:
    - "~"
    - "/tmp"
    - "/private/tmp"

  # Maximum file size for reads (bytes).
  max_file_size_bytes: 10485760  # 10 MB

# ── OpenHue ───────────────────────────────────────────────────────────
# Path to openhue CLI (auto-detected if in PATH).
openhue_command: "openhue"

# ── Dependencies ──────────────────────────────────────────────────────
# Prerequisites:
#   - OpenHue CLI: brew install openhue/cli/openhue (macOS) or see https://github.com/openhue/openhue-cli
#   - MCP SDK: pip install mcp
#   - For system_info: pip install psutil (optional, for detailed memory/disk metrics)
#
# Config in ~/.hermes/config.yaml:
#   mcp_servers:
#     hardware:
#       command: "python"
#       args: ["/Users/you/path/to/timmy-home/scripts/hardware_mcp_server.py"]
#       env:
#         OPENHUE_BRIDGE_IP: "192.168.1.xx"  # optional, if openhue needs it
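The `allowed_dirs` guard in the config above can be enforced with a check like the following sketch. The real `is_path_allowed` lives in `scripts/hardware_mcp_server.py`; the constant name and exact logic here are assumptions that mirror the config's three directory entries.

```python
from pathlib import Path

# Mirrors the allowed_dirs entries in the config above
ALLOWED_DIRS = [Path("~").expanduser(), Path("/tmp"), Path("/private/tmp")]


def is_path_allowed(path: Path) -> bool:
    """True only if the fully resolved path sits under one of the allowed roots."""
    # resolve() collapses symlinks and '..' segments, so traversal tricks
    # like /tmp/../etc/passwd are rejected.
    resolved = Path(path).expanduser().resolve()
    return any(resolved.is_relative_to(root.resolve()) for root in ALLOWED_DIRS)


print(is_path_allowed(Path.home() / "notes.txt"))  # True
print(is_path_allowed(Path("/etc/passwd")))        # False
```

Resolving before comparing is the important part: checking the raw string prefix would let `..` segments or symlinks escape the allowlist.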