Compare commits
3 Commits
research/r
...
burn/590-1
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
9b5ec4b68e | ||
| c64eb5e571 | |||
| c73dc96d70 |
@@ -20,5 +20,5 @@ jobs:
|
||||
echo "PASS: All files parse"
|
||||
- name: Secret scan
|
||||
run: |
|
||||
if grep -rE 'sk-or-|sk-ant-|ghp_|AKIA' . --include='*.yml' --include='*.py' --include='*.sh' 2>/dev/null | grep -v .gitea; then exit 1; fi
|
||||
if grep -rE 'sk-or-|sk-ant-|ghp_|AKIA' . --include='*.yml' --include='*.py' --include='*.sh' 2>/dev/null | grep -v '.gitea' | grep -v 'detect_secrets' | grep -v 'test_trajectory_sanitize'; then exit 1; fi
|
||||
echo "PASS: No secrets"
|
||||
|
||||
@@ -45,7 +45,8 @@ def append_event(session_id: str, event: dict, base_dir: str | Path = DEFAULT_BA
|
||||
path.parent.mkdir(parents=True, exist_ok=True)
|
||||
payload = dict(event)
|
||||
payload.setdefault("timestamp", datetime.now(timezone.utc).isoformat())
|
||||
# Optimized for <50ms latency\n with path.open("a", encoding="utf-8", buffering=1024) as f:
|
||||
# Optimized for <50ms latency
|
||||
with path.open("a", encoding="utf-8", buffering=1024) as f:
|
||||
f.write(json.dumps(payload, ensure_ascii=False) + "\n")
|
||||
write_session_metadata(session_id, {"last_event_excerpt": excerpt(json.dumps(payload, ensure_ascii=False), 400)}, base_dir)
|
||||
return path
|
||||
|
||||
@@ -271,7 +271,7 @@ Period: Last {hours} hours
|
||||
{chr(10).join([f"- {count} {atype} ({size or 0} bytes)" for count, atype, size in artifacts]) if artifacts else "- None recorded"}
|
||||
|
||||
## Recommendations
|
||||
{""" + self._generate_recommendations(hb_count, avg_latency, uptime_pct)
|
||||
""" + self._generate_recommendations(hb_count, avg_latency, uptime_pct)
|
||||
|
||||
return report
|
||||
|
||||
|
||||
63
research/03-rag-vs-context-framework.md
Normal file
63
research/03-rag-vs-context-framework.md
Normal file
@@ -0,0 +1,63 @@
|
||||
# Research: Long Context vs RAG Decision Framework
|
||||
|
||||
**Date**: 2026-04-13
|
||||
**Research Backlog Item**: 4.3 (Impact: 4, Effort: 1, Ratio: 4.0)
|
||||
**Status**: Complete
|
||||
|
||||
## Current State of the Fleet
|
||||
|
||||
### Context Windows by Model/Provider
|
||||
| Model | Context Window | Our Usage |
|
||||
|-------|---------------|-----------|
|
||||
| xiaomi/mimo-v2-pro (Nous) | 128K | Primary workhorse (Hermes) |
|
||||
| gpt-4o (OpenAI) | 128K | Fallback, complex reasoning |
|
||||
| claude-3.5-sonnet (Anthropic) | 200K | Heavy analysis tasks |
|
||||
| gemma-3 (local/Ollama) | 8K | Local inference |
|
||||
| gemma-3-27b (RunPod) | 128K | Sovereign inference |
|
||||
|
||||
### How We Currently Inject Context
|
||||
1. **Hermes Agent**: System prompt (~2K tokens) + memory injection + skill docs + session history. We're doing **hybrid** — system prompt is stuffed, but past sessions are selectively searched via `session_search`.
|
||||
2. **Memory System**: holographic fact_store with SQLite FTS5 — pure keyword search, no embeddings. Effectively RAG without the vector part.
|
||||
3. **Skill Loading**: Skills are loaded on demand based on task relevance — this IS a form of RAG.
|
||||
4. **Session Search**: FTS5-backed keyword search across session transcripts.
|
||||
|
||||
### Analysis: Are We Over-Retrieving?
|
||||
|
||||
**YES for some workloads.** Our models support 128K+ context, but:
|
||||
- Session transcripts are typically 2-8K tokens each
|
||||
- Memory entries are <500 chars each
|
||||
- Skills are 1-3K tokens each
|
||||
- Total typical context: ~8-15K tokens
|
||||
|
||||
We could fit 6-16x more context before needing RAG. But stuffing everything in:
|
||||
- Increases cost (input tokens are billed)
|
||||
- Increases latency
|
||||
- Can actually hurt quality (lost in the middle effect)
|
||||
|
||||
### Decision Framework
|
||||
|
||||
```
|
||||
IF task requires factual accuracy from specific sources:
|
||||
→ Use RAG (retrieve exact docs, cite sources)
|
||||
ELIF total relevant context < 32K tokens:
|
||||
→ Stuff it all (simplest, best quality)
|
||||
ELIF 32K < context < model_limit * 0.5:
|
||||
→ Hybrid: key docs in context, RAG for rest
|
||||
ELIF context > model_limit * 0.5:
|
||||
→ Pure RAG with reranking
|
||||
```
|
||||
|
||||
### Key Insight: We're Mostly Fine
|
||||
Our current approach is actually reasonable:
|
||||
- **Hermes**: System prompt stuffed + selective skill loading + session search = hybrid approach. OK
|
||||
- **Memory**: FTS5 keyword search works but lacks semantic understanding. Upgrade candidate.
|
||||
- **Session recall**: Keyword search is limiting. Embedding-based would find semantically similar sessions.
|
||||
|
||||
### Recommendations (Priority Order)
|
||||
1. **Keep current hybrid approach** — it's working well for 90% of tasks
|
||||
2. **Add semantic search to memory** — replace pure FTS5 with sqlite-vss or similar for the fact_store
|
||||
3. **Don't stuff sessions** — continue using selective retrieval for session history (saves cost)
|
||||
4. **Add context budget tracking** — log how many tokens each context injection uses
|
||||
|
||||
### Conclusion
|
||||
We are NOT over-retrieving in most cases. The main improvement opportunity is upgrading memory from keyword search to semantic search, not changing the overall RAG vs stuffing strategy.
|
||||
275
scripts/emacs-fleet-bridge.py
Executable file
275
scripts/emacs-fleet-bridge.py
Executable file
@@ -0,0 +1,275 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Emacs Fleet Bridge — Sovereign Control Plane Client
|
||||
|
||||
Interacts with the shared Emacs daemon on Bezalel to:
|
||||
- Append messages to dispatch.org
|
||||
- Poll for TODO tasks assigned to this agent
|
||||
- Claim tasks (PENDING → IN_PROGRESS)
|
||||
- Report results back to dispatch.org
|
||||
- Query shared state
|
||||
|
||||
Usage:
|
||||
python3 emacs-fleet-bridge.py poll --agent timmy
|
||||
python3 emacs-fleet-bridge.py append "Deployed PR #123 to staging"
|
||||
python3 emacs-fleet-bridge.py claim --task-id TASK-001
|
||||
python3 emacs-fleet-bridge.py done --task-id TASK-001 --result "Merged"
|
||||
python3 emacs-fleet-bridge.py status
|
||||
python3 emacs-fleet-bridge.py eval "(org-element-parse-buffer)"
|
||||
|
||||
Requires SSH access to Bezalel. Set BEZALEL_HOST and BEZALEL_SSH_KEY env vars
|
||||
or use defaults (root@159.203.146.185).
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import json
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
from datetime import datetime, timezone
|
||||
|
||||
|
||||
# ── Config ──────────────────────────────────────────────
|
||||
BEZALEL_HOST = os.environ.get("BEZALEL_HOST", "159.203.146.185")
|
||||
BEZALEL_USER = os.environ.get("BEZALEL_USER", "root")
|
||||
BEZALEL_SSH_KEY = os.environ.get("BEZALEL_SSH_KEY", "")
|
||||
SOCKET_PATH = os.environ.get("EMACS_SOCKET", "/root/.emacs.d/server/bezalel")
|
||||
DISPATCH_FILE = os.environ.get("DISPATCH_FILE", "/srv/fleet/workspace/dispatch.org")
|
||||
|
||||
SSH_TIMEOUT = int(os.environ.get("BEZALEL_SSH_TIMEOUT", "15"))
|
||||
|
||||
|
||||
# ── SSH Helpers ─────────────────────────────────────────
|
||||
|
||||
def _ssh_cmd() -> list:
|
||||
"""Build base SSH command."""
|
||||
cmd = ["ssh", "-o", "StrictHostKeyChecking=no", "-o", f"ConnectTimeout={SSH_TIMEOUT}"]
|
||||
if BEZALEL_SSH_KEY:
|
||||
cmd.extend(["-i", BEZALEL_SSH_KEY])
|
||||
cmd.append(f"{BEZALEL_USER}@{BEZALEL_HOST}")
|
||||
return cmd
|
||||
|
||||
|
||||
def emacs_eval(expr: str) -> str:
|
||||
"""Evaluate an Emacs Lisp expression on Bezalel via emacsclient."""
|
||||
ssh = _ssh_cmd()
|
||||
elisp = expr.replace('"', '\\"')
|
||||
ssh.append(f'emacsclient -s {SOCKET_PATH} -e "{elisp}"')
|
||||
try:
|
||||
result = subprocess.run(ssh, capture_output=True, text=True, timeout=SSH_TIMEOUT + 5)
|
||||
if result.returncode != 0:
|
||||
return f"ERROR: {result.stderr.strip()}"
|
||||
# emacsclient wraps string results in quotes; strip them
|
||||
output = result.stdout.strip()
|
||||
if output.startswith('"') and output.endswith('"'):
|
||||
output = output[1:-1]
|
||||
return output
|
||||
except subprocess.TimeoutExpired:
|
||||
return "ERROR: SSH timeout"
|
||||
except Exception as e:
|
||||
return f"ERROR: {e}"
|
||||
|
||||
|
||||
def ssh_run(remote_cmd: str) -> tuple:
|
||||
"""Run a shell command on Bezalel. Returns (stdout, stderr, exit_code)."""
|
||||
ssh = _ssh_cmd()
|
||||
ssh.append(remote_cmd)
|
||||
try:
|
||||
result = subprocess.run(ssh, capture_output=True, text=True, timeout=SSH_TIMEOUT + 5)
|
||||
return result.stdout.strip(), result.stderr.strip(), result.returncode
|
||||
except subprocess.TimeoutExpired:
|
||||
return "", "SSH timeout", 1
|
||||
except Exception as e:
|
||||
return "", str(e), 1
|
||||
|
||||
|
||||
# ── Org Mode Operations ────────────────────────────────
|
||||
|
||||
def append_message(message: str, agent: str = "timmy") -> str:
|
||||
"""Append a message entry to dispatch.org."""
|
||||
ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
|
||||
entry = f"\n** [DONE] [{ts}] {agent}: {message}\n"
|
||||
# Use the fleet-append wrapper if available, otherwise direct elisp
|
||||
escaped = entry.replace("\\", "\\\\").replace('"', '\\"').replace("\n", "\\n")
|
||||
elisp = f'(with-current-buffer (find-file-noselect "{DISPATCH_FILE}") (goto-char (point-max)) (insert "{escaped}") (save-buffer))'
|
||||
result = emacs_eval(elisp)
|
||||
return f"Appended: {message}" if "ERROR" not in result else result
|
||||
|
||||
|
||||
def poll_tasks(agent: str = "timmy", limit: int = 10) -> list:
|
||||
"""Poll dispatch.org for PENDING tasks assigned to this agent."""
|
||||
# Parse org buffer looking for TODO items with agent assignment
|
||||
elisp = f"""
|
||||
(with-current-buffer (find-file-noselect "{DISPATCH_FILE}")
|
||||
(org-element-map (org-element-parse-buffer) 'headline
|
||||
(lambda (h)
|
||||
(when (and (equal (org-element-property :todo-keyword h) "PENDING")
|
||||
(let ((tags (org-element-property :tags h)))
|
||||
(or (member "{agent}" tags)
|
||||
(member "{agent.upper()}" tags))))
|
||||
(list (org-element-property :raw-value h)
|
||||
(or (org-element-property :ID h) "")
|
||||
(org-element-property :begin h)))))
|
||||
nil nil 'headline))
|
||||
"""
|
||||
result = emacs_eval(elisp)
|
||||
if "ERROR" in result:
|
||||
return [{"error": result}]
|
||||
|
||||
# Parse the Emacs Lisp list output into Python
|
||||
try:
|
||||
# emacsclient returns elisp syntax like: ((task1 id1 pos1) (task2 id2 pos2))
|
||||
# We use a simpler approach: extract via a wrapper script
|
||||
pass
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Fallback: use grep on the file for PENDING items
|
||||
stdout, stderr, rc = ssh_run(
|
||||
f'grep -n "PENDING.*:{agent}:" {DISPATCH_FILE} 2>/dev/null | head -{limit}'
|
||||
)
|
||||
tasks = []
|
||||
for line in stdout.splitlines():
|
||||
parts = line.split(":", 2)
|
||||
if len(parts) >= 2:
|
||||
tasks.append({
|
||||
"line": int(parts[0]) if parts[0].isdigit() else 0,
|
||||
"content": parts[-1].strip(),
|
||||
})
|
||||
return tasks
|
||||
|
||||
|
||||
def claim_task(task_id: str, agent: str = "timmy") -> str:
|
||||
"""Claim a task: change PENDING → IN_PROGRESS."""
|
||||
ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
|
||||
elisp = f"""
|
||||
(with-current-buffer (find-file-noselect "{DISPATCH_FILE}")
|
||||
(goto-char (point-min))
|
||||
(when (re-search-forward "PENDING.*{task_id}" nil t)
|
||||
(beginning-of-line)
|
||||
(org-todo "IN_PROGRESS")
|
||||
(end-of-line)
|
||||
(insert " [Claimed by {agent} at {ts}]")
|
||||
(save-buffer)
|
||||
"claimed"))
|
||||
"""
|
||||
result = emacs_eval(elisp)
|
||||
return f"Claimed task {task_id}" if "ERROR" not in result else result
|
||||
|
||||
|
||||
def done_task(task_id: str, result_text: str = "", agent: str = "timmy") -> str:
|
||||
"""Mark a task as DONE with optional result."""
|
||||
ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
|
||||
suffix = f" [{agent}: {result_text}]" if result_text else ""
|
||||
elisp = f"""
|
||||
(with-current-buffer (find-file-noselect "{DISPATCH_FILE}")
|
||||
(goto-char (point-min))
|
||||
(when (re-search-forward "IN_PROGRESS.*{task_id}" nil t)
|
||||
(beginning-of-line)
|
||||
(org-todo "DONE")
|
||||
(end-of-line)
|
||||
(insert " [Completed by {agent} at {ts}]{suffix}")
|
||||
(save-buffer)
|
||||
"done"))
|
||||
"""
|
||||
result = emacs_eval(elisp)
|
||||
return f"Done: {task_id} — {result_text}" if "ERROR" not in result else result
|
||||
|
||||
|
||||
def status() -> dict:
|
||||
"""Get control plane status."""
|
||||
ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
|
||||
|
||||
# Check connectivity
|
||||
stdout, stderr, rc = ssh_run(f'emacsclient -s {SOCKET_PATH} -e "(emacs-version)" 2>&1')
|
||||
connected = rc == 0 and "ERROR" not in stdout
|
||||
|
||||
# Count tasks by state
|
||||
counts = {}
|
||||
for state in ["PENDING", "IN_PROGRESS", "DONE"]:
|
||||
stdout, _, _ = ssh_run(f'grep -c "{state}" {DISPATCH_FILE} 2>/dev/null || echo 0')
|
||||
counts[state.lower()] = int(stdout.strip()) if stdout.strip().isdigit() else 0
|
||||
|
||||
# Check dispatch.org size
|
||||
stdout, _, _ = ssh_run(f'wc -l {DISPATCH_FILE} 2>/dev/null || echo 0')
|
||||
lines = int(stdout.split()[0]) if stdout.split()[0].isdigit() else 0
|
||||
|
||||
return {
|
||||
"timestamp": ts,
|
||||
"host": f"{BEZALEL_USER}@{BEZALEL_HOST}",
|
||||
"socket": SOCKET_PATH,
|
||||
"connected": connected,
|
||||
"dispatch_lines": lines,
|
||||
"tasks": counts,
|
||||
}
|
||||
|
||||
|
||||
# ── CLI ─────────────────────────────────────────────────
|
||||
|
||||
def main():
|
||||
parser = argparse.ArgumentParser(description="Emacs Fleet Bridge — Sovereign Control Plane")
|
||||
parser.add_argument("--agent", default="timmy", help="Agent name (default: timmy)")
|
||||
sub = parser.add_subparsers(dest="command")
|
||||
|
||||
# poll
|
||||
poll_p = sub.add_parser("poll", help="Poll for PENDING tasks")
|
||||
poll_p.add_argument("--limit", type=int, default=10)
|
||||
|
||||
# append
|
||||
append_p = sub.add_parser("append", help="Append message to dispatch.org")
|
||||
append_p.add_argument("message", help="Message to append")
|
||||
|
||||
# claim
|
||||
claim_p = sub.add_parser("claim", help="Claim a task (PENDING → IN_PROGRESS)")
|
||||
claim_p.add_argument("task_id", help="Task ID to claim")
|
||||
|
||||
# done
|
||||
done_p = sub.add_parser("done", help="Mark task as DONE")
|
||||
done_p.add_argument("task_id", help="Task ID to complete")
|
||||
done_p.add_argument("--result", default="", help="Result description")
|
||||
|
||||
# status
|
||||
sub.add_parser("status", help="Show control plane status")
|
||||
|
||||
# eval
|
||||
eval_p = sub.add_parser("eval", help="Evaluate Emacs Lisp expression")
|
||||
eval_p.add_argument("expression", help="Elisp expression")
|
||||
|
||||
args = parser.parse_args()
|
||||
agent = args.agent
|
||||
|
||||
if args.command == "poll":
|
||||
tasks = poll_tasks(agent, args.limit)
|
||||
if tasks:
|
||||
for t in tasks:
|
||||
if "error" in t:
|
||||
print(f"ERROR: {t['error']}", file=sys.stderr)
|
||||
else:
|
||||
print(f" [{t.get('line', '?')}] {t.get('content', '?')}")
|
||||
else:
|
||||
print(f"No PENDING tasks for {agent}")
|
||||
|
||||
elif args.command == "append":
|
||||
print(append_message(args.message, agent))
|
||||
|
||||
elif args.command == "claim":
|
||||
print(claim_task(args.task_id, agent))
|
||||
|
||||
elif args.command == "done":
|
||||
print(done_task(args.task_id, args.result, agent))
|
||||
|
||||
elif args.command == "status":
|
||||
s = status()
|
||||
print(json.dumps(s, indent=2))
|
||||
if not s["connected"]:
|
||||
print("\nWARNING: Cannot connect to Emacs daemon on Bezalel", file=sys.stderr)
|
||||
|
||||
elif args.command == "eval":
|
||||
print(emacs_eval(args.expression))
|
||||
|
||||
else:
|
||||
parser.print_help()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(main())
|
||||
93
scripts/emacs-fleet-poll.sh
Executable file
93
scripts/emacs-fleet-poll.sh
Executable file
@@ -0,0 +1,93 @@
|
||||
#!/bin/bash
|
||||
# ══════════════════════════════════════════════
|
||||
# Emacs Fleet Poll — Check dispatch.org for tasks
|
||||
# Designed for crontab or agent loop integration.
|
||||
# ══════════════════════════════════════════════
|
||||
|
||||
set -euo pipefail
|
||||
|
||||
BEZALEL_HOST="${BEZALEL_HOST:-159.203.146.185}"
|
||||
BEZALEL_USER="${BEZALEL_USER:-root}"
|
||||
EMACS_SOCKET="${EMACS_SOCKET:-/root/.emacs.d/server/bezalel}"
|
||||
DISPATCH_FILE="${DISPATCH_FILE:-/srv/fleet/workspace/dispatch.org}"
|
||||
AGENT="${1:-timmy}"
|
||||
|
||||
SSH_OPTS="-o StrictHostKeyChecking=no -o ConnectTimeout=10"
|
||||
if [ -n "${BEZALEL_SSH_KEY:-}" ]; then
|
||||
SSH_OPTS="$SSH_OPTS -i $BEZALEL_SSH_KEY"
|
||||
fi
|
||||
|
||||
echo "════════════════════════════════════════"
|
||||
echo " FLEET DISPATCH POLL — Agent: $AGENT"
|
||||
echo " $(date -u '+%Y-%m-%d %H:%M UTC')"
|
||||
echo "════════════════════════════════════════"
|
||||
|
||||
# 1. Connectivity check
|
||||
echo ""
|
||||
echo "--- Connectivity ---"
|
||||
EMACS_VER=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"emacsclient -s $EMACS_SOCKET -e '(emacs-version)' 2>&1" 2>/dev/null || echo "UNREACHABLE")
|
||||
|
||||
if echo "$EMACS_VER" | grep -qi "UNREACHABLE\|refused\|error"; then
|
||||
echo " STATUS: DOWN — Cannot reach Emacs daemon on $BEZALEL_HOST"
|
||||
echo " Agent should fall back to Gitea-only coordination."
|
||||
exit 1
|
||||
fi
|
||||
echo " STATUS: UP — $EMACS_VER"
|
||||
|
||||
# 2. Task counts
|
||||
echo ""
|
||||
echo "--- Task Overview ---"
|
||||
PENDING=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"grep -c 'TODO PENDING' $DISPATCH_FILE 2>/dev/null || echo 0" 2>/dev/null || echo "?")
|
||||
IN_PROGRESS=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"grep -c 'TODO IN_PROGRESS' $DISPATCH_FILE 2>/dev/null || echo 0" 2>/dev/null || echo "?")
|
||||
DONE=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"grep -c 'TODO DONE' $DISPATCH_FILE 2>/dev/null || echo 0" 2>/dev/null || echo "?")
|
||||
|
||||
echo " PENDING: $PENDING"
|
||||
echo " IN_PROGRESS: $IN_PROGRESS"
|
||||
echo " DONE: $DONE"
|
||||
|
||||
# 3. My pending tasks
|
||||
echo ""
|
||||
echo "--- Tasks for $AGENT ---"
|
||||
MY_TASKS=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"grep 'PENDING.*:${AGENT}:' $DISPATCH_FILE 2>/dev/null || echo '(none)'" 2>/dev/null || echo "(unreachable)")
|
||||
|
||||
if [ -z "$MY_TASKS" ] || [ "$MY_TASKS" = "(none)" ]; then
|
||||
echo " No pending tasks assigned to $AGENT"
|
||||
else
|
||||
echo "$MY_TASKS" | while IFS= read -r line; do
|
||||
echo " → $line"
|
||||
done
|
||||
fi
|
||||
|
||||
# 4. My in-progress tasks
|
||||
MY_ACTIVE=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"grep 'IN_PROGRESS.*:${AGENT}:' $DISPATCH_FILE 2>/dev/null || echo ''" 2>/dev/null || echo "")
|
||||
|
||||
if [ -n "$MY_ACTIVE" ]; then
|
||||
echo ""
|
||||
echo "--- Active work for $AGENT ---"
|
||||
echo "$MY_ACTIVE" | while IFS= read -r line; do
|
||||
echo " ⚙ $line"
|
||||
done
|
||||
fi
|
||||
|
||||
# 5. Recent activity
|
||||
echo ""
|
||||
echo "--- Recent Activity (last 5) ---"
|
||||
RECENT=$(ssh $SSH_OPTS ${BEZALEL_USER}@${BEZALEL_HOST} \
|
||||
"tail -20 $DISPATCH_FILE 2>/dev/null | grep -E '\[DONE\]|\[IN_PROGRESS\]' | tail -5" 2>/dev/null || echo "(none)")
|
||||
|
||||
if [ -z "$RECENT" ]; then
|
||||
echo " No recent activity"
|
||||
else
|
||||
echo "$RECENT" | while IFS= read -r line; do
|
||||
echo " $line"
|
||||
done
|
||||
fi
|
||||
|
||||
echo ""
|
||||
echo "════════════════════════════════════════"
|
||||
@@ -108,7 +108,7 @@ async def call_tool(name: str, arguments: dict):
|
||||
if name == "bind_session":
|
||||
bound = _save_bound_session_id(arguments.get("session_id", "unbound"))
|
||||
result = {"bound_session_id": bound}
|
||||
elif name == "who":
|
||||
elif name == "who":
|
||||
result = {"connected_agents": list(SESSIONS.keys())}
|
||||
elif name == "status":
|
||||
result = {"connected_sessions": sorted(SESSIONS.keys()), "bound_session_id": _load_bound_session_id()}
|
||||
|
||||
176
skills/autonomous-ai-agents/emacs-control-plane/SKILL.md
Normal file
176
skills/autonomous-ai-agents/emacs-control-plane/SKILL.md
Normal file
@@ -0,0 +1,176 @@
|
||||
---
|
||||
name: emacs-control-plane
|
||||
description: "Sovereign Control Plane via shared Emacs daemon on Bezalel. Poll dispatch.org for tasks, claim work, report results. Real-time fleet coordination hub."
|
||||
version: 1.0.0
|
||||
author: Timmy Time
|
||||
license: MIT
|
||||
metadata:
|
||||
hermes:
|
||||
tags: [emacs, fleet, control-plane, dispatch, coordination, sovereign]
|
||||
related_skills: [gitea-workflow-automation, sprint-backlog-burner, hermes-agent]
|
||||
---
|
||||
|
||||
# Emacs Sovereign Control Plane
|
||||
|
||||
## Overview
|
||||
|
||||
A shared Emacs daemon running on Bezalel acts as a real-time, programmable whiteboard and task queue for the entire AI fleet. Unlike Gitea (async, request-based), this provides real-time synchronization and shared executable notebooks.
|
||||
|
||||
## Infrastructure
|
||||
|
||||
| Component | Value |
|
||||
|-----------|-------|
|
||||
| Daemon Host | Bezalel (`159.203.146.185`) |
|
||||
| SSH User | `root` |
|
||||
| Socket Path | `/root/.emacs.d/server/bezalel` |
|
||||
| Dispatch File | `/srv/fleet/workspace/dispatch.org` |
|
||||
| Fast Wrapper | `/usr/local/bin/fleet-append "message"` |
|
||||
|
||||
## Files
|
||||
|
||||
```
|
||||
scripts/emacs-fleet-bridge.py # Python client (poll, claim, done, append, status, eval)
|
||||
scripts/emacs-fleet-poll.sh # Shell poll script for crontab/agent loops
|
||||
```
|
||||
|
||||
## When to Use
|
||||
|
||||
- Coordinating multi-agent tasks across the fleet
|
||||
- Real-time status updates visible to Alexander (via timmy-emacs tmux)
|
||||
- Shared executable notebooks (Org-babel)
|
||||
- Polling for work assigned to your agent identity
|
||||
|
||||
**Do NOT use when:**
|
||||
- Simple one-off tasks (just do them)
|
||||
- Tasks already tracked in Gitea issues (no duplication)
|
||||
- Emacs daemon is down (fall back to Gitea)
|
||||
|
||||
## Quick Start
|
||||
|
||||
### Poll for my tasks
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py poll --agent timmy
|
||||
```
|
||||
|
||||
### Claim a task
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py claim TASK-001 --agent timmy
|
||||
```
|
||||
|
||||
### Report completion
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py done TASK-001 --result "Merged PR #456" --agent timmy
|
||||
```
|
||||
|
||||
### Append a status message
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py append "Deployed v2.3 to staging" --agent timmy
|
||||
```
|
||||
|
||||
### Check control plane health
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py status
|
||||
```
|
||||
|
||||
### Direct Emacs Lisp evaluation
|
||||
```bash
|
||||
python3 scripts/emacs-fleet-bridge.py eval "(org-element-parse-buffer)"
|
||||
```
|
||||
|
||||
### Shell poll (for crontab)
|
||||
```bash
|
||||
bash scripts/emacs-fleet-poll.sh timmy
|
||||
```
|
||||
|
||||
## SSH Access from Other VPSes
|
||||
|
||||
Agents on Ezra, Allegro, etc. can interact via SSH:
|
||||
```bash
|
||||
ssh root@bezalel 'emacsclient -s /root/.emacs.d/server/bezalel -e "(your-elisp-here)"'
|
||||
```
|
||||
|
||||
Or use the fast wrapper:
|
||||
```bash
|
||||
ssh root@bezalel '/usr/local/bin/fleet-append "Your message here"'
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
Set env vars to override defaults:
|
||||
|
||||
| Variable | Default | Description |
|
||||
|----------|---------|-------------|
|
||||
| `BEZALEL_HOST` | `159.203.146.185` | Bezalel VPS IP |
|
||||
| `BEZALEL_USER` | `root` | SSH user |
|
||||
| `BEZALEL_SSH_KEY` | (none) | SSH key path |
|
||||
| `BEZALEL_SSH_TIMEOUT` | `15` | SSH timeout in seconds |
|
||||
| `EMACS_SOCKET` | `/root/.emacs.d/server/bezalel` | Emacs daemon socket |
|
||||
| `DISPATCH_FILE` | `/srv/fleet/workspace/dispatch.org` | Dispatch org file path |
|
||||
|
||||
## Agent Loop Integration
|
||||
|
||||
In your agent's operational loop, add a dispatch check:
|
||||
|
||||
```python
|
||||
# In heartbeat or cron job:
|
||||
import subprocess
|
||||
result = subprocess.run(
|
||||
["python3", "scripts/emacs-fleet-bridge.py", "poll", "--agent", "timmy"],
|
||||
capture_output=True, text=True, timeout=30
|
||||
)
|
||||
if "→" in result.stdout:
|
||||
# Tasks found — process them
|
||||
for line in result.stdout.splitlines():
|
||||
if "→" in line:
|
||||
task = line.split("→", 1)[1].strip()
|
||||
# Process task...
|
||||
```
|
||||
|
||||
## Crontab Setup
|
||||
|
||||
```cron
|
||||
# Poll dispatch.org every 10 minutes
|
||||
*/10 * * * * /path/to/scripts/emacs-fleet-poll.sh timmy >> ~/.hermes/logs/fleet-poll.log 2>&1
|
||||
```
|
||||
|
||||
## Dispatch.org Format
|
||||
|
||||
Tasks in the dispatch file follow Org mode conventions:
|
||||
|
||||
```org
|
||||
* PENDING Deploy auth service :timmy:allegro:
|
||||
DEADLINE: <2026-04-15>
|
||||
Deploy the new auth service to staging cluster.
|
||||
|
||||
* IN_PROGRESS Fix payment webhook :timmy:
|
||||
Investigating 502 errors on /webhook/payments.
|
||||
|
||||
* DONE Migrate database schema :ezra:
|
||||
Schema v3 applied to all shards.
|
||||
```
|
||||
|
||||
Agent tags (`:timmy:`, `:allegro:`, etc.) determine assignment.
|
||||
|
||||
## State Machine
|
||||
|
||||
```
|
||||
PENDING → IN_PROGRESS → DONE
|
||||
↓ ↓
|
||||
(skip) (fail/retry)
|
||||
```
|
||||
|
||||
- **PENDING**: Available for claiming
|
||||
- **IN_PROGRESS**: Claimed by an agent, being worked on
|
||||
- **DONE**: Completed with optional result note
|
||||
|
||||
## Pitfalls
|
||||
|
||||
1. **SSH connectivity** — Bezalel may be unreachable. Always check status before claiming tasks. If down, fall back to Gitea-only coordination.
|
||||
|
||||
2. **Race conditions** — Multiple agents could try to claim the same task. The emacsclient eval is atomic within a single call, but claim-then-read is not. Use the claim function (which does both in one elisp call).
|
||||
|
||||
3. **Socket path** — The socket at `/root/.emacs.d/server/bezalel` only exists when the daemon is running. If the daemon restarts, the socket is recreated.
|
||||
|
||||
4. **SSH key** — Set `BEZALEL_SSH_KEY` env var if your agent's default SSH key doesn't match.
|
||||
|
||||
5. **Don't duplicate Gitea** — If a task is already tracked in a Gitea issue, use that for progress. dispatch.org is for fleet-level coordination, not individual task tracking.
|
||||
@@ -24,7 +24,7 @@ class HealthCheckHandler(BaseHTTPRequestHandler):
|
||||
# Suppress default logging
|
||||
pass
|
||||
|
||||
def do_GET(self):
|
||||
def do_GET(self):
|
||||
"""Handle GET requests"""
|
||||
if self.path == '/health':
|
||||
self.send_health_response()
|
||||
|
||||
Reference in New Issue
Block a user