feat: add Docker-based swarm agent containerization
Add infrastructure for running swarm agents as isolated Docker containers
with HTTP-based coordination, startup recovery, and an enhanced dashboard UI
for agent management.

- Dockerfile and docker-compose.yml for multi-service orchestration
- DockerAgentRunner for programmatic container lifecycle management
- Internal HTTP API for container agents to poll tasks and submit bids
- Startup recovery system to reconcile orphaned tasks and stale agents
- Enhanced UI partials for agent panels, chat, and task assignment
- Timmy docker entry point with heartbeat and task polling
- New Makefile targets for Docker workflows
- Tests for swarm recovery

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
37
.dockerignore
Normal file
@@ -0,0 +1,37 @@
# ── Python ───────────────────────────────────────────────────────────────────
.venv/
__pycache__/
*.pyc
*.pyo
*.pyd
*.egg-info/
dist/
build/
.pytest_cache/
htmlcov/
.coverage
coverage.xml

# ── Data (mounted as volume, not baked in) ───────────────────────────────────
data/
*.db

# ── Secrets / config ─────────────────────────────────────────────────────────
.env
.env.*
*.key
*.pem

# ── Git ───────────────────────────────────────────────────────────────────────
.git/
.gitignore

# ── Tests (not needed in production image) ───────────────────────────────────
tests/

# ── Docs ─────────────────────────────────────────────────────────────────────
docs/
*.md

# ── macOS ─────────────────────────────────────────────────────────────────────
.DS_Store
254
AGENTS.md
@@ -1,7 +1,7 @@
 # AGENTS.md — Timmy Time Development Standards for AI Agents

-This file is the authoritative reference for any AI agent (Claude, Kimi, Manus,
-or future tools) contributing to this repository. Read it first. Every time.
+This file is the authoritative reference for any AI agent contributing to
+this repository. Read it first. Every time.

 ---

@@ -10,15 +10,16 @@ or future tools) contributing to this repository. Read it first. Every time.
 **Timmy Time** is a local-first, sovereign AI agent system. No cloud. No telemetry.
 Bitcoin Lightning economics baked in.

 | Thing | Value |
-|------------------|----------------------------------------|
+|------------------|----------------------------------------------------|
 | Language | Python 3.11+ |
 | Web framework | FastAPI + Jinja2 + HTMX |
 | Agent framework | Agno (wraps Ollama or AirLLM) |
 | Persistence | SQLite (`timmy.db`, `data/swarm.db`) |
-| Tests | pytest — 228 passing, **must stay green** |
+| Tests | pytest — must stay green |
 | Entry points | `timmy`, `timmy-serve`, `self-tdd` |
 | Config | pydantic-settings, reads `.env` |
+| Containers | Docker — each agent can run as an isolated service |

 ```
 src/
@@ -28,9 +29,11 @@ src/
 app.py
 store.py # In-memory MessageLog singleton
 routes/ # agents, health, swarm, swarm_ws, marketplace,
-│ # mobile, mobile_test, voice, voice_enhanced
+│ # mobile, mobile_test, voice, voice_enhanced,
+│ # swarm_internal (HTTP API for Docker agents)
 templates/ # base.html + page templates + partials/
 swarm/ # Multi-agent coordinator, registry, bidder, tasks, comms
+  docker_runner.py # Spawn agents as Docker containers
 timmy_serve/ # L402 Lightning proxy, payment handler, TTS, CLI
 voice/ # NLU intent detection (regex-based, no cloud)
 websocket/ # WebSocket manager (ws_manager singleton)
@@ -38,68 +41,68 @@ src/
 shortcuts/ # Siri Shortcuts API endpoints
 self_tdd/ # Continuous test watchdog
 tests/ # One test_*.py per module, all mocked
-static/style.css # Dark mission-control theme (JetBrains Mono)
-docs/ # GitHub Pages site (docs/index.html)
+static/ # style.css + bg.svg (arcane theme)
+docs/ # GitHub Pages site
 ```

 ---

 ## 2. Non-Negotiable Rules

-1. **Tests must stay green.** Run `make test` before committing. If you break
-tests, fix them before you do anything else.
-2. **No cloud dependencies.** All computation must run on localhost.
+1. **Tests must stay green.** Run `make test` before committing.
+2. **No cloud dependencies.** All AI computation runs on localhost.
 3. **No new top-level files without purpose.** Don't litter the root directory.
-4. **Follow existing patterns** — singletons (`message_log`, `notifier`,
-`ws_manager`, `coordinator`), graceful degradation (try/except → fallback),
-pydantic-settings config.
-5. **Security defaults:** Never hard-code secrets. Warn at startup when defaults
-are in use (see `l402_proxy.py` and `payment_handler.py` for the pattern).
-6. **XSS prevention:** Never use `innerHTML` with untrusted content. Use
-`textContent` or `innerText` for any user-controlled string in JS.
+4. **Follow existing patterns** — singletons, graceful degradation, pydantic-settings config.
+5. **Security defaults:** Never hard-code secrets. Warn at startup when defaults are in use.
+6. **XSS prevention:** Never use `innerHTML` with untrusted content.

 ---

-## 3. Per-Agent Assignments
+## 3. Agent Roster

-### Claude (Anthropic)
-**Strengths:** Architecture, scaffolding, iterative refinement, testing, docs, breadth.
+Agents are divided into two tiers: **Builders** generate code and features;
+**Reviewers** provide quality gates, feedback, and hardening. The Local agent
+is the primary workhorse — use it as much as possible to minimise cost.
+
+---
+
+### 🏗️ BUILD TIER
+
+---
+
+### Local — Ollama (primary workhorse)
+**Model:** Any — `qwen2.5-coder`, `deepseek-coder-v2`, `codellama`, or whatever
+is loaded in Ollama. The owner decides the model; this agent is unrestricted.
+**Cost:** Free. Runs on the host machine.

 **Best for:**
-- Adding new subsystems from scratch
-- Refactoring / code-quality passes
-- Writing or fixing tests
-- Updating documentation (README, AGENTS.md, inline comments)
-- CI/CD and tooling
-- Debugging tricky async or import issues
+- Everything. This is the default agent for all coding tasks.
+- Iterative development, fast feedback loops, bulk generation
+- Running as a Docker swarm worker — scales horizontally at zero marginal cost
+- Experimenting with new models without changing any other code

 **Conventions to follow:**
-- Prefer editing existing files over creating new ones
-- Keep route files thin — business logic lives in the module, not the route
-- Use `from config import settings` for all env-var access
-- New routes go in `src/dashboard/routes/`, registered in `app.py`
-- New templates extend `base.html`
-- Always add a corresponding `tests/test_<module>.py`
+- Communicate with the coordinator over HTTP (`COORDINATOR_URL` env var)
+- Register capabilities honestly so the auction system routes tasks well
+- Write tests for anything non-trivial

-**Avoid:**
-- Large one-shot feature dumps (that's Kimi's lane)
-- Touching `src/swarm/coordinator.py` for security work (that's Manus's lane)
-- Committing with `--no-verify`
+**No restrictions.** If a model can do it, do it.

 ---

 ### Kimi (Moonshot AI)
-**Strengths:** High-volume feature generation, rapid expansion, large context.
+**Model:** Moonshot large-context models.
+**Cost:** Paid API.

 **Best for:**
-- Big feature drops (new pages, new subsystems, new agent personas)
-- Implementing the roadmap items listed below
+- Large context feature drops (new pages, new subsystems, new agent personas)
+- Implementing roadmap items that require reading many files at once
 - Generating boilerplate for new agents (Echo, Mace, Helm, Seer, Forge, Quill)

 **Conventions to follow:**
 - Deliver working code with accompanying tests (even if minimal)
-- Match the dark Mission Control CSS theme — extend `static/style.css`
-- New agents should follow the `SwarmNode` + `Registry` pattern in `src/swarm/`
+- Match the arcane CSS theme — extend `static/style.css`
+- New agents follow the `SwarmNode` + `Registry` + Docker pattern
 - Lightning-gated endpoints follow the L402 pattern in `src/timmy_serve/l402_proxy.py`

 **Avoid:**
@@ -109,6 +112,78 @@ docs/ # GitHub Pages site (docs/index.html)

 ---

+### DeepSeek (DeepSeek API)
+**Model:** `deepseek-chat` (V3) or `deepseek-reasoner` (R1).
+**Cost:** Near-free (~$0.14/M tokens).
+
+**Best for:**
+- Second-opinion feature generation when Kimi is busy or context is smaller
+- Large refactors with reasoning traces (use R1 for hard problems)
+- Code review passes before merging Kimi PRs
+- Anything that doesn't need a frontier model but benefits from strong reasoning
+
+**Conventions to follow:**
+- Same conventions as Kimi
+- Prefer V3 for straightforward tasks; R1 for anything requiring multi-step logic
+- Submit PRs for review by Claude before merging
+
+**Avoid:**
+- Bypassing the review tier for security-sensitive modules
+- Touching `src/swarm/coordinator.py` without Claude review
+
+---
+
+### 🔍 REVIEW TIER
+
+---
+
+### Claude (Anthropic)
+**Model:** Claude Sonnet.
+**Cost:** Paid API.
+
+**Best for:**
+- Architecture decisions and code-quality review
+- Writing and fixing tests; keeping coverage green
+- Updating documentation (README, AGENTS.md, inline comments)
+- CI/CD, tooling, Docker infrastructure
+- Debugging tricky async or import issues
+- Reviewing PRs from Local, Kimi, and DeepSeek before merge
+
+**Conventions to follow:**
+- Prefer editing existing files over creating new ones
+- Keep route files thin — business logic lives in the module, not the route
+- Use `from config import settings` for all env-var access
+- New routes go in `src/dashboard/routes/`, registered in `app.py`
+- Always add a corresponding `tests/test_<module>.py`
+
+**Avoid:**
+- Large one-shot feature dumps (use Local or Kimi)
+- Touching `src/swarm/coordinator.py` for security work (that's Manus's lane)
+
+---
+
+### Gemini (Google)
+**Model:** Gemini 2.0 Flash (free tier) or Pro.
+**Cost:** Free tier generous; upgrade only if needed.
+
+**Best for:**
+- Documentation, README updates, inline docstrings
+- Frontend polish — HTML templates, CSS, accessibility review
+- Boilerplate generation (test stubs, config files, GitHub Actions)
+- Summarising large diffs for human review
+
+**Conventions to follow:**
+- Submit changes as PRs; always include a plain-English summary of what changed
+- For CSS changes, test at mobile breakpoint (≤768px) before submitting
+- Never modify Python business logic without Claude review
+
+**Avoid:**
+- Security-sensitive modules (that's Manus's lane)
+- Changing auction or payment logic
+- Large Python refactors
+
+---
+
 ### Manus AI
 **Strengths:** Precision security work, targeted bug fixes, coverage gap analysis.

@@ -126,21 +201,58 @@ docs/ # GitHub Pages site (docs/index.html)

 **Avoid:**
 - Large-scale refactors (that's Claude's lane)
-- New feature work (that's Kimi's lane)
+- New feature work (use Local or Kimi)
 - Changing agent personas or prompt content

 ---

-## 4. Architecture Patterns
+## 4. Docker — Running Agents as Containers
+
+Each agent can run as an isolated Docker container. Containers share the
+`data/` volume for SQLite and communicate with the coordinator over HTTP.
+
+```bash
+make docker-build   # build the image
+make docker-up      # start dashboard + deps
+make docker-agent   # spawn one agent worker (LOCAL model)
+make docker-down    # stop everything
+make docker-logs    # tail all service logs
+```
+
+### How container agents communicate
+
+Container agents cannot use the in-memory `SwarmComms` channel. Instead they
+poll the coordinator's internal HTTP API:
+
+```
+GET  /internal/tasks   → list tasks open for bidding
+POST /internal/bids    → submit a bid
+```
+
+Set `COORDINATOR_URL=http://dashboard:8000` in the container environment
+(docker-compose sets this automatically).
+
+### Spawning a container agent from Python
+
+```python
+from swarm.docker_runner import DockerAgentRunner
+
+runner = DockerAgentRunner(coordinator_url="http://dashboard:8000")
+info = runner.spawn("Echo", image="timmy-time:latest")
+runner.stop(info["container_id"])
+```
+
+---
+
+## 5. Architecture Patterns

 ### Singletons (module-level instances)
-These are shared state — import them, don't recreate them:
 ```python
-from dashboard.store import message_log          # MessageLog
-from notifications.push import notifier          # PushNotifier
-from websocket.handler import ws_manager         # WebSocketManager
-from timmy_serve.payment_handler import payment_handler  # PaymentHandler
-from swarm.coordinator import coordinator        # SwarmCoordinator
+from dashboard.store import message_log
+from notifications.push import notifier
+from websocket.handler import ws_manager
+from timmy_serve.payment_handler import payment_handler
+from swarm.coordinator import coordinator
 ```

 ### Config access
@@ -150,8 +262,6 @@ url = settings.ollama_url # never os.environ.get() directly in route files
 ```

 ### HTMX pattern
-Server renders HTML fragments. Routes return `TemplateResponse` with a partial
-template. JS is minimal — no React, no Vue.
 ```python
 return templates.TemplateResponse(
     "partials/chat_message.html",
@@ -175,45 +285,44 @@ except Exception:

 ---

-## 5. Running Locally
+## 6. Running Locally

 ```bash
 make install    # create venv + install dev deps
 make test       # run full test suite
 make dev        # start dashboard (http://localhost:8000)
-make watch      # self-TDD watchdog (background, 60s interval)
+make watch      # self-TDD watchdog (60s poll)
 make test-cov   # coverage report
 ```

-Or manually:
+Or with Docker:
 ```bash
-python3 -m venv .venv && source .venv/bin/activate
-pip install -e ".[dev]"
-pytest          # all 228 tests
-uvicorn dashboard.app:app --reload --host 0.0.0.0 --port 8000
+make docker-build   # build image
+make docker-up      # start dashboard
+make docker-agent   # add a Local agent worker
 ```

 ---

-## 6. Roadmap (v2 → v3)
+## 7. Roadmap (v2 → v3)

-These are unbuilt items — claim one per PR, coordinate via Issues:

 **v2.0.0 — Exodus (in progress)**
-- [ ] Implement Echo, Mace, Helm, Seer, Forge, Quill agent personas as Agno agents
+- [x] Persistent swarm state across restarts
+- [x] Docker infrastructure for agent containers
+- [ ] Implement Echo, Mace, Helm, Seer, Forge, Quill persona agents (Dockerised)
 - [ ] Real LND gRPC backend for `PaymentHandler` (replace mock)
 - [ ] MCP tool integration for Timmy
-- [ ] Marketplace frontend — wire up the existing `/marketplace` route to real data
-- [ ] Persistent swarm state across restarts (currently in-memory)
+- [ ] Marketplace frontend — wire `/marketplace` route to real data

 **v3.0.0 — Revelation (planned)**
 - [ ] Bitcoin Lightning treasury (agent earns and spends sats autonomously)
 - [ ] Single `.app` bundle for macOS (no Python install required)
 - [ ] Federation — multiple Timmy instances discover and bid on each other's tasks
+- [ ] Redis pub/sub replacing SQLite polling for high-throughput swarms

 ---

-## 7. File Conventions
+## 8. File Conventions

 | Pattern | Convention |
 |---------|-----------|
@@ -224,3 +333,4 @@ These are unbuilt items — claim one per PR, coordinate via Issues:
 | New test file | `tests/test_<module>.py` |
 | Secrets | Read via `os.environ.get("VAR", "default")` + startup warning if default |
 | DB files | `.db` files go in project root or `data/` — never in `src/` |
+| Docker | One service per agent type in `docker-compose.yml` |
58
Dockerfile
Normal file
@@ -0,0 +1,58 @@
# ── Timmy Time — agent image ────────────────────────────────────────────────
#
# Serves two purposes:
#   1. `make docker-up`    → runs the FastAPI dashboard (default CMD)
#   2. `make docker-agent` → runs a swarm agent worker (override CMD)
#
# Build:  docker build -t timmy-time:latest .
# Dash:   docker run -p 8000:8000 -v $(pwd)/data:/app/data timmy-time:latest
# Agent:  docker run -e COORDINATOR_URL=http://dashboard:8000 \
#                    -e AGENT_NAME=Worker-1 \
#                    timmy-time:latest \
#                    python -m swarm.agent_runner --agent-id w1 --name Worker-1

FROM python:3.12-slim

# ── System deps ──────────────────────────────────────────────────────────────
RUN apt-get update && apt-get install -y --no-install-recommends \
    gcc curl \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# ── Python deps (install before copying src for layer caching) ───────────────
COPY pyproject.toml .

# Install production deps only (no dev/test extras in the image)
RUN pip install --no-cache-dir \
    "fastapi>=0.115.0" \
    "uvicorn[standard]>=0.32.0" \
    "jinja2>=3.1.0" \
    "httpx>=0.27.0" \
    "python-multipart>=0.0.12" \
    "aiofiles>=24.0.0" \
    "typer>=0.12.0" \
    "rich>=13.0.0" \
    "pydantic-settings>=2.0.0" \
    "websockets>=12.0" \
    "agno[sqlite]>=1.4.0" \
    "ollama>=0.3.0" \
    "openai>=1.0.0" \
    "python-telegram-bot>=21.0"

# ── Application source ───────────────────────────────────────────────────────
COPY src/ ./src/
COPY static/ ./static/

# Create data directory (mounted as a volume in production)
RUN mkdir -p /app/data

# ── Environment ──────────────────────────────────────────────────────────────
ENV PYTHONPATH=/app/src
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

EXPOSE 8000

# ── Default: run the dashboard ───────────────────────────────────────────────
CMD ["uvicorn", "dashboard.app:app", "--host", "0.0.0.0", "--port", "8000"]
37
Makefile
@@ -1,4 +1,5 @@
-.PHONY: install install-bigbrain dev test test-cov watch lint clean help
+.PHONY: install install-bigbrain dev test test-cov watch lint clean help \
+	docker-build docker-up docker-down docker-agent docker-logs docker-shell

 VENV := .venv
 PYTHON := $(VENV)/bin/python
@@ -65,6 +66,33 @@ lint:

 # ── Housekeeping ──────────────────────────────────────────────────────────────

+# ── Docker ────────────────────────────────────────────────────────────────────
+
+docker-build:
+	docker build -t timmy-time:latest .
+
+docker-up:
+	mkdir -p data
+	docker compose up -d dashboard
+
+docker-down:
+	docker compose down
+
+# Spawn one agent worker connected to the running dashboard.
+# Override name/capabilities: make docker-agent AGENT_NAME=Echo AGENT_CAPABILITIES=summarise
+docker-agent:
+	AGENT_NAME=$${AGENT_NAME:-Worker} \
+	AGENT_CAPABILITIES=$${AGENT_CAPABILITIES:-general} \
+	docker compose --profile agents up -d --scale agent=1 agent
+
+docker-logs:
+	docker compose logs -f
+
+docker-shell:
+	docker compose exec dashboard bash
+
+# ── Housekeeping ──────────────────────────────────────────────────────────────
+
 clean:
 	find . -type d -name __pycache__ -exec rm -rf {} + 2>/dev/null || true
 	find . -type d -name "*.egg-info" -exec rm -rf {} + 2>/dev/null || true
@@ -83,3 +111,10 @@ help:
 	@echo "  make lint           run ruff or flake8"
 	@echo "  make clean          remove build artefacts and caches"
 	@echo ""
+	@echo "  make docker-build   build the timmy-time:latest image"
+	@echo "  make docker-up      start dashboard container"
+	@echo "  make docker-agent   add one agent worker (AGENT_NAME=Echo)"
+	@echo "  make docker-down    stop all containers"
+	@echo "  make docker-logs    tail container logs"
+	@echo "  make docker-shell   open a bash shell in the dashboard container"
+	@echo ""
109
docker-compose.yml
Normal file
@@ -0,0 +1,109 @@
# ── Timmy Time — docker-compose ─────────────────────────────────────────────
#
# Services
#   dashboard   FastAPI app + swarm coordinator (always on)
#   agent       Swarm worker template — scale with:
#               docker compose up --scale agent=N --profile agents
#
# Volumes
#   timmy-data  Shared SQLite (data/swarm.db + data/timmy.db)
#
# Usage
#   make docker-build   build the image
#   make docker-up      start dashboard only
#   make docker-agent   add one agent worker
#   make docker-down    stop everything
#   make docker-logs    tail logs

version: "3.9"

services:

  # ── Dashboard (coordinator + FastAPI) ──────────────────────────────────────
  dashboard:
    build: .
    image: timmy-time:latest
    container_name: timmy-dashboard
    ports:
      - "8000:8000"
    volumes:
      - timmy-data:/app/data
      - ./src:/app/src        # live-reload: source changes reflect immediately
      - ./static:/app/static  # live-reload: CSS/asset changes reflect immediately
    environment:
      DEBUG: "true"
      # Point to host Ollama (Mac default). Override in .env if different.
      OLLAMA_URL: "${OLLAMA_URL:-http://host.docker.internal:11434}"
    extra_hosts:
      - "host.docker.internal:host-gateway"  # Linux compatibility
    networks:
      - swarm-net
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 10s

  # ── Timmy — sovereign AI agent (separate container) ───────────────────────
  timmy:
    build: .
    image: timmy-time:latest
    container_name: timmy-agent
    volumes:
      - timmy-data:/app/data
      - ./src:/app/src
    environment:
      COORDINATOR_URL: "http://dashboard:8000"
      OLLAMA_URL: "${OLLAMA_URL:-http://host.docker.internal:11434}"
      TIMMY_AGENT_ID: "timmy"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    command: ["python", "-m", "timmy.docker_agent"]
    networks:
      - swarm-net
    depends_on:
      dashboard:
        condition: service_healthy
    restart: unless-stopped

  # ── Agent worker template ───────────────────────────────────────────────────
  # Scale horizontally: docker compose up --scale agent=4 --profile agents
  # Each container gets a unique AGENT_ID via the replica index.
  agent:
    build: .
    image: timmy-time:latest
    profiles:
      - agents
    volumes:
      - timmy-data:/app/data
      - ./src:/app/src
    environment:
      COORDINATOR_URL: "http://dashboard:8000"
      OLLAMA_URL: "${OLLAMA_URL:-http://host.docker.internal:11434}"
      AGENT_NAME: "${AGENT_NAME:-Worker}"
      AGENT_CAPABILITIES: "${AGENT_CAPABILITIES:-general}"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    command: ["sh", "-c", "python -m swarm.agent_runner --agent-id agent-$(hostname) --name $${AGENT_NAME:-Worker}"]
    networks:
      - swarm-net
    depends_on:
      dashboard:
        condition: service_healthy
    restart: unless-stopped

# ── Shared volume ─────────────────────────────────────────────────────────────
volumes:
  timmy-data:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: "${PWD}/data"

# ── Internal network ──────────────────────────────────────────────────────────
networks:
  swarm-net:
    driver: bridge
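The `agent` service's command derives a per-replica id from the container hostname (`agent-$(hostname)`), since compose assigns every scaled replica its own hostname. The equivalent inside a worker process is a one-liner (a sketch; the actual `swarm.agent_runner` code is not shown in this commit):

```python
import socket

def default_agent_id(prefix: str = "agent") -> str:
    # Each compose replica gets a distinct container hostname,
    # so prefix + hostname yields a unique agent id per worker.
    return f"{prefix}-{socket.gethostname()}"

print(default_agent_id())
```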
@@ -20,6 +20,7 @@ from dashboard.routes.mobile import router as mobile_router
 from dashboard.routes.swarm_ws import router as swarm_ws_router
 from dashboard.routes.briefing import router as briefing_router
 from dashboard.routes.telegram import router as telegram_router
+from dashboard.routes.swarm_internal import router as swarm_internal_router

 logging.basicConfig(
     level=logging.INFO,
@@ -64,6 +65,24 @@ async def _briefing_scheduler() -> None:
 async def lifespan(app: FastAPI):
     task = asyncio.create_task(_briefing_scheduler())
 
+    # Register Timmy in the swarm registry so it shows up alongside other agents
+    from swarm import registry as swarm_registry
+    swarm_registry.register(
+        name="Timmy",
+        capabilities="chat,reasoning,research,planning",
+        agent_id="timmy",
+    )
+
+    # Log swarm recovery summary (reconciliation ran during coordinator init)
+    from swarm.coordinator import coordinator as swarm_coordinator
+    rec = swarm_coordinator._recovery_summary
+    if rec["tasks_failed"] or rec["agents_offlined"]:
+        logger.info(
+            "Swarm recovery on startup: %d task(s) → FAILED, %d agent(s) → offline",
+            rec["tasks_failed"],
+            rec["agents_offlined"],
+        )
+
     # Auto-start Telegram bot if a token is configured
     from telegram_bot.bot import telegram_bot
     await telegram_bot.start()
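Review note (not part of the diff): the summary dict read here comes from the startup reconciliation the commit message describes, which fails orphaned tasks and offlines stale agents before the dashboard serves traffic. A minimal sketch of such a pass, with hypothetical field names (the real logic lives in `swarm.coordinator`):

```python
from datetime import datetime, timedelta, timezone

def reconcile(tasks, agents, heartbeat_timeout=timedelta(seconds=60)):
    """Return a recovery summary shaped like the one logged on startup.

    tasks:  list of dicts with "status" and "assigned_agent"
    agents: list of dicts with "id", "status", "last_seen" (aware datetime)
    Mutates both lists in place, mirroring what a startup pass would persist.
    """
    now = datetime.now(timezone.utc)
    summary = {"tasks_failed": 0, "agents_offlined": 0}

    # Agents that missed their heartbeat window go offline.
    stale = {
        a["id"] for a in agents
        if a["status"] != "offline" and now - a["last_seen"] > heartbeat_timeout
    }
    for a in agents:
        if a["id"] in stale:
            a["status"] = "offline"
            summary["agents_offlined"] += 1

    # Orphaned tasks: still running/assigned but their agent went stale.
    for t in tasks:
        if t["status"] in ("running", "assigned") and t.get("assigned_agent") in stale:
            t["status"] = "failed"
            summary["tasks_failed"] += 1
    return summary
```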
@@ -101,6 +120,7 @@ app.include_router(mobile_router)
 app.include_router(swarm_ws_router)
 app.include_router(briefing_router)
 app.include_router(telegram_router)
+app.include_router(swarm_internal_router)
 
 
 @app.get("/", response_class=HTMLResponse)
@@ -11,21 +11,42 @@ from dashboard.store import message_log
 router = APIRouter(prefix="/agents", tags=["agents"])
 templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
 
-AGENT_REGISTRY = {
+# Static metadata for known agents — enriched onto live registry entries.
+_AGENT_METADATA: dict[str, dict] = {
     "timmy": {
-        "id": "timmy",
-        "name": "Timmy",
         "type": "sovereign",
         "model": "llama3.2",
         "backend": "ollama",
         "version": "1.0.0",
-    }
+    },
 }
 
 
 @router.get("")
 async def list_agents():
-    return {"agents": list(AGENT_REGISTRY.values())}
+    """Return all registered agents with live status from the swarm registry."""
+    from swarm import registry as swarm_registry
+    agents = swarm_registry.list_agents()
+    return {
+        "agents": [
+            {
+                "id": a.id,
+                "name": a.name,
+                "status": a.status,
+                "capabilities": a.capabilities,
+                **_AGENT_METADATA.get(a.id, {}),
+            }
+            for a in agents
+        ]
+    }
+
+
+@router.get("/timmy/panel", response_class=HTMLResponse)
+async def timmy_panel(request: Request):
+    """Timmy chat panel — for HTMX main-panel swaps."""
+    from swarm import registry as swarm_registry
+    agent = swarm_registry.get_agent("timmy")
+    return templates.TemplateResponse(request, "partials/timmy_panel.html", {"agent": agent})
 
 
 @router.get("/timmy/history", response_class=HTMLResponse)
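Review note (not part of the diff): the `**_AGENT_METADATA.get(a.id, {})` spread is the whole enrichment mechanism: static metadata is merged onto each live registry entry, and static keys win because the spread comes last. The same pattern in isolation, with made-up entries:

```python
# Static per-agent metadata, keyed by agent id (toy example).
_AGENT_METADATA = {
    "timmy": {"type": "sovereign", "model": "llama3.2"},
}

def enrich(live_agents):
    """Merge static metadata onto live registry entries.

    Keys in _AGENT_METADATA override same-named live fields because their
    spread comes last, matching the precedence used by list_agents().
    """
    return [
        {**a, **_AGENT_METADATA.get(a["id"], {})}
        for a in live_agents
    ]
```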
@@ -73,7 +73,9 @@ def _build_enriched_catalog() -> list[dict]:
         reg = by_name.get(e["name"].lower())
 
         if reg is not None:
-            e["status"] = reg.status  # idle | busy | offline
+            # Timmy is always "active" in the marketplace — it's the sovereign
+            # agent, not just a task worker. Registry idle/busy is internal state.
+            e["status"] = "active" if e["id"] == "timmy" else reg.status
             agent_stats = all_stats.get(reg.id, {})
             e["tasks_completed"] = agent_stats.get("tasks_won", 0)
             e["total_earned"] = agent_stats.get("total_earned", 0)
@@ -97,9 +99,9 @@ async def marketplace_ui(request: Request):
     active = [a for a in agents if a["status"] in ("idle", "busy", "active")]
     planned = [a for a in agents if a["status"] == "planned"]
     return templates.TemplateResponse(
+        request,
         "marketplace.html",
         {
-            "request": request,
             "page_title": "Agent Marketplace",
             "agents": agents,
             "active_count": len(active),
@@ -4,15 +4,17 @@ Provides REST endpoints for managing the swarm: listing agents,
 spawning sub-agents, posting tasks, and viewing auction results.
 """
 
+from datetime import datetime, timezone
 from pathlib import Path
 from typing import Optional
 
-from fastapi import APIRouter, Form, Request
+from fastapi import APIRouter, Form, HTTPException, Request
 from fastapi.responses import HTMLResponse
 from fastapi.templating import Jinja2Templates
 
+from swarm import registry
 from swarm.coordinator import coordinator
-from swarm.tasks import TaskStatus
+from swarm.tasks import TaskStatus, update_task
 
 router = APIRouter(prefix="/swarm", tags=["swarm"])
 templates = Jinja2Templates(directory=str(Path(__file__).parent.parent / "templates"))
@@ -28,8 +30,7 @@ async def swarm_status():
 async def swarm_live_page(request: Request):
     """Render the live swarm dashboard page."""
     return templates.TemplateResponse(
-        "swarm_live.html",
-        {"request": request, "page_title": "Swarm Live"},
+        request, "swarm_live.html", {"page_title": "Swarm Live"}
     )
 
 
@@ -127,3 +128,137 @@ async def get_task(task_id: str):
         "created_at": task.created_at,
         "completed_at": task.completed_at,
     }
+
+
+@router.post("/tasks/{task_id}/complete")
+async def complete_task(task_id: str, result: str = Form(...)):
+    """Mark a task completed — called by agent containers."""
+    task = coordinator.complete_task(task_id, result)
+    if task is None:
+        raise HTTPException(404, "Task not found")
+    return {"task_id": task_id, "status": task.status.value}
+
+
+# ── UI endpoints (return HTML partials for HTMX) ─────────────────────────────
+
+@router.get("/agents/sidebar", response_class=HTMLResponse)
+async def agents_sidebar(request: Request):
+    """Sidebar partial: all registered agents."""
+    agents = coordinator.list_swarm_agents()
+    return templates.TemplateResponse(
+        request, "partials/swarm_agents_sidebar.html", {"agents": agents}
+    )
+
+
+@router.get("/agents/{agent_id}/panel", response_class=HTMLResponse)
+async def agent_panel(agent_id: str, request: Request):
+    """Main-panel partial: agent detail + chat + task history."""
+    agent = registry.get_agent(agent_id)
+    if agent is None:
+        raise HTTPException(404, "Agent not found")
+    all_tasks = coordinator.list_tasks()
+    agent_tasks = [t for t in all_tasks if t.assigned_agent == agent_id][-10:]
+    return templates.TemplateResponse(
+        request,
+        "partials/agent_panel.html",
+        {"agent": agent, "tasks": agent_tasks},
+    )
+
+
+@router.post("/agents/{agent_id}/message", response_class=HTMLResponse)
+async def message_agent(agent_id: str, request: Request, message: str = Form(...)):
+    """Send a direct message to an agent (creates + assigns a task)."""
+    agent = registry.get_agent(agent_id)
+    if agent is None:
+        raise HTTPException(404, "Agent not found")
+
+    timestamp = datetime.now(timezone.utc).strftime("%H:%M:%S")
+
+    # Timmy: route through his AI backend
+    if agent_id == "timmy":
+        result_text = error_text = None
+        try:
+            from timmy.agent import create_timmy
+            run = create_timmy().run(message, stream=False)
+            result_text = run.content if hasattr(run, "content") else str(run)
+        except Exception as exc:
+            error_text = f"Timmy is offline: {exc}"
+        return templates.TemplateResponse(
+            request,
+            "partials/agent_chat_msg.html",
+            {
+                "message": message,
+                "agent": agent,
+                "response": result_text,
+                "error": error_text,
+                "timestamp": timestamp,
+                "task_id": None,
+            },
+        )
+
+    # Other agents: create a task and assign directly
+    task = coordinator.post_task(message)
+    coordinator.auctions.open_auction(task.id)
+    coordinator.auctions.submit_bid(task.id, agent_id, 1)
+    coordinator.auctions.close_auction(task.id)
+    update_task(task.id, status=TaskStatus.ASSIGNED, assigned_agent=agent_id)
+    registry.update_status(agent_id, "busy")
+
+    return templates.TemplateResponse(
+        request,
+        "partials/agent_chat_msg.html",
+        {
+            "message": message,
+            "agent": agent,
+            "response": None,
+            "error": None,
+            "timestamp": timestamp,
+            "task_id": task.id,
+        },
+    )
+
+
+@router.get("/tasks/panel", response_class=HTMLResponse)
+async def task_create_panel(request: Request, agent_id: Optional[str] = None):
+    """Task creation panel, optionally pre-selecting an agent."""
+    agents = coordinator.list_swarm_agents()
+    return templates.TemplateResponse(
+        request,
+        "partials/task_assign_panel.html",
+        {"agents": agents, "preselected_agent_id": agent_id},
+    )
+
+
+@router.post("/tasks/direct", response_class=HTMLResponse)
+async def direct_assign_task(
+    request: Request,
+    description: str = Form(...),
+    agent_id: Optional[str] = Form(None),
+):
+    """Create a task: assign directly if agent_id given, else open auction."""
+    timestamp = datetime.now(timezone.utc).strftime("%H:%M:%S")
+
+    if agent_id:
+        agent = registry.get_agent(agent_id)
+        task = coordinator.post_task(description)
+        coordinator.auctions.open_auction(task.id)
+        coordinator.auctions.submit_bid(task.id, agent_id, 1)
+        coordinator.auctions.close_auction(task.id)
+        update_task(task.id, status=TaskStatus.ASSIGNED, assigned_agent=agent_id)
+        registry.update_status(agent_id, "busy")
+        agent_name = agent.name if agent else agent_id
+    else:
+        task = coordinator.post_task(description)
+        winner = await coordinator.run_auction_and_assign(task.id)
+        task = coordinator.get_task(task.id)
+        agent_name = winner.agent_id if winner else "unassigned"
+
+    return templates.TemplateResponse(
+        request,
+        "partials/task_result.html",
+        {
+            "task": task,
+            "agent_name": agent_name,
+            "timestamp": timestamp,
+        },
+    )
115  src/dashboard/routes/swarm_internal.py  Normal file
@@ -0,0 +1,115 @@
+"""Internal swarm HTTP API — for Docker container agents.
+
+Container agents can't use the in-memory SwarmComms channel, so they poll
+these lightweight endpoints to participate in the auction system.
+
+Routes
+------
+GET /internal/tasks
+    Returns all tasks currently in BIDDING status — the set an agent
+    can submit bids for.
+
+POST /internal/bids
+    Accepts a bid from a container agent and feeds it into the in-memory
+    AuctionManager. The coordinator then closes auctions and assigns
+    winners exactly as it does for in-process agents.
+
+These endpoints are intentionally unauthenticated because they are only
+reachable inside the Docker swarm-net bridge network. Do not expose them
+through a reverse-proxy to the public internet.
+"""
+
+import logging
+from typing import Optional
+
+from fastapi import APIRouter, HTTPException
+from pydantic import BaseModel
+
+from swarm.coordinator import coordinator
+from swarm.tasks import TaskStatus
+
+logger = logging.getLogger(__name__)
+
+router = APIRouter(prefix="/internal", tags=["internal"])
+
+
+# ── Request / response models ─────────────────────────────────────────────────
+
+class BidRequest(BaseModel):
+    task_id: str
+    agent_id: str
+    bid_sats: int
+    capabilities: Optional[str] = ""
+
+
+class BidResponse(BaseModel):
+    accepted: bool
+    task_id: str
+    agent_id: str
+    message: str
+
+
+class TaskSummary(BaseModel):
+    task_id: str
+    description: str
+    status: str
+
+
+# ── Routes ────────────────────────────────────────────────────────────────────
+
+@router.get("/tasks", response_model=list[TaskSummary])
+def list_biddable_tasks():
+    """Return all tasks currently open for bidding.
+
+    Container agents should poll this endpoint and submit bids for any
+    tasks they are capable of handling.
+    """
+    tasks = coordinator.list_tasks(status=TaskStatus.BIDDING)
+    return [
+        TaskSummary(
+            task_id=t.id,
+            description=t.description,
+            status=t.status.value,
+        )
+        for t in tasks
+    ]
+
+
+@router.post("/bids", response_model=BidResponse)
+def submit_bid(bid: BidRequest):
+    """Accept a bid from a container agent.
+
+    The bid is injected directly into the in-memory AuctionManager.
+    If no auction is open for the task (e.g. it already closed), the
+    bid is rejected gracefully — the agent should just move on.
+    """
+    if bid.bid_sats <= 0:
+        raise HTTPException(status_code=422, detail="bid_sats must be > 0")
+
+    accepted = coordinator.auctions.submit_bid(
+        task_id=bid.task_id,
+        agent_id=bid.agent_id,
+        bid_sats=bid.bid_sats,
+    )
+
+    if accepted:
+        # Persist bid in stats table for marketplace analytics
+        from swarm import stats as swarm_stats
+        swarm_stats.record_bid(bid.task_id, bid.agent_id, bid.bid_sats, won=False)
+        logger.info(
+            "Docker agent %s bid %d sats on task %s",
+            bid.agent_id, bid.bid_sats, bid.task_id,
+        )
+        return BidResponse(
+            accepted=True,
+            task_id=bid.task_id,
+            agent_id=bid.agent_id,
+            message="Bid accepted.",
+        )
+
+    return BidResponse(
+        accepted=False,
+        task_id=bid.task_id,
+        agent_id=bid.agent_id,
+        message="No open auction for this task — it may have already closed.",
+    )
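Review note (not part of the diff): a container agent consumes this API as poll, filter, bid. One polling cycle can be sketched with the HTTP layer injected as callables; the capability heuristic and all names below are illustrative, not the actual `swarm.agent_runner` code:

```python
def poll_and_bid(fetch_tasks, post_bid, agent_id, capabilities, bid_sats=100):
    """One polling cycle against the internal swarm API (sketch).

    fetch_tasks: callable returning the JSON list from GET /internal/tasks
    post_bid:    callable taking one bid payload, i.e. POST /internal/bids
    In a real container agent these would wrap HTTP calls against the
    dashboard service; here they are injected so the logic is testable.
    Returns the task_ids bid on.
    """
    bid_on = []
    for task in fetch_tasks():
        # Toy capability match: bid when any capability keyword appears
        # in the task description.
        if any(cap in task["description"].lower() for cap in capabilities):
            post_bid({
                "task_id": task["task_id"],
                "agent_id": agent_id,
                "bid_sats": bid_sats,
                "capabilities": ",".join(sorted(capabilities)),
            })
            bid_on.append(task["task_id"])
    return bid_on
```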
@@ -5,13 +5,13 @@
 <meta name="viewport" content="width=device-width, initial-scale=1.0, viewport-fit=cover" />
 <meta name="apple-mobile-web-app-capable" content="yes" />
 <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent" />
-<meta name="theme-color" content="#060d14" />
+<meta name="theme-color" content="#080412" />
 <title>{% block title %}Timmy Time — Mission Control{% endblock %}</title>
 <link rel="preconnect" href="https://fonts.googleapis.com" />
 <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
 <link href="https://fonts.googleapis.com/css2?family=JetBrains+Mono:wght@300;400;500;700&display=swap" rel="stylesheet" />
 <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet" integrity="sha384-QWTKZyjpPEjISv5WaRU9OFeRpok6YctnYmDr5pNlyT2bRjXh0JMhjY6hW+ALEwIH" crossorigin="anonymous" />
-<link rel="stylesheet" href="/static/style.css" />
+<link rel="stylesheet" href="/static/style.css?v=2" />
 <script src="https://unpkg.com/htmx.org@2.0.3" integrity="sha384-0895/pl2MU10Hqc6jd4RvrthNlDiE9U1tWmX7WRESftEDRosgxNsQG/Ze9YMRzHq" crossorigin="anonymous"></script>
 </head>
 <body>
@@ -8,22 +8,15 @@
     <!-- Sidebar -->
     <div class="col-12 col-md-3 d-flex flex-column gap-3 mc-sidebar">
 
-      <!-- Agents -->
-      <div class="card mc-panel">
+      <!-- Agents (HTMX-polled from registry) -->
+      <div class="card mc-panel"
+           hx-get="/swarm/agents/sidebar"
+           hx-trigger="load, every 10s"
+           hx-target="this"
+           hx-swap="innerHTML">
         <div class="card-header mc-panel-header">// AGENTS</div>
         <div class="card-body p-3">
-          <div class="mc-agent-card">
-            <div class="d-flex align-items-center gap-2 mb-2">
-              <span class="status-dot amber"></span>
-              <span class="agent-name">TIMMY</span>
-            </div>
-            <div class="agent-meta">
-              <span class="meta-key">TYPE</span> <span class="meta-val">sovereign</span><br>
-              <span class="meta-key">MODEL</span> <span class="meta-val">llama3.2</span><br>
-              <span class="meta-key">BACKEND</span> <span class="meta-val">ollama</span><br>
-              <span class="meta-key">VERSION</span> <span class="meta-val">1.0.0</span>
-            </div>
-          </div>
+          <div style="font-size:11px; color:var(--text-dim); letter-spacing:.08em;">LOADING...</div>
         </div>
       </div>
 
@@ -43,49 +36,13 @@
 
       </div>
 
-      <!-- Chat Panel -->
-      <div class="col-12 col-md-9 d-flex flex-column mc-chat-panel">
-        <div class="card mc-panel flex-grow-1 d-flex flex-column min-h-0">
-          <div class="card-header mc-panel-header d-flex justify-content-between align-items-center">
-            <span>// TIMMY INTERFACE</span>
-            <button class="mc-btn-clear"
-                    hx-delete="/agents/timmy/history"
-                    hx-target="#chat-log"
-                    hx-swap="innerHTML"
-                    hx-confirm="Clear conversation history?">CLEAR</button>
-          </div>
-
-          <div class="chat-log flex-grow-1 overflow-auto p-3" id="chat-log"
-               hx-get="/agents/timmy/history"
-               hx-trigger="load"
-               hx-swap="innerHTML"></div>
-
-          <div class="card-footer mc-chat-footer">
-            <form hx-post="/agents/timmy/chat"
-                  hx-target="#chat-log"
-                  hx-swap="beforeend"
-                  hx-indicator="#send-indicator"
-                  hx-sync="this:drop"
-                  hx-disabled-elt="find button"
-                  hx-on::after-settle="this.reset(); scrollChat()"
-                  class="d-flex gap-2">
-              <input type="text"
-                     name="message"
-                     class="form-control mc-input"
-                     placeholder="send a message to timmy..."
-                     autocomplete="off"
-                     autocorrect="off"
-                     autocapitalize="none"
-                     spellcheck="false"
-                     enterkeyhint="send"
-                     required />
-              <button type="submit" class="btn mc-btn-send">
-                SEND
-                <span id="send-indicator" class="htmx-indicator">▋</span>
-              </button>
-            </form>
-          </div>
-        </div>
-      </div>
+      <!-- Main panel — swappable via HTMX; defaults to Timmy on load -->
+      <div id="main-panel"
+           class="col-12 col-md-9 d-flex flex-column mc-chat-panel"
+           hx-get="/agents/timmy/panel"
+           hx-trigger="load"
+           hx-target="#main-panel"
+           hx-swap="outerHTML">
       </div>
 
     </div>
@@ -94,9 +51,12 @@
   <script>
     function scrollChat() {
      const log = document.getElementById('chat-log');
-      log.scrollTop = log.scrollHeight;
+      if (log) log.scrollTop = log.scrollHeight;
+    }
+    function scrollAgentLog(id) {
+      const log = document.getElementById('agent-log-' + id);
+      if (log) log.scrollTop = log.scrollHeight;
     }
-    scrollChat();
   </script>
 
 {% endblock %}
27  src/dashboard/templates/partials/agent_chat_msg.html  Normal file
@@ -0,0 +1,27 @@
+<div class="chat-message user">
+  <div class="msg-meta" style="color:var(--orange);">YOU // {{ timestamp }}</div>
+  <div class="msg-body" style="border-color:var(--border-glow);">{{ message }}</div>
+</div>
+
+{% if response %}
+<div class="chat-message agent">
+  <div class="msg-meta" style="color:var(--purple);">{{ agent.name | upper }} // {{ timestamp }}</div>
+  <div class="msg-body" style="border-left:3px solid var(--purple);">{{ response }}</div>
+</div>
+
+{% elif error %}
+<div class="chat-message error-msg">
+  <div class="msg-meta">SYSTEM // {{ timestamp }}</div>
+  <div class="msg-body" style="border-left:3px solid var(--red); color:var(--red);">{{ error }}</div>
+</div>
+
+{% elif task_id %}
+<div class="chat-message agent">
+  <div class="msg-meta" style="color:var(--purple);">{{ agent.name | upper }} // {{ timestamp }}</div>
+  <div class="msg-body" style="border-left:3px solid var(--purple);">
+    <span style="color:var(--text-dim); font-size:11px;">TASK ASSIGNED</span><br>
+    <span style="color:var(--amber);">{{ task_id[:8] }}…</span>
+    <span style="color:var(--text-dim); font-size:11px;"> · awaiting execution</span>
+  </div>
+</div>
+{% endif %}
82  src/dashboard/templates/partials/agent_panel.html  Normal file
@@ -0,0 +1,82 @@
+{% set dot = "green" if agent.status == "idle" else ("amber" if agent.status == "busy" else "red") %}
+
+<div id="main-panel" class="col-12 col-md-9 d-flex flex-column mc-chat-panel">
+  <div class="card mc-panel flex-grow-1 d-flex flex-column min-h-0">
+
+    <!-- Header -->
+    <div class="card-header mc-panel-header d-flex justify-content-between align-items-center">
+      <span>
+        <span class="status-dot {{ dot }}" style="margin-right:6px;"></span>
+        // {{ agent.name | upper }}
+        <span style="font-size:10px; color:var(--text-dim); margin-left:10px; letter-spacing:.1em;">
+          {{ agent.capabilities or "no capabilities listed" }}
+        </span>
+      </span>
+      <button class="mc-btn-clear"
+              hx-get="/agents/timmy/panel"
+              hx-target="#main-panel"
+              hx-swap="outerHTML">← TIMMY</button>
+    </div>
+
+    <!-- Message log -->
+    <div class="chat-log flex-grow-1 overflow-auto p-3" id="agent-log-{{ agent.id }}">
+
+      {% if tasks %}
+      <div style="font-size:10px; color:var(--text-dim); letter-spacing:.1em; margin-bottom:12px;">
+        RECENT TASKS
+      </div>
+      {% for task in tasks %}
+      <div class="chat-message" style="margin-bottom:10px;">
+        <div class="msg-meta">
+          TASK · {{ task.status.value | upper }} · {{ task.created_at[:19].replace("T"," ") }}
+        </div>
+        <div class="msg-body" style="border-left: 3px solid var(--{% if task.status.value == 'completed' %}green{% elif task.status.value == 'failed' %}red{% else %}orange{% endif %});">
+          <div style="color:var(--text-dim); font-size:11px; margin-bottom:4px;">{{ task.description }}</div>
+          {% if task.result %}
+          <div style="color:var(--text-bright);">{{ task.result }}</div>
+          {% endif %}
+        </div>
+      </div>
+      {% endfor %}
+      <hr style="border-color:var(--border); margin:12px 0;">
+      {% endif %}
+
+      <div id="agent-messages-{{ agent.id }}"></div>
+
+    </div>
+
+    <!-- Input -->
+    <div class="card-footer mc-chat-footer">
+      <form hx-post="/swarm/agents/{{ agent.id }}/message"
+            hx-target="#agent-messages-{{ agent.id }}"
+            hx-swap="beforeend"
+            hx-indicator="#agent-send-indicator"
+            hx-disabled-elt="find button"
+            hx-on::after-settle="this.reset(); scrollAgentLog('{{ agent.id }}')"
+            class="d-flex gap-2">
+        <input type="text"
+               name="message"
+               class="form-control mc-input"
+               placeholder="send a message to {{ agent.name | lower }}..."
+               autocomplete="off"
+               autocorrect="off"
+               autocapitalize="none"
+               spellcheck="false"
+               enterkeyhint="send"
+               required />
+        <button type="submit" class="btn mc-btn-send">
+          SEND
+          <span id="agent-send-indicator" class="htmx-indicator">▋</span>
+        </button>
+      </form>
+    </div>
+
+  </div>
+</div>
+
+<script>
+  function scrollAgentLog(id) {
+    const log = document.getElementById('agent-log-' + id);
+    if (log) log.scrollTop = log.scrollHeight;
+  }
+</script>
50  src/dashboard/templates/partials/swarm_agents_sidebar.html  Normal file
@@ -0,0 +1,50 @@
+<div class="card-header mc-panel-header">// SWARM AGENTS</div>
+<div class="card-body p-2 d-flex flex-column gap-2">
+
+  {% if not agents %}
+  <div style="font-size:11px; color:var(--text-dim); padding:8px 4px; letter-spacing:.08em;">
+    NO AGENTS REGISTERED
+  </div>
+  {% endif %}
+
+  {% for agent in agents %}
+  {% set dot = "green" if agent.status == "idle" else ("amber" if agent.status == "busy" else "red") %}
+  <div class="mc-agent-card">
+
+    <div class="d-flex align-items-center gap-2 mb-1">
+      <span class="status-dot {{ dot }}"></span>
+      <span class="agent-name" style="font-size:13px;">{{ agent.name | upper }}</span>
+    </div>
+
+    <div class="agent-meta" style="margin-bottom:8px;">
+      <span class="meta-key">STATUS</span>
+      <span class="meta-val">{{ agent.status }}</span><br>
+      {% if agent.capabilities %}
+      <span class="meta-key">CAPS</span>
+      <span class="meta-val" style="font-size:10px;">{{ agent.capabilities }}</span><br>
+      {% endif %}
+      <span class="meta-key">SEEN</span>
+      <span class="meta-val" style="font-size:10px;">{{ agent.last_seen[:19].replace("T"," ") if agent.last_seen else "—" }}</span>
+    </div>
+
+    <div class="d-flex gap-1">
+      <button class="mc-btn-clear flex-fill"
+              style="font-size:9px; padding:4px 6px;"
+              hx-get="/swarm/agents/{{ agent.id }}/panel"
+              hx-target="#main-panel"
+              hx-swap="outerHTML">
+        CHAT
+      </button>
+      <button class="mc-btn-clear flex-fill"
+              style="font-size:9px; padding:4px 6px;"
+              hx-get="/swarm/tasks/panel?agent_id={{ agent.id }}"
+              hx-target="#main-panel"
+              hx-swap="outerHTML">
+        TASK
+      </button>
+    </div>
+
+  </div>
+  {% endfor %}
+
+</div>
60  src/dashboard/templates/partials/task_assign_panel.html  Normal file
@@ -0,0 +1,60 @@
+<div id="main-panel" class="col-12 col-md-9 d-flex flex-column mc-chat-panel">
+  <div class="card mc-panel flex-grow-1 d-flex flex-column min-h-0">
+
+    <div class="card-header mc-panel-header d-flex justify-content-between align-items-center">
+      <span>// CREATE TASK</span>
+      <button class="mc-btn-clear"
+              hx-get="/agents/timmy/panel"
+              hx-target="#main-panel"
+              hx-swap="outerHTML">← TIMMY</button>
+    </div>
+
+    <div class="flex-grow-1 overflow-auto p-3">
+
+      <form hx-post="/swarm/tasks/direct"
+            hx-target="#task-result"
+            hx-swap="innerHTML"
+            hx-disabled-elt="find button"
+            class="d-flex flex-column gap-3">
+
+        <div>
+          <div style="font-size:10px; color:var(--text-dim); letter-spacing:.15em; margin-bottom:6px;">
+            DESCRIPTION
+          </div>
+          <textarea name="description"
+                    class="form-control mc-input"
+                    rows="4"
+                    placeholder="describe what you need done..."
+                    required
+                    style="resize:vertical;"></textarea>
+        </div>
+
+        <div>
+          <div style="font-size:10px; color:var(--text-dim); letter-spacing:.15em; margin-bottom:6px;">
+            ASSIGN TO
+          </div>
+          <select name="agent_id" class="form-control mc-input">
+            <option value="">— open auction (lowest bid wins) —</option>
+            {% for agent in agents %}
+            {% set selected = "selected" if agent.id == preselected_agent_id else "" %}
+            <option value="{{ agent.id }}" {{ selected }}>
+              {{ agent.name | upper }}
+              {% if agent.capabilities %} · {{ agent.capabilities }}{% endif %}
+              · {{ agent.status }}
+            </option>
+            {% endfor %}
+          </select>
+        </div>
+
+        <button type="submit" class="btn mc-btn-send" style="align-self:flex-end; padding:10px 28px;">
+          POST TASK ▶
+        </button>
+
+      </form>
+
+      <div id="task-result" style="margin-top:20px;"></div>
+
+    </div>
+
+  </div>
+</div>
28
src/dashboard/templates/partials/task_result.html
Normal file
@@ -0,0 +1,28 @@
{% set status_color = "green" if task.status.value == "completed" else ("red" if task.status.value == "failed" else "amber") %}
<div class="mc-agent-card" style="border-left: 3px solid var(--{{ status_color }});">
  <div style="font-size:10px; color:var(--text-dim); letter-spacing:.12em; margin-bottom:6px;">
    TASK POSTED · {{ timestamp }}
  </div>
  <div style="font-size:12px; color:var(--text-bright); margin-bottom:8px;">
    {{ task.description }}
  </div>
  <div class="d-flex gap-3" style="font-size:11px;">
    <span>
      <span style="color:var(--text-dim);">STATUS </span>
      <span style="color:var(--{{ status_color }});">{{ task.status.value | upper }}</span>
    </span>
    <span>
      <span style="color:var(--text-dim);">AGENT </span>
      <span style="color:var(--purple);">{{ agent_name | upper }}</span>
    </span>
    <span>
      <span style="color:var(--text-dim);">ID </span>
      <span style="color:var(--text-dim);">{{ task.id[:8] }}…</span>
    </span>
  </div>
  {% if task.result %}
  <div style="margin-top:8px; font-size:12px; color:var(--text); border-top:1px solid var(--border); padding-top:8px;">
    {{ task.result }}
  </div>
  {% endif %}
</div>
58
src/dashboard/templates/partials/timmy_panel.html
Normal file
@@ -0,0 +1,58 @@
<div id="main-panel" class="col-12 col-md-9 d-flex flex-column mc-chat-panel">
  <div class="card mc-panel flex-grow-1 d-flex flex-column min-h-0">

    <div class="card-header mc-panel-header d-flex justify-content-between align-items-center">
      <span>
        {% if agent %}
        <span class="status-dot {{ 'green' if agent.status == 'idle' else 'amber' }}" style="margin-right:6px;"></span>
        {% endif %}
        // TIMMY INTERFACE
      </span>
      <button class="mc-btn-clear"
              hx-delete="/agents/timmy/history"
              hx-target="#chat-log"
              hx-swap="innerHTML"
              hx-confirm="Clear conversation history?">CLEAR</button>
    </div>

    <div class="chat-log flex-grow-1 overflow-auto p-3" id="chat-log"
         hx-get="/agents/timmy/history"
         hx-trigger="load"
         hx-swap="innerHTML"></div>

    <div class="card-footer mc-chat-footer">
      <form hx-post="/agents/timmy/chat"
            hx-target="#chat-log"
            hx-swap="beforeend"
            hx-indicator="#send-indicator"
            hx-sync="this:drop"
            hx-disabled-elt="find button"
            hx-on::after-settle="this.reset(); scrollChat()"
            class="d-flex gap-2">
        <input type="text"
               name="message"
               class="form-control mc-input"
               placeholder="send a message to timmy..."
               autocomplete="off"
               autocorrect="off"
               autocapitalize="none"
               spellcheck="false"
               enterkeyhint="send"
               required />
        <button type="submit" class="btn mc-btn-send">
          SEND
          <span id="send-indicator" class="htmx-indicator">▋</span>
        </button>
      </form>
    </div>

  </div>
</div>

<script>
  function scrollChat() {
    const log = document.getElementById('chat-log');
    if (log) log.scrollTop = log.scrollHeight;
  }
  scrollChat();
</script>
@@ -1,17 +1,37 @@
 """Sub-agent runner — entry point for spawned swarm agents.
 
-This module is executed as a subprocess by swarm.manager. It creates a
-SwarmNode, joins the registry, and waits for tasks.
+This module is executed as a subprocess (or Docker container) by
+swarm.manager / swarm.docker_runner. It creates a SwarmNode, joins the
+registry, and waits for tasks.
 
-Usage:
+Comms mode is detected automatically:
+
+- **In-process / subprocess** (no ``COORDINATOR_URL`` env var):
+  Uses the shared in-memory SwarmComms channel directly.
+
+- **Docker container** (``COORDINATOR_URL`` is set):
+  Polls ``GET /internal/tasks`` and submits bids via
+  ``POST /internal/bids`` over HTTP. No in-memory state is shared
+  across the container boundary.
+
+Usage
+-----
+::
+
+    # Subprocess (existing behaviour — unchanged)
     python -m swarm.agent_runner --agent-id <id> --name <name>
+
+    # Docker (coordinator_url injected via env)
+    COORDINATOR_URL=http://dashboard:8000 \
+    python -m swarm.agent_runner --agent-id <id> --name <name>
 """
 
 import argparse
 import asyncio
 import logging
+import os
+import random
 import signal
-import sys
 
 logging.basicConfig(
     level=logging.INFO,
@@ -20,6 +40,92 @@ logging.basicConfig(
 )
 logger = logging.getLogger(__name__)
 
+# How often a Docker agent polls for open tasks (seconds)
+_HTTP_POLL_INTERVAL = 5
+
+
+# ── In-process mode ───────────────────────────────────────────────────────────
+
+async def _run_inprocess(agent_id: str, name: str, stop: asyncio.Event) -> None:
+    """Run the agent using the shared in-memory SwarmComms channel."""
+    from swarm.swarm_node import SwarmNode
+
+    node = SwarmNode(agent_id, name)
+    await node.join()
+    logger.info("Agent %s (%s) running (in-process mode) — waiting for tasks", name, agent_id)
+    try:
+        await stop.wait()
+    finally:
+        await node.leave()
+        logger.info("Agent %s (%s) shut down", name, agent_id)
+
+
+# ── HTTP (Docker) mode ────────────────────────────────────────────────────────
+
+async def _run_http(
+    agent_id: str,
+    name: str,
+    coordinator_url: str,
+    capabilities: str,
+    stop: asyncio.Event,
+) -> None:
+    """Run the agent by polling the coordinator's internal HTTP API."""
+    try:
+        import httpx
+    except ImportError:
+        logger.error("httpx is required for HTTP mode — install with: pip install httpx")
+        return
+
+    from swarm import registry
+
+    # Register in SQLite so the coordinator can see us
+    registry.register(name=name, capabilities=capabilities, agent_id=agent_id)
+    logger.info(
+        "Agent %s (%s) running (HTTP mode) — polling %s every %ds",
+        name, agent_id, coordinator_url, _HTTP_POLL_INTERVAL,
+    )
+
+    base = coordinator_url.rstrip("/")
+    seen_tasks: set[str] = set()
+
+    async with httpx.AsyncClient(timeout=10.0) as client:
+        while not stop.is_set():
+            try:
+                resp = await client.get(f"{base}/internal/tasks")
+                if resp.status_code == 200:
+                    tasks = resp.json()
+                    for task in tasks:
+                        task_id = task["task_id"]
+                        if task_id in seen_tasks:
+                            continue
+                        seen_tasks.add(task_id)
+                        bid_sats = random.randint(10, 100)
+                        await client.post(
+                            f"{base}/internal/bids",
+                            json={
+                                "task_id": task_id,
+                                "agent_id": agent_id,
+                                "bid_sats": bid_sats,
+                                "capabilities": capabilities,
+                            },
+                        )
+                        logger.info(
+                            "Agent %s bid %d sats on task %s",
+                            name, bid_sats, task_id,
+                        )
+            except Exception as exc:
+                logger.warning("HTTP poll error: %s", exc)
+
+            try:
+                await asyncio.wait_for(stop.wait(), timeout=_HTTP_POLL_INTERVAL)
+            except asyncio.TimeoutError:
+                pass  # normal — just means the stop event wasn't set
+
+    registry.update_status(agent_id, "offline")
+    logger.info("Agent %s (%s) shut down", name, agent_id)
+
+
+# ── Entry point ───────────────────────────────────────────────────────────────
+
 async def main() -> None:
     parser = argparse.ArgumentParser(description="Swarm sub-agent runner")
@@ -27,29 +133,24 @@ async def main() -> None:
     parser.add_argument("--name", required=True, help="Human-readable agent name")
     args = parser.parse_args()
 
-    # Lazy import to avoid circular deps at module level
-    from swarm.swarm_node import SwarmNode
-
-    node = SwarmNode(args.agent_id, args.name)
-    await node.join()
-
-    logger.info("Agent %s (%s) running — waiting for tasks", args.name, args.agent_id)
-
-    # Run until terminated
+    agent_id = args.agent_id
+    name = args.name
+    coordinator_url = os.environ.get("COORDINATOR_URL", "")
+    capabilities = os.environ.get("AGENT_CAPABILITIES", "")
+
     stop = asyncio.Event()
 
     def _handle_signal(*_):
-        logger.info("Agent %s received shutdown signal", args.name)
+        logger.info("Agent %s received shutdown signal", name)
         stop.set()
 
     for sig in (signal.SIGTERM, signal.SIGINT):
        signal.signal(sig, _handle_signal)
 
-    try:
-        await stop.wait()
-    finally:
-        await node.leave()
-        logger.info("Agent %s (%s) shut down", args.name, args.agent_id)
+    if coordinator_url:
+        await _run_http(agent_id, name, coordinator_url, capabilities, stop)
+    else:
+        await _run_inprocess(agent_id, name, stop)
 
 
 if __name__ == "__main__":
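The HTTP-mode poll loop above reduces to a small dedupe-and-bid decision: skip tasks already bid on, bid a random placeholder amount on the rest. A minimal self-contained sketch of that step (plain function with hypothetical names, no network calls):

```python
import random

def select_new_bids(tasks, seen_tasks, low=10, high=100):
    """Return (task_id, bid_sats) pairs for tasks not yet bid on.

    Mirrors the dedupe step in _run_http: each task id is bid on at most
    once per process lifetime; bid amounts are random placeholder values.
    """
    bids = []
    for task in tasks:
        task_id = task["task_id"]
        if task_id in seen_tasks:
            continue
        seen_tasks.add(task_id)
        bids.append((task_id, random.randint(low, high)))
    return bids

seen: set = set()
first = select_new_bids([{"task_id": "a"}, {"task_id": "b"}], seen)
second = select_new_bids([{"task_id": "a"}, {"task_id": "c"}], seen)
print([t for t, _ in first])   # ['a', 'b']
print([t for t, _ in second])  # ['c'] — 'a' was already bid on
```

Keeping `seen_tasks` caller-owned (as the real loop does) is what makes the poll idempotent across intervals: re-fetching the same open task list never produces duplicate bids.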
@@ -14,6 +14,7 @@ from typing import Optional
 from swarm.bidder import AuctionManager, Bid
 from swarm.comms import SwarmComms
 from swarm.manager import SwarmManager
+from swarm.recovery import reconcile_on_startup
 from swarm.registry import AgentRecord
 from swarm import registry
 from swarm import stats as swarm_stats
@@ -37,6 +38,7 @@ class SwarmCoordinator:
         self.auctions = AuctionManager()
         self.comms = SwarmComms()
         self._in_process_nodes: list = []
+        self._recovery_summary = reconcile_on_startup()
 
     # ── Agent lifecycle ─────────────────────────────────────────────────────
187
src/swarm/docker_runner.py
Normal file
@@ -0,0 +1,187 @@
"""Docker-backed agent runner — spawn swarm agents as isolated containers.

Drop-in complement to SwarmManager. Instead of Python subprocesses,
DockerAgentRunner launches each agent as a Docker container that shares
the data volume and communicates with the coordinator over HTTP.

Requirements
------------
- Docker Engine running on the host (``docker`` CLI in PATH)
- The ``timmy-time:latest`` image already built (``make docker-build``)
- ``data/`` directory exists and is mounted at ``/app/data`` in each container

Communication
-------------
Container agents use the coordinator's internal HTTP API rather than the
in-memory SwarmComms channel::

    GET  /internal/tasks  → poll for tasks open for bidding
    POST /internal/bids   → submit a bid

The ``COORDINATOR_URL`` env var tells agents where to reach the coordinator.
Inside the docker-compose network this is ``http://dashboard:8000``.
From the host it is typically ``http://localhost:8000``.

Usage
-----
::

    from swarm.docker_runner import DockerAgentRunner

    runner = DockerAgentRunner()
    info = runner.spawn("Echo", capabilities="summarise,translate")
    print(info)  # {"container_id": "...", "name": "Echo", "agent_id": "..."}

    runner.stop(info["container_id"])
    runner.stop_all()
"""

import logging
import subprocess
import uuid
from dataclasses import dataclass
from typing import Optional

logger = logging.getLogger(__name__)

DEFAULT_IMAGE = "timmy-time:latest"
DEFAULT_COORDINATOR_URL = "http://dashboard:8000"


@dataclass
class ManagedContainer:
    container_id: str
    agent_id: str
    name: str
    image: str
    capabilities: str = ""


class DockerAgentRunner:
    """Spawn and manage swarm agents as Docker containers."""

    def __init__(
        self,
        image: str = DEFAULT_IMAGE,
        coordinator_url: str = DEFAULT_COORDINATOR_URL,
        extra_env: Optional[dict] = None,
    ) -> None:
        self.image = image
        self.coordinator_url = coordinator_url
        self.extra_env = extra_env or {}
        self._containers: dict[str, ManagedContainer] = {}

    # ── Public API ────────────────────────────────────────────────────────────

    def spawn(
        self,
        name: str,
        agent_id: Optional[str] = None,
        capabilities: str = "",
        image: Optional[str] = None,
    ) -> dict:
        """Spawn a new agent container and return its info dict.

        The container runs ``python -m swarm.agent_runner`` and communicates
        with the coordinator over HTTP via ``COORDINATOR_URL``.
        """
        aid = agent_id or str(uuid.uuid4())
        img = image or self.image
        container_name = f"timmy-agent-{aid[:8]}"

        env_flags = self._build_env_flags(aid, name, capabilities)

        cmd = [
            "docker", "run",
            "--detach",
            "--name", container_name,
            "--network", "timmy-time_swarm-net",
            "--volume", "timmy-time_timmy-data:/app/data",
            "--add-host", "host.docker.internal:host-gateway",
            *env_flags,
            img,
            "python", "-m", "swarm.agent_runner",
            "--agent-id", aid,
            "--name", name,
        ]

        try:
            result = subprocess.run(
                cmd, capture_output=True, text=True, timeout=15
            )
            if result.returncode != 0:
                raise RuntimeError(result.stderr.strip())
            container_id = result.stdout.strip()
        except FileNotFoundError:
            raise RuntimeError(
                "Docker CLI not found. Is Docker Desktop running?"
            )

        managed = ManagedContainer(
            container_id=container_id,
            agent_id=aid,
            name=name,
            image=img,
            capabilities=capabilities,
        )
        self._containers[container_id] = managed
        logger.info(
            "Docker agent %s (%s) started — container %s",
            name, aid, container_id[:12],
        )
        return {
            "container_id": container_id,
            "agent_id": aid,
            "name": name,
            "image": img,
            "capabilities": capabilities,
        }

    def stop(self, container_id: str) -> bool:
        """Stop and remove a container agent."""
        try:
            subprocess.run(
                ["docker", "rm", "-f", container_id],
                capture_output=True, timeout=10,
            )
            self._containers.pop(container_id, None)
            logger.info("Docker agent container %s stopped", container_id[:12])
            return True
        except Exception as exc:
            logger.error("Failed to stop container %s: %s", container_id[:12], exc)
            return False

    def stop_all(self) -> int:
        """Stop all containers managed by this runner."""
        ids = list(self._containers.keys())
        stopped = sum(1 for cid in ids if self.stop(cid))
        return stopped

    def list_containers(self) -> list[ManagedContainer]:
        return list(self._containers.values())

    def is_running(self, container_id: str) -> bool:
        """Return True if the container is currently running."""
        try:
            result = subprocess.run(
                ["docker", "inspect", "--format", "{{.State.Running}}", container_id],
                capture_output=True, text=True, timeout=5,
            )
            return result.stdout.strip() == "true"
        except Exception:
            return False

    # ── Internal ──────────────────────────────────────────────────────────────

    def _build_env_flags(self, agent_id: str, name: str, capabilities: str) -> list[str]:
        env = {
            "COORDINATOR_URL": self.coordinator_url,
            "AGENT_NAME": name,
            "AGENT_ID": agent_id,
            "AGENT_CAPABILITIES": capabilities,
            **self.extra_env,
        }
        flags = []
        for k, v in env.items():
            flags += ["--env", f"{k}={v}"]
        return flags
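The `_build_env_flags` helper above is just a mapping flattened into repeated `--env K=V` pairs for the `docker run` argv. A standalone sketch of the same transformation (extracted as a free function for illustration):

```python
def build_env_flags(env: dict) -> list:
    """Flatten an env mapping into repeated ``--env K=V`` CLI flags.

    Dict insertion order is preserved (Python 3.7+), so the flags come
    out in the order the variables were declared.
    """
    flags = []
    for k, v in env.items():
        flags += ["--env", f"{k}={v}"]
    return flags

flags = build_env_flags({
    "COORDINATOR_URL": "http://dashboard:8000",
    "AGENT_NAME": "Echo",
})
print(flags)
# ['--env', 'COORDINATOR_URL=http://dashboard:8000', '--env', 'AGENT_NAME=Echo']
```

Passing env vars as explicit argv elements (rather than a shell string) avoids quoting issues when values contain spaces or URLs.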
90
src/swarm/recovery.py
Normal file
@@ -0,0 +1,90 @@
"""Swarm startup recovery — reconcile SQLite state after a restart.

When the server stops unexpectedly, tasks may be left in BIDDING, ASSIGNED,
or RUNNING states, and agents may still appear as 'idle' or 'busy' in the
registry even though no live process backs them.

``reconcile_on_startup()`` is called once during coordinator initialisation.
It performs two lightweight SQLite operations:

1. **Orphaned tasks** — any task in BIDDING, ASSIGNED, or RUNNING is moved
   to FAILED with a ``result`` explaining the reason. PENDING tasks are left
   alone (they haven't been touched yet and can be re-auctioned).

2. **Stale agents** — every agent record that is not already 'offline' is
   marked 'offline'. Agents re-register themselves when they re-spawn; the
   coordinator singleton stays the source of truth for which nodes are live.

The function returns a summary dict useful for logging and tests.
"""

import logging
from datetime import datetime, timezone

from swarm import registry
from swarm.tasks import TaskStatus, list_tasks, update_task

logger = logging.getLogger(__name__)

#: Task statuses that indicate in-flight work that can't resume after restart.
_ORPHAN_STATUSES = {TaskStatus.BIDDING, TaskStatus.ASSIGNED, TaskStatus.RUNNING}


def reconcile_on_startup() -> dict:
    """Reconcile swarm SQLite state after a server restart.

    Returns a dict with keys:
        tasks_failed     - number of orphaned tasks moved to FAILED
        agents_offlined  - number of stale agent records marked offline
    """
    tasks_failed = _rescue_orphaned_tasks()
    agents_offlined = _offline_stale_agents()

    summary = {"tasks_failed": tasks_failed, "agents_offlined": agents_offlined}

    if tasks_failed or agents_offlined:
        logger.info(
            "Swarm recovery: %d task(s) failed, %d agent(s) offlined",
            tasks_failed,
            agents_offlined,
        )
    else:
        logger.debug("Swarm recovery: nothing to reconcile")

    return summary


# ── Internal helpers ──────────────────────────────────────────────────────────


def _rescue_orphaned_tasks() -> int:
    """Move BIDDING / ASSIGNED / RUNNING tasks to FAILED.

    Returns the count of tasks updated.
    """
    now = datetime.now(timezone.utc).isoformat()
    count = 0
    for task in list_tasks():
        if task.status in _ORPHAN_STATUSES:
            update_task(
                task.id,
                status=TaskStatus.FAILED,
                result="Server restarted — task did not complete.",
                completed_at=now,
            )
            count += 1
    return count


def _offline_stale_agents() -> int:
    """Mark every non-offline agent as 'offline'.

    Returns the count of agent records updated.
    """
    agents = registry.list_agents()
    count = 0
    for agent in agents:
        if agent.status != "offline":
            registry.update_status(agent.id, "offline")
            count += 1
    return count
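The recovery pass above is two sweeps over persisted records: fail in-flight tasks, offline non-offline agents. The core logic can be sketched over plain in-memory dicts (hypothetical record shapes with string statuses, standing in for the real SQLite-backed models):

```python
# Stand-in for the statuses the module treats as unresumable after a restart.
ORPHAN_STATUSES = {"bidding", "assigned", "running"}

def reconcile(tasks, agents):
    """Fail in-flight tasks, offline stale agents, and return a summary.

    PENDING/COMPLETED tasks and already-offline agents are untouched,
    matching the behaviour described in the module docstring.
    """
    tasks_failed = 0
    for task in tasks:
        if task["status"] in ORPHAN_STATUSES:
            task["status"] = "failed"
            task["result"] = "Server restarted — task did not complete."
            tasks_failed += 1

    agents_offlined = 0
    for agent in agents:
        if agent["status"] != "offline":
            agent["status"] = "offline"
            agents_offlined += 1

    return {"tasks_failed": tasks_failed, "agents_offlined": agents_offlined}

tasks = [{"status": "pending"}, {"status": "running"}, {"status": "completed"}]
agents = [{"status": "idle"}, {"status": "offline"}]
print(reconcile(tasks, agents))  # {'tasks_failed': 1, 'agents_offlined': 1}
```

Marking agents offline unconditionally is safe here because live agents re-register on spawn; the reconcile pass never has to guess which processes survived.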
139
src/timmy/docker_agent.py
Normal file
@@ -0,0 +1,139 @@
"""Timmy — standalone Docker container entry point.

Runs Timmy as an independent swarm participant:

1. Registers "timmy" in the SQLite registry with capabilities
2. Sends heartbeats every 30 s so the dashboard can track liveness
3. Polls the coordinator for tasks assigned to "timmy"
4. Executes them through the Agno/Ollama backend
5. Marks each task COMPLETED (or FAILED) via the internal HTTP API

Usage (Docker)::

    COORDINATOR_URL=http://dashboard:8000 \
    OLLAMA_URL=http://host.docker.internal:11434 \
    python -m timmy.docker_agent

Environment variables
---------------------
COORDINATOR_URL   Where to reach the dashboard (required)
OLLAMA_URL        Ollama base URL (default: http://localhost:11434)
TIMMY_AGENT_ID    Override the registry ID (default: "timmy")
"""

import asyncio
import logging
import os
import signal

import httpx

from swarm import registry

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)-8s %(name)s — %(message)s",
    datefmt="%H:%M:%S",
)
logger = logging.getLogger(__name__)

AGENT_ID = os.environ.get("TIMMY_AGENT_ID", "timmy")
COORDINATOR = os.environ.get("COORDINATOR_URL", "").rstrip("/")
POLL_INTERVAL = 5  # seconds between task polls
HEARTBEAT_INTERVAL = 30


async def _run_task(task_id: str, description: str, client: httpx.AsyncClient) -> None:
    """Execute a task using Timmy's AI backend and report the result."""
    logger.info("Timmy executing task %s: %s", task_id, description[:60])
    result = None
    try:
        from timmy.agent import create_timmy
        agent = create_timmy()
        run = agent.run(description, stream=False)
        result = run.content if hasattr(run, "content") else str(run)
        logger.info("Task %s completed", task_id)
    except Exception as exc:
        result = f"Timmy error: {exc}"
        logger.warning("Task %s failed: %s", task_id, exc)

    # Report back to coordinator via HTTP
    try:
        await client.post(
            f"{COORDINATOR}/swarm/tasks/{task_id}/complete",
            data={"result": result or "(no output)"},
        )
    except Exception as exc:
        logger.error("Could not report task %s result: %s", task_id, exc)


async def _heartbeat_loop(stop: asyncio.Event) -> None:
    while not stop.is_set():
        try:
            registry.heartbeat(AGENT_ID)
        except Exception as exc:
            logger.warning("Heartbeat error: %s", exc)
        try:
            await asyncio.wait_for(stop.wait(), timeout=HEARTBEAT_INTERVAL)
        except asyncio.TimeoutError:
            pass


async def _task_loop(stop: asyncio.Event) -> None:
    seen: set[str] = set()
    async with httpx.AsyncClient(timeout=10.0) as client:
        while not stop.is_set():
            try:
                resp = await client.get(f"{COORDINATOR}/swarm/tasks?status=assigned")
                if resp.status_code == 200:
                    for task in resp.json().get("tasks", []):
                        if task.get("assigned_agent") != AGENT_ID:
                            continue
                        task_id = task["id"]
                        if task_id in seen:
                            continue
                        seen.add(task_id)
                        asyncio.create_task(
                            _run_task(task_id, task["description"], client)
                        )
            except Exception as exc:
                logger.warning("Task poll error: %s", exc)

            try:
                await asyncio.wait_for(stop.wait(), timeout=POLL_INTERVAL)
            except asyncio.TimeoutError:
                pass


async def main() -> None:
    if not COORDINATOR:
        logger.error("COORDINATOR_URL is not set — exiting")
        return

    # Register Timmy in the shared SQLite registry
    registry.register(
        name="Timmy",
        capabilities="chat,reasoning,research,planning",
        agent_id=AGENT_ID,
    )
    logger.info("Timmy registered (id=%s) — coordinator: %s", AGENT_ID, COORDINATOR)

    stop = asyncio.Event()

    def _handle_signal(*_):
        logger.info("Timmy received shutdown signal")
        stop.set()

    for sig in (signal.SIGTERM, signal.SIGINT):
        signal.signal(sig, _handle_signal)

    await asyncio.gather(
        _heartbeat_loop(stop),
        _task_loop(stop),
    )

    registry.update_status(AGENT_ID, "offline")
    logger.info("Timmy shut down")


if __name__ == "__main__":
    asyncio.run(main())
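Both the heartbeat loop and the task loop above use the same interruptible-sleep idiom: `asyncio.wait_for(stop.wait(), timeout=...)` sleeps for the interval but wakes immediately when the stop event fires, so a SIGTERM is honoured without waiting out a full poll cycle. A self-contained sketch of just that pattern (illustrative names, not the module's API):

```python
import asyncio

async def interruptible_sleep(stop: asyncio.Event, seconds: float) -> bool:
    """Sleep up to ``seconds``; return True if woken early by ``stop``.

    wait_for raises TimeoutError when the interval elapses without the
    event being set — the normal case between polls.
    """
    try:
        await asyncio.wait_for(stop.wait(), timeout=seconds)
        return True   # stop was set — shut down now
    except asyncio.TimeoutError:
        return False  # interval elapsed — poll again

async def demo():
    stop = asyncio.Event()
    expired = await interruptible_sleep(stop, 0.01)   # no stop → times out
    stop.set()
    interrupted = await interruptible_sleep(stop, 5)  # returns immediately
    return expired, interrupted

print(asyncio.run(demo()))  # (False, True)
```

Compared with a bare `asyncio.sleep(POLL_INTERVAL)`, this bounds shutdown latency by the time to finish the current poll, not by the poll interval.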
139
static/bg.svg
Normal file
@@ -0,0 +1,139 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 1440 900" preserveAspectRatio="xMidYMid slice">
  <defs>

    <!-- Nebula clouds -->
    <radialGradient id="neb1" cx="22%" cy="32%" r="48%">
      <stop offset="0%" stop-color="#7c3aed" stop-opacity="0.55"/>
      <stop offset="55%" stop-color="#4c1d95" stop-opacity="0.18"/>
      <stop offset="100%" stop-color="#080412" stop-opacity="0"/>
    </radialGradient>

    <radialGradient id="neb2" cx="78%" cy="68%" r="42%">
      <stop offset="0%" stop-color="#f97316" stop-opacity="0.42"/>
      <stop offset="50%" stop-color="#c2410c" stop-opacity="0.14"/>
      <stop offset="100%" stop-color="#080412" stop-opacity="0"/>
    </radialGradient>

    <radialGradient id="neb3" cx="66%" cy="22%" r="36%">
      <stop offset="0%" stop-color="#a855f7" stop-opacity="0.38"/>
      <stop offset="100%" stop-color="#080412" stop-opacity="0"/>
    </radialGradient>

    <radialGradient id="neb4" cx="14%" cy="78%" r="32%">
      <stop offset="0%" stop-color="#ea580c" stop-opacity="0.32"/>
      <stop offset="100%" stop-color="#080412" stop-opacity="0"/>
    </radialGradient>

    <radialGradient id="neb5" cx="48%" cy="55%" r="28%">
      <stop offset="0%" stop-color="#6d28d9" stop-opacity="0.22"/>
      <stop offset="100%" stop-color="#080412" stop-opacity="0"/>
    </radialGradient>

    <!-- Bright core hotspots -->
    <radialGradient id="core1" cx="24%" cy="34%" r="12%">
      <stop offset="0%" stop-color="#ddd6fe" stop-opacity="0.18"/>
      <stop offset="100%" stop-color="#7c3aed" stop-opacity="0"/>
    </radialGradient>

    <radialGradient id="core2" cx="76%" cy="66%" r="10%">
      <stop offset="0%" stop-color="#fed7aa" stop-opacity="0.22"/>
      <stop offset="100%" stop-color="#f97316" stop-opacity="0"/>
    </radialGradient>

    <!-- Star glow filter -->
    <filter id="sg" x="-200%" y="-200%" width="500%" height="500%">
      <feGaussianBlur stdDeviation="1.2" result="b"/>
      <feMerge><feMergeNode in="b"/><feMergeNode in="SourceGraphic"/></feMerge>
    </filter>
    <filter id="sg2" x="-300%" y="-300%" width="700%" height="700%">
      <feGaussianBlur stdDeviation="2.5" result="b"/>
      <feMerge><feMergeNode in="b"/><feMergeNode in="SourceGraphic"/></feMerge>
    </filter>

  </defs>

  <!-- Base -->
  <rect width="1440" height="900" fill="#080412"/>

  <!-- Nebula layers -->
  <rect width="1440" height="900" fill="url(#neb1)"/>
  <rect width="1440" height="900" fill="url(#neb2)"/>
  <rect width="1440" height="900" fill="url(#neb3)"/>
  <rect width="1440" height="900" fill="url(#neb4)"/>
  <rect width="1440" height="900" fill="url(#neb5)"/>
  <rect width="1440" height="900" fill="url(#core1)"/>
  <rect width="1440" height="900" fill="url(#core2)"/>

  <!-- Mystical halo rings around nebula cores -->
  <circle cx="317" cy="288" r="130" fill="none" stroke="#7c3aed" stroke-width="0.6" opacity="0.18"/>
  <circle cx="317" cy="288" r="220" fill="none" stroke="#a855f7" stroke-width="0.35" opacity="0.1"/>
  <circle cx="317" cy="288" r="340" fill="none" stroke="#7c3aed" stroke-width="0.2" opacity="0.06"/>
  <circle cx="1123" cy="612" r="100" fill="none" stroke="#f97316" stroke-width="0.6" opacity="0.15"/>
  <circle cx="1123" cy="612" r="180" fill="none" stroke="#ea580c" stroke-width="0.3" opacity="0.08"/>
  <circle cx="950" cy="198" r="80" fill="none" stroke="#c084fc" stroke-width="0.5" opacity="0.14"/>
  <circle cx="200" cy="702" r="70" fill="none" stroke="#fb923c" stroke-width="0.4" opacity="0.12"/>

  <!-- Stars — field (small) -->
  <circle cx="72" cy="43" r="0.8" fill="white" opacity="0.88"/>
  <circle cx="198" cy="127" r="1.1" fill="white" opacity="0.70"/>
  <circle cx="334" cy="56" r="0.7" fill="white" opacity="0.82"/>
  <circle cx="456" cy="178" r="1.0" fill="white" opacity="0.60"/>
  <circle cx="612" cy="89" r="1.4" fill="white" opacity="0.78" filter="url(#sg)"/>
|
||||||
|
<circle cx="745" cy="145" r="0.9" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="867" cy="38" r="1.1" fill="white" opacity="0.90"/>
|
||||||
|
<circle cx="1023" cy="115" r="0.8" fill="white" opacity="0.63"/>
|
||||||
|
<circle cx="1156" cy="72" r="1.2" fill="white" opacity="0.80"/>
|
||||||
|
<circle cx="1289" cy="158" r="0.7" fill="white" opacity="0.70"/>
|
||||||
|
<circle cx="1398" cy="45" r="1.0" fill="white" opacity="0.84"/>
|
||||||
|
<circle cx="134" cy="234" r="0.9" fill="white" opacity="0.60"/>
|
||||||
|
<circle cx="267" cy="312" r="1.3" fill="#e9d5ff" opacity="0.68" filter="url(#sg)"/>
|
||||||
|
<circle cx="389" cy="267" r="0.8" fill="white" opacity="0.78"/>
|
||||||
|
<circle cx="523" cy="345" r="1.0" fill="white" opacity="0.64"/>
|
||||||
|
<circle cx="678" cy="223" r="0.7" fill="white" opacity="0.74"/>
|
||||||
|
<circle cx="812" cy="378" r="1.1" fill="#fed7aa" opacity="0.58"/>
|
||||||
|
<circle cx="934" cy="256" r="0.9" fill="white" opacity="0.84"/>
|
||||||
|
<circle cx="1089" cy="334" r="1.0" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="1234" cy="245" r="0.8" fill="white" opacity="0.78"/>
|
||||||
|
<circle cx="1367" cy="312" r="1.1" fill="#e9d5ff" opacity="0.63"/>
|
||||||
|
<circle cx="56" cy="467" r="1.1" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="189" cy="523" r="0.8" fill="white" opacity="0.84"/>
|
||||||
|
<circle cx="323" cy="445" r="1.0" fill="white" opacity="0.58"/>
|
||||||
|
<circle cx="478" cy="589" r="0.7" fill="white" opacity="0.73"/>
|
||||||
|
<circle cx="601" cy="478" r="1.2" fill="#e9d5ff" opacity="0.78" filter="url(#sg)"/>
|
||||||
|
<circle cx="756" cy="534" r="0.9" fill="white" opacity="0.63"/>
|
||||||
|
<circle cx="890" cy="467" r="1.0" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="1023" cy="578" r="0.8" fill="white" opacity="0.78"/>
|
||||||
|
<circle cx="1167" cy="489" r="1.3" fill="#fed7aa" opacity="0.58"/>
|
||||||
|
<circle cx="1312" cy="534" r="0.9" fill="white" opacity="0.73"/>
|
||||||
|
<circle cx="1423" cy="467" r="1.0" fill="white" opacity="0.84"/>
|
||||||
|
<circle cx="112" cy="645" r="0.8" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="245" cy="712" r="1.1" fill="white" opacity="0.63"/>
|
||||||
|
<circle cx="378" cy="667" r="0.9" fill="white" opacity="0.78"/>
|
||||||
|
<circle cx="534" cy="734" r="1.0" fill="white" opacity="0.73"/>
|
||||||
|
<circle cx="667" cy="656" r="0.7" fill="#e9d5ff" opacity="0.68" filter="url(#sg)"/>
|
||||||
|
<circle cx="823" cy="723" r="1.0" fill="white" opacity="0.58"/>
|
||||||
|
<circle cx="956" cy="667" r="0.8" fill="white" opacity="0.83"/>
|
||||||
|
<circle cx="1112" cy="734" r="1.2" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="1245" cy="656" r="0.9" fill="white" opacity="0.73"/>
|
||||||
|
<circle cx="1389" cy="712" r="1.0" fill="#fed7aa" opacity="0.63"/>
|
||||||
|
<circle cx="89" cy="812" r="1.0" fill="white" opacity="0.63"/>
|
||||||
|
<circle cx="234" cy="856" r="0.8" fill="white" opacity="0.78"/>
|
||||||
|
<circle cx="389" cy="823" r="1.1" fill="#e9d5ff" opacity="0.68"/>
|
||||||
|
<circle cx="534" cy="878" r="0.7" fill="white" opacity="0.73"/>
|
||||||
|
<circle cx="667" cy="845" r="0.9" fill="white" opacity="0.58"/>
|
||||||
|
<circle cx="812" cy="867" r="1.0" fill="white" opacity="0.84"/>
|
||||||
|
<circle cx="956" cy="823" r="0.8" fill="white" opacity="0.68"/>
|
||||||
|
<circle cx="1112" cy="878" r="0.9" fill="white" opacity="0.63"/>
|
||||||
|
<circle cx="1256" cy="834" r="1.2" fill="#fed7aa" opacity="0.73"/>
|
||||||
|
<circle cx="1389" cy="867" r="0.9" fill="white" opacity="0.78"/>
|
||||||
|
|
||||||
|
<!-- Hero stars (brighter, larger glow) -->
|
||||||
|
<circle cx="420" cy="165" r="2.2" fill="white" opacity="0.92" filter="url(#sg2)"/>
|
||||||
|
<circle cx="1100" cy="290" r="1.9" fill="#e9d5ff" opacity="0.90" filter="url(#sg2)"/>
|
||||||
|
<circle cx="720" cy="440" r="2.4" fill="white" opacity="0.88" filter="url(#sg2)"/>
|
||||||
|
<circle cx="195" cy="615" r="2.0" fill="#fed7aa" opacity="0.82" filter="url(#sg2)"/>
|
||||||
|
<circle cx="1270" cy="710" r="2.1" fill="#e9d5ff" opacity="0.86" filter="url(#sg2)"/>
|
||||||
|
<circle cx="950" cy="198" r="2.3" fill="#f0abfc" opacity="0.84" filter="url(#sg2)"/>
|
||||||
|
<circle cx="580" cy="760" r="1.8" fill="#fed7aa" opacity="0.80" filter="url(#sg2)"/>
|
||||||
|
|
||||||
|
</svg>
|
||||||
|
After Width: | Height: | Size: 7.6 KiB |
@@ -1,20 +1,22 @@
-/* ── Mission Control palette ──────────────────────── */
+/* ── Arcane palette ────────────────────────────────── */
 :root {
-  --bg-deep: #060d14;
+  --bg-deep: #080412;
-  --bg-panel: #0c1824;
+  --bg-panel: #110820;
-  --bg-card: #0f2030;
+  --bg-card: #180d2e;
-  --border: #1a3a55;
+  --border: #3b1a5c;
-  --border-glow: #1e4d72;
+  --border-glow: #7c3aed;
-  --text: #b8d0e8;
+  --text: #c8b0e0;
-  --text-dim: #4a7a9a;
+  --text-dim: #6b4a8a;
-  --text-bright: #ddeeff;
+  --text-bright: #ede0ff;
   --green: #00e87a;
   --green-dim: #00704a;
   --amber: #ffb800;
   --amber-dim: #7a5800;
   --red: #ff4455;
   --red-dim: #7a1a22;
-  --blue: #00aaff;
+  --blue: #ff7a2a; /* orange replaces blue as the primary accent */
+  --orange: #ff7a2a;
+  --purple: #a855f7;
   --font: 'JetBrains Mono', 'Courier New', monospace;
   --header-h: 52px;
 
@@ -36,7 +38,10 @@
 
 body {
   font-family: var(--font);
-  background: var(--bg-deep);
+  background-color: var(--bg-deep);
+  background-image: url('/static/bg.svg');
+  background-size: cover;
+  background-position: center top;
   color: var(--text);
   font-size: 13px;
   min-height: 100dvh;
@@ -51,7 +56,9 @@ body {
   align-items: center;
   padding: 12px 24px;
   padding-top: max(12px, env(safe-area-inset-top));
-  background: var(--bg-panel);
+  background: rgba(17, 8, 32, 0.86);
+  backdrop-filter: blur(14px);
+  -webkit-backdrop-filter: blur(14px);
   border-bottom: 1px solid var(--border);
   position: sticky;
   top: 0;
@@ -64,6 +71,7 @@ body {
   font-weight: 700;
   color: var(--text-bright);
   letter-spacing: 0.15em;
+  text-shadow: 0 0 18px rgba(168, 85, 247, 0.55), 0 0 40px rgba(168, 85, 247, 0.25);
 }
 .mc-subtitle {
   font-size: 11px;
@@ -73,8 +81,9 @@ body {
 }
 .mc-time {
   font-size: 14px;
-  color: var(--blue);
+  color: var(--orange);
   letter-spacing: 0.1em;
+  text-shadow: 0 0 10px rgba(249, 115, 22, 0.4);
 }
 .mc-test-link {
   font-size: 9px;
@@ -88,13 +97,13 @@ body {
   transition: border-color 0.15s, color 0.15s;
   touch-action: manipulation;
 }
-.mc-test-link:hover { border-color: var(--blue); color: var(--blue); }
+.mc-test-link:hover { border-color: var(--purple); color: var(--purple); }
 
 /* ── Main layout ─────────────────────────────────── */
 .mc-main {
   padding: 16px;
   height: calc(100dvh - var(--header-h));
-  overflow: clip; /* clip = visual clipping only, no scroll container; lets trackpad events reach scrollable children */
+  overflow: clip;
 }
 .mc-content {
   height: 100%;
@@ -106,7 +115,7 @@ body {
 /* ── Sidebar ─────────────────────────────────────── */
 .mc-sidebar {
   overflow-y: auto;
-  min-height: 0; /* allow flex item to shrink so overflow-y: auto actually triggers */
+  min-height: 0;
 }
 
 /* ── Chat column ─────────────────────────────────── */
@@ -115,17 +124,19 @@ body {
 }
 .mc-chat-panel > .card {
   height: 100%;
-  overflow: clip; /* visual clip only, preserves scroll events to .chat-log child */
+  overflow: clip;
 }
 
 /* ── Panel / Card overrides ──────────────────────── */
 .mc-panel {
-  background: var(--bg-panel);
+  background: rgba(17, 8, 32, 0.78);
+  backdrop-filter: blur(8px);
+  -webkit-backdrop-filter: blur(8px);
   border: 1px solid var(--border);
   border-radius: 4px;
 }
 .mc-panel-header {
-  background: var(--bg-card);
+  background: rgba(24, 10, 45, 0.90);
   border-bottom: 1px solid var(--border);
   font-size: 10px;
   font-weight: 700;
@@ -140,7 +151,7 @@ body {
   border: 1px solid var(--border);
   border-radius: 3px;
   padding: 12px;
-  background: var(--bg-card);
+  background: rgba(24, 10, 45, 0.82);
 }
 .status-dot {
   width: 8px;
@@ -175,7 +186,7 @@ body {
 .health-row:last-child { border-bottom: none; }
 .health-label { color: var(--text-dim); letter-spacing: 0.08em; }
 
-/* Status badges (use Bootstrap .badge base + mc-badge-* modifier) */
+/* Status badges */
 .mc-badge-up { background: var(--green-dim) !important; color: var(--green) !important; font-size: 10px; letter-spacing: 0.12em; border-radius: 2px; }
 .mc-badge-down { background: var(--red-dim) !important; color: var(--red) !important; font-size: 10px; letter-spacing: 0.12em; border-radius: 2px; }
 .mc-badge-ready { background: var(--amber-dim) !important; color: var(--amber) !important; font-size: 10px; letter-spacing: 0.12em; border-radius: 2px; }
@@ -193,12 +204,12 @@ body {
   margin-bottom: 4px;
   letter-spacing: 0.12em;
 }
-.chat-message.user .msg-meta { color: var(--blue); }
+.chat-message.user .msg-meta { color: var(--orange); }
-.chat-message.agent .msg-meta { color: var(--green); }
+.chat-message.agent .msg-meta { color: var(--purple); }
 .chat-message.error-msg .msg-meta { color: var(--red); }
 
 .msg-body {
-  background: var(--bg-card);
+  background: rgba(24, 10, 45, 0.80);
   border: 1px solid var(--border);
   border-radius: 3px;
   padding: 10px 12px;
@@ -207,14 +218,14 @@ body {
   word-break: break-word;
 }
 .chat-message.user .msg-body { border-color: var(--border-glow); }
-.chat-message.agent .msg-body { border-left: 3px solid var(--green); }
+.chat-message.agent .msg-body { border-left: 3px solid var(--purple); }
 .chat-message.error-msg .msg-body { border-left: 3px solid var(--red); color: var(--red); }
 
 /* ── Chat input footer ───────────────────────────── */
 .mc-chat-footer {
   padding: 12px 14px;
   padding-bottom: max(12px, env(safe-area-inset-bottom));
-  background: var(--bg-card);
+  background: rgba(24, 10, 45, 0.90);
   border-top: 1px solid var(--border);
   flex-shrink: 0;
 }
@@ -237,7 +248,7 @@ body {
 
 /* Bootstrap form-control overrides */
 .mc-input {
-  background: var(--bg-deep) !important;
+  background: rgba(8, 4, 18, 0.75) !important;
   border: 1px solid var(--border) !important;
   border-radius: 3px !important;
   color: var(--text-bright) !important;
@@ -246,7 +257,7 @@ body {
 }
 .mc-input:focus {
   border-color: var(--border-glow) !important;
-  box-shadow: 0 0 0 1px var(--border-glow) !important;
+  box-shadow: 0 0 0 1px var(--border-glow), 0 0 10px rgba(124, 58, 237, 0.25) !important;
 }
 .mc-input::placeholder { color: var(--text-dim) !important; }
 
@@ -260,11 +271,15 @@ body {
   font-weight: 700;
   padding: 8px 18px;
   letter-spacing: 0.12em;
-  transition: background 0.15s, color 0.15s;
+  transition: background 0.15s, color 0.15s, box-shadow 0.15s;
   touch-action: manipulation;
   white-space: nowrap;
 }
-.mc-btn-send:hover { background: var(--blue); color: var(--bg-deep); }
+.mc-btn-send:hover {
+  background: var(--orange);
+  color: #080412;
+  box-shadow: 0 0 14px rgba(249, 115, 22, 0.45);
+}
 
 /* ── HTMX Loading ────────────────────────────────── */
 .htmx-indicator { display: none; }
@@ -274,7 +289,7 @@ body {
 
 /* ── Scrollbar ───────────────────────────────────── */
 ::-webkit-scrollbar { width: 4px; }
-::-webkit-scrollbar-track { background: var(--bg-deep); }
+::-webkit-scrollbar-track { background: transparent; }
 ::-webkit-scrollbar-thumb { background: var(--border); border-radius: 2px; }
 ::-webkit-scrollbar-thumb:hover { background: var(--border-glow); }
@@ -34,6 +34,23 @@ def reset_message_log():
     message_log.clear()
 
 
+@pytest.fixture(autouse=True)
+def reset_coordinator_state():
+    """Clear the coordinator's in-memory state between tests.
+
+    The coordinator singleton is created at import time and persists across
+    the test session. Without this fixture, agents spawned in one test bleed
+    into the next through the auctions dict, comms listeners, and the
+    in-process node list.
+    """
+    yield
+    from swarm.coordinator import coordinator
+    coordinator.auctions._auctions.clear()
+    coordinator.comms._listeners.clear()
+    coordinator._in_process_nodes.clear()
+    coordinator.manager.stop_all()
+
+
 @pytest.fixture
 def client():
     from dashboard.app import app
@@ -15,7 +15,8 @@ def test_index_contains_title(client):
 
 def test_index_contains_chat_interface(client):
     response = client.get("/")
-    assert "TIMMY INTERFACE" in response.text
+    # Timmy panel loads dynamically via HTMX; verify the trigger attribute is present
+    assert "hx-get=\"/agents/timmy/panel\"" in response.text
 
 
 # ── Health ────────────────────────────────────────────────────────────────────
@@ -30,6 +30,11 @@ def _index_html(client) -> str:
     return client.get("/").text
 
 
+def _timmy_panel_html(client) -> str:
+    """Fetch the Timmy chat panel (loaded dynamically from index via HTMX)."""
+    return client.get("/agents/timmy/panel").text
+
+
 # ── M1xx — Viewport & meta tags ───────────────────────────────────────────────
 
 def test_M101_viewport_meta_present(client):
@@ -120,25 +125,25 @@ def test_M301_input_font_size_16px_in_mobile_query():
 
 def test_M302_input_autocapitalize_none(client):
     """autocapitalize=none prevents iOS from capitalising chat commands."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert 'autocapitalize="none"' in html
 
 
 def test_M303_input_autocorrect_off(client):
     """autocorrect=off prevents iOS from mangling technical / proper-noun input."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert 'autocorrect="off"' in html
 
 
 def test_M304_input_enterkeyhint_send(client):
     """enterkeyhint=send labels the iOS return key 'Send' for clearer UX."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert 'enterkeyhint="send"' in html
 
 
 def test_M305_input_spellcheck_false(client):
     """spellcheck=false prevents red squiggles on technical terms."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert 'spellcheck="false"' in html
 
 
@@ -146,19 +151,19 @@ def test_M305_input_spellcheck_false(client):
 
 def test_M401_form_hx_sync_drop(client):
     """hx-sync=this:drop discards duplicate submissions (fast double-tap)."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert 'hx-sync="this:drop"' in html
 
 
 def test_M402_form_hx_disabled_elt(client):
     """hx-disabled-elt disables the SEND button while a request is in-flight."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert "hx-disabled-elt" in html
 
 
 def test_M403_form_hx_indicator(client):
     """hx-indicator wires up the loading spinner to the in-flight state."""
-    html = _index_html(client)
+    html = _timmy_panel_html(client)
     assert "hx-indicator" in html
179 tests/test_swarm_recovery.py Normal file
@@ -0,0 +1,179 @@
"""Tests for swarm.recovery — startup reconciliation logic."""

import pytest


@pytest.fixture(autouse=True)
def tmp_swarm_db(tmp_path, monkeypatch):
    """Isolate SQLite writes to a temp directory."""
    db = tmp_path / "swarm.db"
    monkeypatch.setattr("swarm.tasks.DB_PATH", db)
    monkeypatch.setattr("swarm.registry.DB_PATH", db)
    monkeypatch.setattr("swarm.stats.DB_PATH", db)
    yield db


# ── reconcile_on_startup: return shape ───────────────────────────────────────

def test_reconcile_returns_summary_keys():
    from swarm.recovery import reconcile_on_startup
    result = reconcile_on_startup()
    assert "tasks_failed" in result
    assert "agents_offlined" in result


def test_reconcile_empty_db_returns_zeros():
    from swarm.recovery import reconcile_on_startup
    result = reconcile_on_startup()
    assert result["tasks_failed"] == 0
    assert result["agents_offlined"] == 0


# ── Orphaned task rescue ──────────────────────────────────────────────────────

def test_reconcile_fails_bidding_task():
    from swarm.tasks import create_task, get_task, update_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    task = create_task("Orphaned bidding task")
    update_task(task.id, status=TaskStatus.BIDDING)

    result = reconcile_on_startup()

    assert result["tasks_failed"] == 1
    rescued = get_task(task.id)
    assert rescued.status == TaskStatus.FAILED
    assert rescued.result is not None
    assert rescued.completed_at is not None


def test_reconcile_fails_running_task():
    from swarm.tasks import create_task, get_task, update_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    task = create_task("Orphaned running task")
    update_task(task.id, status=TaskStatus.RUNNING)

    result = reconcile_on_startup()
    assert result["tasks_failed"] == 1
    assert get_task(task.id).status == TaskStatus.FAILED


def test_reconcile_fails_assigned_task():
    from swarm.tasks import create_task, get_task, update_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    task = create_task("Orphaned assigned task")
    update_task(task.id, status=TaskStatus.ASSIGNED, assigned_agent="agent-x")

    result = reconcile_on_startup()
    assert result["tasks_failed"] == 1
    assert get_task(task.id).status == TaskStatus.FAILED


def test_reconcile_leaves_pending_task_untouched():
    from swarm.tasks import create_task, get_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    task = create_task("Pending task — should survive")
    # status is PENDING by default
    reconcile_on_startup()
    assert get_task(task.id).status == TaskStatus.PENDING


def test_reconcile_leaves_completed_task_untouched():
    from swarm.tasks import create_task, update_task, get_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    task = create_task("Completed task")
    update_task(task.id, status=TaskStatus.COMPLETED, result="done")

    reconcile_on_startup()
    assert get_task(task.id).status == TaskStatus.COMPLETED


def test_reconcile_counts_multiple_orphans():
    from swarm.tasks import create_task, update_task, TaskStatus
    from swarm.recovery import reconcile_on_startup

    for status in (TaskStatus.BIDDING, TaskStatus.RUNNING, TaskStatus.ASSIGNED):
        t = create_task(f"Orphan {status}")
        update_task(t.id, status=status)

    result = reconcile_on_startup()
    assert result["tasks_failed"] == 3


# ── Stale agent offlined ──────────────────────────────────────────────────────

def test_reconcile_offlines_idle_agent():
    from swarm import registry
    from swarm.recovery import reconcile_on_startup

    agent = registry.register("IdleAgent")
    assert agent.status == "idle"

    result = reconcile_on_startup()
    assert result["agents_offlined"] == 1
    assert registry.get_agent(agent.id).status == "offline"


def test_reconcile_offlines_busy_agent():
    from swarm import registry
    from swarm.recovery import reconcile_on_startup

    agent = registry.register("BusyAgent")
    registry.update_status(agent.id, "busy")

    result = reconcile_on_startup()
    assert result["agents_offlined"] == 1
    assert registry.get_agent(agent.id).status == "offline"


def test_reconcile_skips_already_offline_agent():
    from swarm import registry
    from swarm.recovery import reconcile_on_startup

    agent = registry.register("OfflineAgent")
    registry.update_status(agent.id, "offline")

    result = reconcile_on_startup()
    assert result["agents_offlined"] == 0


def test_reconcile_counts_multiple_stale_agents():
    from swarm import registry
    from swarm.recovery import reconcile_on_startup

    registry.register("AgentA")
    registry.register("AgentB")
    registry.register("AgentC")

    result = reconcile_on_startup()
    assert result["agents_offlined"] == 3


# ── Coordinator integration ───────────────────────────────────────────────────

def test_coordinator_runs_recovery_on_init():
    """Coordinator.__init__ calls reconcile; _recovery_summary must be present."""
    from swarm.coordinator import SwarmCoordinator
    coord = SwarmCoordinator()
    assert hasattr(coord, "_recovery_summary")
    assert "tasks_failed" in coord._recovery_summary
    assert "agents_offlined" in coord._recovery_summary
    coord.manager.stop_all()


def test_coordinator_recovery_cleans_stale_task():
    """End-to-end: task left in BIDDING is cleaned up by a fresh coordinator."""
    from swarm.tasks import create_task, get_task, update_task, TaskStatus
    from swarm.coordinator import SwarmCoordinator

    task = create_task("Stale bidding task")
    update_task(task.id, status=TaskStatus.BIDDING)

    coord = SwarmCoordinator()
    assert get_task(task.id).status == TaskStatus.FAILED
    assert coord._recovery_summary["tasks_failed"] >= 1
    coord.manager.stop_all()
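The recovery tests above pin down a small contract for `reconcile_on_startup`: any task found in an in-flight state at startup is marked failed with a result and completion timestamp, and any non-offline agent is marked offline. A minimal sketch of that contract follows — this is not the actual `swarm.recovery` implementation (which reads the SQLite task and agent tables); the in-memory dict stores and status strings here are stand-ins for illustration.

```python
from datetime import datetime, timezone

# In-flight statuses that cannot survive a coordinator restart.
ACTIVE = {"bidding", "running", "assigned"}

def reconcile_on_startup(tasks, agents):
    """Fail orphaned in-flight tasks and offline stale agents; return counts."""
    summary = {"tasks_failed": 0, "agents_offlined": 0}
    for task in tasks:
        if task["status"] in ACTIVE:
            task["status"] = "failed"
            # The tests require a non-None result and completed_at on rescue.
            task["result"] = "orphaned at startup: coordinator restarted"
            task["completed_at"] = datetime.now(timezone.utc)
            summary["tasks_failed"] += 1
    for agent in agents:
        if agent["status"] != "offline":
            agent["status"] = "offline"
            summary["agents_offlined"] += 1
    return summary
```

Pending and completed tasks pass through untouched, and already-offline agents are not counted, matching the "leaves untouched" and "skips already offline" tests.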