Compare commits

1 commit: `173ce54eed`
`.gitea/workflows/smoke.yml`:

```
@@ -14,7 +14,7 @@ jobs:
      - name: Parse check
        run: |
          find . -name '*.yml' -o -name '*.yaml' | grep -v .gitea | xargs -r python3 -c "import sys,yaml; [yaml.safe_load(open(f)) for f in sys.argv[1:]]"
          find . -name '*.json' -print0 | xargs -0 -r -n1 python3 -m json.tool > /dev/null
          find . -name '*.json' | xargs -r python3 -m json.tool > /dev/null
          find . -name '*.py' | xargs -r python3 -m py_compile
          find . -name '*.sh' | xargs -r bash -n
          echo "PASS: All files parse"
@@ -22,6 +22,3 @@ jobs:
        run: |
          if grep -rE 'sk-or-|sk-ant-|ghp_|AKIA' . --include='*.yml' --include='*.py' --include='*.sh' 2>/dev/null | grep -v '.gitea' | grep -v 'detect_secrets' | grep -v 'test_trajectory_sanitize'; then exit 1; fi
          echo "PASS: No secrets"
      - name: Backup pipeline regression test
        run: |
          python3 -m unittest discover -s tests -p 'test_backup_pipeline.py' -v
```
`GENOME.md` (new file, 741 lines)

@@ -0,0 +1,741 @@
# GENOME.md — timmy-home

Generated: 2026-04-15 04:23:24Z
Issue: #670
Branch: `fix/670`

## Project Overview

`timmy-home` is not a single product application. It is the Timmy Foundation workspace repo: an operational monorepo that mixes live guardrails, fleet scripts, local-world experiments, training pipelines, research artifacts, notes, and prototype agent runtimes.

The most important reality check is in `OPERATIONS.md`: the active production system is Hermes plus the `timmy-config` sidecar, while `timmy-home` functions as the workspace, proving ground, and artifact store.

### What the repo is good at

- local operational utilities and proof-oriented runbooks
- offline-first data transforms for training and archive analysis
- experimental Evennia / world-shell work
- prototype autonomous harnesses (`uniwizard/`, `uni-wizard/`)
- game-agent experiments (`morrowind/`, Tower simulations)
- telemetry, reports, and training-data staging

### What the repo is not

- not a clean single-purpose package
- not a consistently production-hardened deployable service
- not one coherent architecture; it is several architectures sharing one tree

### Repository shape

Metric snapshot from `/tmp/BURN-7-6`:

- 3,119 files total
- 234 source files (`.py`, `.sh`, `.js`)
- 29 test files
- 21 config files
- 397,683 text lines total
- 42,428 Python lines
- 2,403 shell lines
- 272,829 Markdown lines

The file count is dominated by data and documentation:

- `training-data/` → 2,013 files
- `wizards/` → 345 files
- `skills/` → 327 files

This matters: operational code is only a small fraction of the repo. Any analysis that treats the repo as “just a Python package” will be wrong.
## Runtime Reality

The current live-system contract is defined more accurately by `OPERATIONS.md` than by `README.md`.

- `README.md` documents secret scanning and a tiny subset of the repo.
- `OPERATIONS.md` states that the active system is Hermes + `timmy-config` sidecar.
- `CONTRIBUTING.md` defines the repo’s most important invariant: proof is required for merge.

That means `timmy-home` behaves as a workspace + runbook + experiment garden around a live sidecar/orchestrator that mostly lives elsewhere.

## Architecture

```mermaid
graph TD
    subgraph Inputs
        A[Gitea / Forge issues]
        B[Hermes sessions and local state]
        C[Twitter archive + media]
        D[OpenMW / Evennia runtime state]
        E[Local machine + VPS fleet]
    end

    subgraph Ops_Workspace
        F[scripts/\nops, proof, hygiene, reports]
        G[tests/ + .gitea/workflows/\nverification surface]
        H[OPERATIONS.md / CONTRIBUTING.md\nrunbook + merge gate]
    end

    subgraph World_and_Agent_Runtimes
        I[evennia/ + scripts/evennia/\nlocal world lane]
        J[timmy-local/\ncache + in-world commands]
        K[morrowind/\nOpenMW MCP + local brain]
        L[uniwizard/ + uni-wizard/\nrouting and harness experiments]
    end

    subgraph Analysis_and_Training
        M[scripts/twitter_archive/\narchive extraction + media analysis]
        N[scripts/know_thy_father/\nmeaning kernels + crossref]
        O[evennia_tools/ + metrics/\ntelemetry + sovereignty accounting]
        P[training-data/ reports/ briefings/\nartifacts]
    end

    A --> F
    B --> F
    B --> O
    C --> M
    C --> N
    D --> I
    D --> J
    D --> K
    E --> F
    E --> L

    F --> G
    H --> G
    I --> O
    J --> O
    K --> P
    L --> P
    M --> P
    N --> P
    O --> P
```
## Major Domains

### 1. Proof and operational guardrails

Key files:

- `README.md`
- `OPERATIONS.md`
- `CONTRIBUTING.md`
- `.gitea/workflows/smoke.yml`
- `scripts/detect_secrets.py`
- `scripts/trajectory_sanitize.py`

Purpose:

- define what counts as valid proof
- protect the repo from obvious secret leaks
- sanitize training/session artifacts before reuse
- provide a minimal smoke gate in CI

This is the most stable and best-tested area of the repo.

### 2. Evennia / world-shell lane

Key files:

- `scripts/evennia/bootstrap_local_evennia.py`
- `scripts/evennia/verify_local_evennia.py`
- `scripts/evennia/evennia_mcp_server.py`
- `evennia/timmy_world/...`
- `timmy-local/evennia/...`
- `evennia_tools/layout.py`
- `evennia_tools/telemetry.py`
- `evennia_tools/training.py`

Purpose:

- bootstrap and verify a local Evennia runtime
- expose Evennia via MCP/telnet tooling
- model Timmy as a room-based world shell with tool-bridging commands
- collect telemetry and training traces from world interactions

Reality:

- there is a mostly stock Evennia scaffold under `evennia/timmy_world/`
- there is a richer but incomplete prototype under `timmy-local/evennia/`
- the two are conceptually aligned but not fully unified

### 3. Tower / narrative simulation lane

Key files:

- `scripts/tower_game.py`
- `timmy-world/game.py`
- `evennia/timmy_world/game.py`
- `evennia/timmy_world/world/game.py`

Purpose:

- simulate Timmy’s narrative “Tower” world
- track rooms, trust, phases, energy, and dialogue
- serve as an emergence / world-state experimentation lane

Reality:

- `scripts/tower_game.py` is the cleanest and best-tested version
- three larger Tower variants exist in parallel, creating drift and maintenance risk

### 4. Twitter archive / Know Thy Father lane

Key files:

- `scripts/twitter_archive/extract_archive.py`
- `scripts/twitter_archive/extract_media_manifest.py`
- `scripts/twitter_archive/analyze_media.py`
- `scripts/twitter_archive/build_dpo_pairs.py`
- `scripts/know_thy_father/index_media.py`
- `scripts/know_thy_father/synthesize_kernels.py`
- `scripts/know_thy_father/crossref_audit.py`
- `twitter-archive/know-thy-father/tracker.py`

Purpose:

- normalize raw archive exports
- build media manifests and hashtag metrics
- derive analysis artifacts and “meaning kernels”
- compare extracted themes against Timmy’s declared principles
- generate DPO / training-ready artifacts

This is one of the strongest parts of the repo in terms of tests, but it also has multiple overlapping generations of pipeline code.
### 5. Harness / routing experiments

Key files:

- `uniwizard/task_classifier.py`
- `uniwizard/quality_scorer.py`
- `uniwizard/self_grader.py`
- `uni-wizard/harness.py`
- `uni-wizard/daemons/task_router.py`
- `uni-wizard/daemons/health_daemon.py`
- `uni-wizard/v2/...`
- `uni-wizard/v3/...`
- `uni-wizard/v4/...`

Purpose:

- classify prompts and route them to candidate backends
- record backend quality over time
- grade Hermes sessions and generate self-improvement signals
- prototype unified local-first agent harnesses
- explore multi-house routing, telemetry, and adaptation engines

Reality:

- `uniwizard/` is cleaner and more testable
- `uni-wizard/` contains several architecture generations in one tree
- versioned tests expose namespace/import collisions that currently break full-repo pytest

### 6. Game-agent / local reflex lane

Key files:

- `morrowind/mcp_server.py`
- `morrowind/local_brain.py`
- `morrowind/pilot.py`
- `morrowind/play.py`
- `morrowind/agent.py`

Purpose:

- drive OpenMW locally through perception + keystroke automation
- expose game actions through MCP tools
- compare deterministic and model-driven control loops
- save trajectories for training or DPO later

This lane is high-side-effect and essentially untested.

### 7. Telemetry, bridge, and sovereignty accounting

Key files:

- `metrics/model_tracker.py`
- `infrastructure/timmy-bridge/client/timmy_client.py`
- `infrastructure/timmy-bridge/monitor/timmy_monitor.py`
- `infrastructure/timmy-bridge/reports/generate_report.py`

Purpose:

- track local-vs-cloud usage and rough cost avoidance
- relay state/artifacts through a bridge architecture
- record telemetry and generate retrospective reports

Reality:

- structurally useful
- but parts of the bridge and crypto story are explicitly placeholder-grade
## Entry Points

There is no single canonical `main.py`. The repo has many entry points, grouped by lane.

### Repo-level operational entry points

- `scripts/detect_secrets.py`
  - secret scanning CLI and pre-commit helper
- `scripts/trajectory_sanitize.py`
  - artifact sanitization CLI for JSON / JSONL session data
- `scripts/fleet_milestones.py`
  - milestone-trigger state machine and logger
- `scripts/failover_monitor.py`
  - simple host reachability recorder
- `.gitea/workflows/smoke.yml`
  - CI smoke entry point for parse checks and grep-based secret scan

### Evennia / world entry points

- `scripts/evennia/bootstrap_local_evennia.py`
  - local world bootstrap and provisioning
- `scripts/evennia/verify_local_evennia.py`
  - runtime verification via HTTP, shell, telnet
- `scripts/evennia/evennia_mcp_server.py`
  - MCP bridge into the Evennia runtime
- `timmy-local/evennia/world/build.py`
  - creates rooms, exits, tool objects, and Timmy character state

### Archive / research entry points

- `scripts/twitter_archive/extract_archive.py`
- `scripts/twitter_archive/extract_media_manifest.py`
- `scripts/twitter_archive/analyze_media.py`
- `scripts/twitter_archive/build_dpo_pairs.py`
- `scripts/know_thy_father/index_media.py`
- `scripts/know_thy_father/synthesize_kernels.py`
- `scripts/know_thy_father/crossref_audit.py`

### Harness / agent entry points

- `uni-wizard/harness.py`
- `uni-wizard/daemons/task_router.py`
- `uni-wizard/daemons/health_daemon.py`
- `uniwizard/self_grader.py`
- `uniwizard/quality_scorer.py`
- `uniwizard/task_classifier.py`

### Game-agent entry points

- `morrowind/mcp_server.py`
- `morrowind/local_brain.py`
- `morrowind/pilot.py`
- `morrowind/play.py`
## Data Flows

### 1. Proof / hygiene flow

1. Developer changes repo files.
2. `scripts/detect_secrets.py` scans content for obvious leaks.
3. `CONTRIBUTING.md` requires exact proof artifacts, not hand-wavy claims.
4. `.gitea/workflows/smoke.yml` attempts parse checks in CI.
5. Sanitized session artifacts can be passed through `scripts/trajectory_sanitize.py` before reuse.
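The CI grep patterns show the kinds of token prefixes this flow cares about. A minimal sketch of such a scanner, using only the prefixes visible in the smoke workflow (the real `scripts/detect_secrets.py` CLI and internals are assumptions here):

```python
import re
import sys
from pathlib import Path

# Token prefixes scanned for in CI: OpenRouter, Anthropic, GitHub, AWS.
SECRET_RE = re.compile(r"sk-or-|sk-ant-|ghp_|AKIA")

def scan(paths):
    """Return (path, line_no, line) tuples for lines that look like leaks."""
    hits = []
    for path in paths:
        text = Path(path).read_text(errors="replace")
        for i, line in enumerate(text.splitlines(), 1):
            if SECRET_RE.search(line):
                hits.append((str(path), i, line.strip()))
    return hits

if __name__ == "__main__":
    for path, line_no, line in scan(sys.argv[1:]):
        print(f"{path}:{line_no}: {line}")
```

A pre-commit hook built on this would exit non-zero whenever `scan()` returns any hits.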
### 2. Archive-to-training flow

1. Raw archive export enters `scripts/twitter_archive/extract_archive.py`.
2. Media-bearing rows are indexed by `extract_media_manifest.py`.
3. `analyze_media.py` attaches batch analysis output and kernels.
4. `scripts/know_thy_father/*.py` transforms archive slices into meaning kernels and conscience cross-references.
5. Results land in `training-data/`, reports, and summary artifacts.

### 3. Evennia world flow

1. `bootstrap_local_evennia.py` prepares local runtime state.
2. `timmy-local/evennia/world/build.py` constructs rooms, exits, and Timmy’s in-world state.
3. `scripts/evennia/evennia_mcp_server.py` bridges runtime control to Hermes/MCP.
4. `evennia_tools/telemetry.py` writes JSONL event streams and metadata.
5. Example traces are stored under `~/.timmy/training-data/evennia` and repo examples.

### 4. Harness / routing flow

1. Prompt text enters `uniwizard/task_classifier.py`.
2. Candidate backends are ranked by type and complexity.
3. `uniwizard/quality_scorer.py` records observed backend performance.
4. `uniwizard/self_grader.py` parses Hermes sessions and grades task quality.
5. `uni-wizard/` branches this idea into increasingly ambitious task routers, telemetry systems, and adaptation engines.

### 5. Morrowind agent flow

1. OpenMW state is parsed from logs or screenshots.
2. `morrowind/local_brain.py` or `pilot.py` builds a local control prompt.
3. A local model or deterministic motor layer selects an action.
4. `morrowind/mcp_server.py` or local automation injects keys/actions.
5. Trajectories are logged under `~/.timmy/morrowind/`.

## Key Abstractions

### `TowerGame` / `GameState` / `Phase` / `Room`

File: `scripts/tower_game.py`

- the cleanest narrative-engine abstraction in the repo
- models room state, trust, energy, narrative phase, dialogue, and monologue
- heavily covered by `tests/test_tower_game.py`
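As a shape reference, an engine of this kind usually reduces to a small state record plus guarded transitions. The sketch below is illustrative only; apart from the four class names above, every field and rule is an assumption, not the actual `scripts/tower_game.py` API:

```python
from dataclasses import dataclass, field
from enum import Enum

class Phase(Enum):
    ARRIVAL = "arrival"
    EXPLORATION = "exploration"
    TRUST = "trust"

@dataclass
class Room:
    name: str
    description: str
    exits: dict = field(default_factory=dict)  # direction -> room name

@dataclass
class GameState:
    room: str = "entrance"
    trust: int = 0
    energy: int = 10
    phase: Phase = Phase.ARRIVAL

class TowerGame:
    """Minimal narrative engine: rooms, trust, energy, phase."""

    def __init__(self, rooms):
        self.rooms = rooms
        self.state = GameState()

    def move(self, direction):
        """Spend one energy to follow an exit; refuse invalid moves."""
        exits = self.rooms[self.state.room].exits
        if direction not in exits or self.state.energy <= 0:
            return False
        self.state.room = exits[direction]
        self.state.energy -= 1
        return True
```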
### `TimmyCharacter` and `TimmyRoom`

Files:

- `timmy-local/evennia/typeclasses/characters.py`
- `timmy-local/evennia/typeclasses/rooms.py`

Purpose:

- represent Timmy as a persistent Evennia character with task, knowledge, tool, preference, and metric state
- model rooms as functional workspaces like Workshop, Library, Observatory, Forge, and Dispatch

### In-world tool commands

File: `timmy-local/evennia/commands/tools.py`

Key commands:

- `read`, `write`, `search`
- `git_status`, `git_log`, `git_pull`
- `sysinfo`, `health`
- `think`
- `gitea_issues`
- room navigation commands

This is the repo’s most direct “world shell as operations interface” abstraction.

### `UniWizardHarness`

Files:

- `uni-wizard/harness.py`
- `uni-wizard/v2/harness.py`

Purpose:

- unify system, git, network, and file-style actions behind one harness surface
- later versions add provenance, house routing, telemetry, and policy logic

The downside is namespace drift across versions.

### `TaskClassifier` and `ClassificationResult`

File: `uniwizard/task_classifier.py`

Purpose:

- classify prompts by type and complexity
- recommend backend order for execution

This is a central abstraction in the repo’s routing experiments.
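The classify-then-rank contract can be sketched in a few lines. Everything below beyond the two class names is invented for illustration (the keyword lists, backend names, and thresholds); the real module's heuristics are certainly richer:

```python
from dataclasses import dataclass

@dataclass
class ClassificationResult:
    task_type: str
    complexity: str      # "low" or "high"
    backend_order: list  # preferred backends, best first

class TaskClassifier:
    """Toy keyword classifier; backend names are placeholders."""

    KEYWORDS = {
        "code": ("def ", "class ", "import ", "refactor", "fix bug"),
        "research": ("summarize", "analyze", "compare", "explain"),
    }

    def classify(self, prompt):
        text = prompt.lower()
        task_type = "chat"
        for kind, words in self.KEYWORDS.items():
            if any(w in text for w in words):
                task_type = kind
                break
        complexity = "high" if len(prompt) > 500 else "low"
        # Local-first for cheap chat; escalate structured work to a
        # stronger backend first.
        if task_type == "chat":
            order = ["local-llama", "cloud-fallback"]
        else:
            order = ["cloud-fallback", "local-llama"]
        return ClassificationResult(task_type, complexity, order)
```

The value of the pattern is the explicit `backend_order`: the router can walk the list and fall back without re-deciding anything.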
### `PatternDatabase` / `IntelligenceEngine`

File: `uni-wizard/v3/intelligence_engine.py`

Purpose:

- store execution patterns and model performance
- derive adaptation events and predictions

This is a prototype intelligence layer, not yet a clean production subsystem.

### `MeaningKernel`

File: `scripts/know_thy_father/synthesize_kernels.py`

Purpose:

- normalize archive/media findings into structured semantic outputs
- feed downstream summaries and fact-style exports

### Evennia telemetry helpers

File: `evennia_tools/telemetry.py`

Purpose:

- produce deterministic session-day directories, event log paths, metadata files, and append-only telemetry streams
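Append-only JSONL with deterministic day directories is simple to reproduce; a sketch of the pattern (the function names here are assumptions, not the actual `evennia_tools.telemetry` API):

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def session_dir(root, day=None):
    """Deterministic session-day directory, e.g. <root>/2026-04-15/."""
    day = day or datetime.now(timezone.utc).strftime("%Y-%m-%d")
    path = Path(root) / day
    path.mkdir(parents=True, exist_ok=True)
    return path

def append_event(root, event):
    """Append one timestamped event to the day's JSONL stream."""
    path = session_dir(root) / "events.jsonl"
    record = {"ts": datetime.now(timezone.utc).isoformat(), **event}
    with path.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")
    return path
```

One JSON object per line keeps the stream greppable and crash-tolerant: a partial final line can be dropped without corrupting earlier events.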
### `TimmyClient`

File: `infrastructure/timmy-bridge/client/timmy_client.py`

Purpose:

- send heartbeat/artifact state toward the bridge/relay system

This is important structurally, but still partly demo-grade in implementation quality.

## API Surface

## CLI commands

### Proof / hygiene

```bash
python3 scripts/detect_secrets.py <paths>
python3 scripts/trajectory_sanitize.py --input <in> --output <out>
python3 scripts/fleet_milestones.py --list
python3 scripts/fleet_milestones.py --trigger <milestone>
python3 scripts/failover_monitor.py
```

### Evennia

```bash
python3 scripts/evennia/bootstrap_local_evennia.py
python3 scripts/evennia/verify_local_evennia.py
python3 scripts/evennia/evennia_mcp_server.py
python3 scripts/evennia/eval_world_basics.py
python3 scripts/evennia/generate_sample_trace.py
```

### Archive / research

```bash
python3 scripts/twitter_archive/extract_archive.py ...
python3 scripts/twitter_archive/extract_media_manifest.py ...
python3 scripts/twitter_archive/analyze_media.py --status
python3 scripts/twitter_archive/analyze_media.py --batch <n>
python3 scripts/know_thy_father/index_media.py ...
python3 scripts/know_thy_father/synthesize_kernels.py ...
python3 scripts/know_thy_father/crossref_audit.py ...
```
### Harness / routing

```bash
python3 uniwizard/self_grader.py ...
python3 uniwizard/quality_scorer.py ...
python3 uniwizard/task_classifier.py ...
python3 uni-wizard/harness.py ...
python3 uni-wizard/daemons/task_router.py ...
python3 uni-wizard/daemons/health_daemon.py ...
```

### Game-agent

```bash
python3 morrowind/mcp_server.py
python3 morrowind/local_brain.py
python3 morrowind/pilot.py
python3 morrowind/play.py
```

## In-world commands

Documented in `timmy-local/evennia/commands/tools.py`:

- `read <path>`
- `write <path> = <content>`
- `search <pattern>`
- `git status`
- `git log [n]`
- `git pull`
- `sysinfo`
- `health`
- `think <prompt>`
- `gitea issues`
- movement/status commands for Workshop, Library, Observatory

## Test Coverage

### Current verified state

Commands run on this branch:

- `pytest -q tests/test_fleet_milestones.py tests/test_failover_monitor.py` → `7 passed`
- `pytest -q tests` → `150 passed, 19 warnings`
- `pytest -q` → `260 passed, 4 failed, 45 warnings`

### Tests added in this branch

- `tests/test_fleet_milestones.py`
  - covers state persistence, dry-run behavior, unknown milestones, idempotence
- `tests/test_failover_monitor.py`
  - covers online/offline detection and status-file emission

These close two concrete gaps around previously untested operational scripts.

### Well-covered areas

- secret detection
- trajectory sanitization
- Tower game (`scripts/tower_game.py`)
- Know Thy Father pipeline
- Twitter archive pipeline
- Evennia telemetry/layout helper modules
- documentation proof-policy assertions

### Weak or missing coverage

- no real end-to-end coverage for `scripts/evennia/bootstrap_local_evennia.py`
- no runtime coverage for `scripts/evennia/verify_local_evennia.py` against a real Evennia world
- no automated tests for `morrowind/`
- no meaningful automated coverage for `infrastructure/timmy-bridge/`
- most shell/provisioning scripts remain untested
- `uni-wizard/v2` and `uni-wizard/v3` are not full-repo-pytest clean

### Concrete failing area

Full-repo pytest currently fails in `uni-wizard/v2/tests/test_author_whitelist.py` because `uni-wizard/v2/task_router_daemon.py` imports `harness` and resolves to `uni-wizard/harness.py` instead of the version-local v2 harness. That is a real namespace collision, not flaky test noise.
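The mechanism is plain `sys.path` shadowing: a flat `import harness` binds whichever `harness.py` appears first on `sys.path`, so the winner depends on how the process was launched. A self-contained demonstration with two throwaway modules (all paths here are scratch fixtures, not the real repo layout):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Two directories that each define a module named `harness`.
root = Path(tempfile.mkdtemp())
(root / "repo").mkdir()
(root / "repo" / "harness.py").write_text("VERSION = 'top-level'\n")
(root / "repo" / "v2").mkdir()
(root / "repo" / "v2" / "harness.py").write_text("VERSION = 'v2'\n")

# `python -c` puts the working directory at the front of sys.path,
# so the same flat import resolves differently depending on cwd.
script = "import harness; print(harness.VERSION)"

out_repo = subprocess.run([sys.executable, "-c", script], cwd=root / "repo",
                          capture_output=True, text=True).stdout.strip()
out_v2 = subprocess.run([sys.executable, "-c", script], cwd=root / "repo" / "v2",
                        capture_output=True, text=True).stdout.strip()
print(out_repo, out_v2)  # top-level v2
```

Packaging each version directory (an `__init__.py` plus relative imports, or distinct module names per generation) is the usual cure.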
## Security Considerations

### 1. Hard-coded IPs and host-specific paths

Examples appear across the repo, including:

- `timmy-local/README.md`
- `scripts/setup-uni-wizard.sh`
- `scripts/provision-timmy-vps.sh`
- `scripts/emacs-fleet-poll.sh`
- `scripts/emacs-fleet-bridge.py`
- `scripts/failover_monitor.py`

Risk:

- fragile deploy assumptions
- easy environment drift
- accidental disclosure in public docs or configs

### 2. Broad mutation surfaces in world-shell commands

`timmy-local/evennia/commands/tools.py` exposes host file, git, search, system, and network operations inside the world abstraction.

Risk:

- large blast radius if command permissions are weak
- hard to separate playful world interaction from privileged host mutation

### 3. CI is smoke-only and partly broken

`.gitea/workflows/smoke.yml` does not run pytest.

Worse, its JSON parse command is broken as written:

```bash
find . -name '*.json' | xargs -r python3 -m json.tool > /dev/null
```

`python3 -m json.tool` accepts at most one input file (plus an optional output file) per invocation, so the batched argument list that `xargs` builds here triggers an argument error; the smoke check can fail outright or give a misleading signal.
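A per-file invocation avoids the limitation. One safer form, demonstrated here against a scratch directory (in CI the `find` would start at the repo root):

```bash
tmp=$(mktemp -d)
echo '{"ok": true}' > "$tmp/a.json"
echo '[1, 2, 3]' > "$tmp/b.json"

# json.tool accepts one input file per invocation, so -n1 is required;
# -print0/-0 keeps unusual filenames safe.
find "$tmp" -name '*.json' -print0 \
  | xargs -0 -r -n1 python3 -m json.tool > /dev/null \
  && echo "PASS: all JSON parses"
```

`xargs` exits non-zero if any single invocation fails, so the gate still trips on the first invalid file.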
### 4. Placeholder-grade crypto / bridge logic

`infrastructure/timmy-bridge/` contains meaningful structure, but parts of the relay/client story are still simplified or placeholder quality.

Risk:

- implementation may look more production-ready than it is
- trust assumptions may exceed actual cryptographic guarantees

### 5. Offensive tooling co-located with ops code

The `wizards/.../red-teaming/godmode/...` lane contains explicit jailbreak automation assets.

Risk:

- policy confusion
- accidental inclusion in downstream automation
- higher review burden for a repo that also houses operational infrastructure

### 6. Deprecated UTC calls

Multiple tested modules still use `datetime.utcnow()` and emit deprecation warnings.

Risk:

- future Python compatibility debt
- noisy test output that can hide more important warnings
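The drop-in replacement is the timezone-aware form, which produces the same instant but with `tzinfo` attached:

```python
from datetime import datetime, timezone

# Deprecated call emitting the warnings:
#   datetime.utcnow()

def utc_stamp():
    """Timezone-aware UTC timestamp in the usual trailing-Z form."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
```

One caveat: `datetime.now(timezone.utc).isoformat()` ends in `+00:00` rather than a bare `Z`, so callers that grep for `Z` suffixes need the explicit `strftime` format above.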
## Dependencies and External Assumptions

### Python / system

- Python 3.11 in CI
- `pytest`
- `PyYAML` for YAML smoke parsing
- `sqlite3`
- shell utilities like `bash`, `ping`, `grep`, `xargs`

### Local-runtime assumptions

- Hermes local session/state under `~/.hermes/`
- Timmy local state under `~/.timmy/`
- Evennia runtime availability for world-shell scripts
- local inference endpoints (llama.cpp / Ollama / localhost services)
- `ffmpeg` / `ffprobe` for media analysis paths
- OpenMW / Apple automation for `morrowind/`
- SSH/systemd availability for fleet scripts

## Deployment and Operability

The most important deploy fact is this:

- live orchestration is described in `OPERATIONS.md` as Hermes + `timmy-config` sidecar
- `timmy-home` supplies workspace scripts, experiments, runbooks, and artifacts around that live system

Practical run surfaces:

- CI smoke: `.gitea/workflows/smoke.yml`
- local tests: `pytest -q tests`
- Evennia world lane: `scripts/evennia/*.py`
- archive lane: `scripts/twitter_archive/*.py`, `scripts/know_thy_father/*.py`
- local-world experiments: `timmy-local/`
- harness experiments: `uniwizard/`, `uni-wizard/`

## Duplication and Drift Candidates

### High-confidence duplication

- `scripts/tower_game.py`
- `timmy-world/game.py`
- `evennia/timmy_world/game.py`
- `evennia/timmy_world/world/game.py`

These are overlapping Tower implementations with different behavior and shared concepts.

### Architecture-generation drift

- `uniwizard/`
- `uni-wizard/`
- `uni-wizard/v2/`
- `uni-wizard/v3/`
- `uni-wizard/v4/`

Multiple generations coexist with conflicting import assumptions.

### Pipeline overlap

- `twitter-archive/`
- `scripts/twitter_archive/`
- `scripts/know_thy_father/`

These lanes overlap in mission and artifact shape.

## Performance and Scaling Notes

- most heavy data volume lives in `training-data/`, so repo-wide file scans are expensive by default
- smoke commands that blindly walk all JSON files will age poorly as artifact volume grows
- archive/media pipelines depend on batch processing and checkpointing to remain tractable
- routing/telemetry systems rely heavily on local SQLite and JSONL append-only files, which is simple and inspectable but may become contention-prone under sustained automation

## Recommended Follow-up Work

1. Fix `.gitea/workflows/smoke.yml` so JSON validation iterates file-by-file and pytest is part of CI. Filed as issue #715.
2. Resolve `uni-wizard` namespace collisions so `pytest -q` is green repo-wide. Filed as issue #716.
3. Decide which Tower implementation is canonical and retire the others or clearly mark them experimental.
4. Separate production-grade bridge/runtime code from placeholder or speculative prototypes.
5. Centralize host/path/IP configuration instead of embedding machine-specific values in docs and scripts.
6. Add end-to-end verification for the Evennia runtime lane.
7. Add at least smoke coverage for the `morrowind/` and `timmy-bridge/` lanes.

## Verification Notes

This GENOME is based on:

- direct inspection of the repo root, top-level metrics, and key runtime docs
- direct test execution on this branch
- direct reproduction of the broken CI JSON-parse command
- targeted new tests added for untested operational scripts
- deeper file-level analysis across Evennia, archive, harness, and game-agent lanes

It should be read as a map of a workspace monorepo with several real subsystems and several prototype subsystems, not as documentation for one singular deployable app.
@@ -1,98 +0,0 @@

# Encrypted Hermes Backup Pipeline

Issue: `timmy-home#693`

This pipeline creates a nightly encrypted archive of `~/.hermes`, stores a local encrypted copy, uploads it to remote storage, and supports restore verification.

## What gets backed up

By default the pipeline archives:

- `~/.hermes/config.yaml`
- `~/.hermes/state.db`
- `~/.hermes/sessions/`
- `~/.hermes/cron/`
- any other files under `~/.hermes`

Override the source with `BACKUP_SOURCE_DIR=/path/to/.hermes`.
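The `.enc` suffix and the passphrase file suggest symmetric encryption of a tarball. A plausible sketch of that core step, run here against scratch directories (the actual `scripts/backup_pipeline.sh` may use different cipher options and layout):

```bash
SRC=$(mktemp -d)                      # stand-in for ~/.hermes
mkdir -p "$SRC/sessions"
echo 'model: demo' > "$SRC/config.yaml"

PASSFILE=$(mktemp)
echo 'correct horse battery staple' > "$PASSFILE"

OUT=$(mktemp -d)
STAMP=20260415-030000

# Archive, encrypt with the passphrase file, then remove the plaintext
# so only the .enc copy survives.
tar -C "$(dirname "$SRC")" -czf "$OUT/hermes-backup-$STAMP.tar.gz" "$(basename "$SRC")"
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in "$OUT/hermes-backup-$STAMP.tar.gz" \
  -out "$OUT/hermes-backup-$STAMP.tar.gz.enc" \
  -pass "file:$PASSFILE"
rm "$OUT/hermes-backup-$STAMP.tar.gz"
echo "wrote $OUT/hermes-backup-$STAMP.tar.gz.enc"
```

Decryption reverses the pipeline: `openssl enc -d ... | tar -xzf -`.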
## Backup command

```bash
BACKUP_PASSPHRASE_FILE=~/.config/timmy/backup.passphrase \
BACKUP_NAS_TARGET=/Volumes/timmy-nas/hermes-backups \
bash scripts/backup_pipeline.sh
```

The script writes:

- local encrypted copy: `~/.timmy-backups/hermes/<timestamp>/hermes-backup-<timestamp>.tar.gz.enc`
- local manifest: `~/.timmy-backups/hermes/<timestamp>/hermes-backup-<timestamp>.json`
- log file: `~/.timmy-backups/hermes/logs/backup_pipeline.log`

## Nightly schedule

Run every night at 03:00:

```cron
0 3 * * * cd /Users/apayne/.timmy/timmy-home && BACKUP_PASSPHRASE_FILE=/Users/apayne/.config/timmy/backup.passphrase BACKUP_NAS_TARGET=/Volumes/timmy-nas/hermes-backups bash scripts/backup_pipeline.sh >> /Users/apayne/.timmy-backups/hermes/logs/cron.log 2>&1
```

## Remote targets

At least one remote target must be configured.

### Local NAS

Use a mounted path:

```bash
BACKUP_NAS_TARGET=/Volumes/timmy-nas/hermes-backups
```

The pipeline copies the encrypted archive and manifest into `<BACKUP_NAS_TARGET>/<timestamp>/`.

### S3-compatible storage

```bash
BACKUP_PASSPHRASE_FILE=~/.config/timmy/backup.passphrase \
BACKUP_S3_URI=s3://timmy-backups/hermes \
AWS_ENDPOINT_URL=https://minio.example.com \
bash scripts/backup_pipeline.sh
```

Notes:

- `aws` CLI must be installed if `BACKUP_S3_URI` is set.
- `AWS_ENDPOINT_URL` is optional and is used for MinIO, R2, and other S3-compatible endpoints.

## Restore playbook

Restore an encrypted archive into a clean target root:

```bash
BACKUP_PASSPHRASE_FILE=~/.config/timmy/backup.passphrase \
bash scripts/restore_backup.sh \
  /Volumes/timmy-nas/hermes-backups/20260415-030000/hermes-backup-20260415-030000.tar.gz.enc \
  /tmp/hermes-restore
```

Result:

- restored tree lands at `/tmp/hermes-restore/.hermes`
- if a sibling manifest exists, the restore script verifies the archive SHA256 before decrypting
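That pre-decrypt check is what stops a truncated or tampered archive from being restored. A sketch of the comparison, with the manifest's `sha256` field name assumed:

```bash
ARCHIVE=$(mktemp)
printf 'pretend-encrypted-bytes\n' > "$ARCHIVE"

# Record the digest in a sibling JSON manifest at backup time.
EXPECTED=$(openssl dgst -sha256 -r "$ARCHIVE" | awk '{print $1}')
MANIFEST="$ARCHIVE.json"
printf '{"sha256": "%s"}\n' "$EXPECTED" > "$MANIFEST"

# At restore time, recompute and compare before decrypting anything.
ACTUAL=$(openssl dgst -sha256 -r "$ARCHIVE" | awk '{print $1}')
RECORDED=$(python3 -c "import json,sys; print(json.load(open(sys.argv[1]))['sha256'])" "$MANIFEST")
if [ "$ACTUAL" = "$RECORDED" ]; then
  echo "checksum OK: safe to decrypt"
else
  echo "checksum mismatch: refusing to decrypt" >&2
  exit 1
fi
```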
## End-to-end verification

Run the regression suite:

```bash
python3 -m unittest discover -s tests -p 'test_backup_pipeline.py' -v
```

This proves:

1. the backup output is encrypted
2. plaintext archives do not leak into the backup destinations
3. the restore script recreates the original `.hermes` tree end-to-end
4. the pipeline refuses to run without a remote target
@@ -12,8 +12,6 @@ Quick-reference index for common operational tasks across the Timmy Foundation i

| Check fleet health | fleet-ops | `python3 scripts/fleet_readiness.py` |
| Agent scorecard | fleet-ops | `python3 scripts/agent_scorecard.py` |
| View fleet manifest | fleet-ops | `cat manifest.yaml` |
| Backup Hermes state | timmy-home | `BACKUP_PASSPHRASE_FILE=... BACKUP_NAS_TARGET=... bash scripts/backup_pipeline.sh` |
| Restore Hermes state | timmy-home | `BACKUP_PASSPHRASE_FILE=... bash scripts/restore_backup.sh <archive> <restore-root>` |

## the-nexus (Frontend + Brain)
@@ -1,170 +1,80 @@
#!/usr/bin/env bash
# backup_pipeline.sh — Nightly encrypted Hermes backup pipeline
# Refs: timmy-home #693, timmy-home #561
# backup_pipeline.sh — Daily fleet backup pipeline (FLEET-008)
# Refs: timmy-home #561
set -euo pipefail

DATESTAMP="${BACKUP_TIMESTAMP:-$(date +%Y%m%d-%H%M%S)}"
BACKUP_SOURCE_DIR="${BACKUP_SOURCE_DIR:-${HOME}/.hermes}"
BACKUP_ROOT="${BACKUP_ROOT:-${HOME}/.timmy-backups/hermes}"
BACKUP_LOG_DIR="${BACKUP_LOG_DIR:-${BACKUP_ROOT}/logs}"
BACKUP_RETENTION_DAYS="${BACKUP_RETENTION_DAYS:-14}"
BACKUP_S3_URI="${BACKUP_S3_URI:-}"
BACKUP_NAS_TARGET="${BACKUP_NAS_TARGET:-}"
AWS_ENDPOINT_URL="${AWS_ENDPOINT_URL:-}"
BACKUP_NAME="hermes-backup-${DATESTAMP}"
LOCAL_BACKUP_DIR="${BACKUP_ROOT}/${DATESTAMP}"
STAGE_DIR="$(mktemp -d "${TMPDIR:-/tmp}/timmy-backup.XXXXXX")"
PLAINTEXT_ARCHIVE="${STAGE_DIR}/${BACKUP_NAME}.tar.gz"
ENCRYPTED_ARCHIVE="${STAGE_DIR}/${BACKUP_NAME}.tar.gz.enc"
MANIFEST_PATH="${STAGE_DIR}/${BACKUP_NAME}.json"
ALERT_LOG="${BACKUP_LOG_DIR}/backup_pipeline.log"
PASSFILE_CLEANUP=""
BACKUP_ROOT="/backups/timmy"
DATESTAMP=$(date +%Y%m%d-%H%M%S)
BACKUP_DIR="${BACKUP_ROOT}/${DATESTAMP}"
LOG_DIR="/var/log/timmy"
ALERT_LOG="${LOG_DIR}/backup_pipeline.log"
mkdir -p "$BACKUP_DIR" "$LOG_DIR"

mkdir -p "$BACKUP_LOG_DIR"
TELEGRAM_BOT_TOKEN="${TELEGRAM_BOT_TOKEN:-}"
TELEGRAM_CHAT_ID="${TELEGRAM_CHAT_ID:-}"
OFFSITE_TARGET="${OFFSITE_TARGET:-}"

log() {
  echo "[$(date -Iseconds)] $1" | tee -a "$ALERT_LOG"
}
log() { echo "[$(date -Iseconds)] $1" | tee -a "$ALERT_LOG"; }

fail() {
  log "ERROR: $1"
  exit 1
}

cleanup() {
  rm -f "$PLAINTEXT_ARCHIVE"
  rm -rf "$STAGE_DIR"
  if [[ -n "$PASSFILE_CLEANUP" && -f "$PASSFILE_CLEANUP" ]]; then
    rm -f "$PASSFILE_CLEANUP"
send_telegram() {
  local msg="$1"
  if [[ -n "$TELEGRAM_BOT_TOKEN" && -n "$TELEGRAM_CHAT_ID" ]]; then
    curl -s -X POST "https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/sendMessage" \
      -d "chat_id=${TELEGRAM_CHAT_ID}" -d "text=${msg}" >/dev/null 2>&1 || true
  fi
}
trap cleanup EXIT

resolve_passphrase_file() {
  if [[ -n "${BACKUP_PASSPHRASE_FILE:-}" ]]; then
    [[ -f "$BACKUP_PASSPHRASE_FILE" ]] || fail "BACKUP_PASSPHRASE_FILE does not exist: $BACKUP_PASSPHRASE_FILE"
    echo "$BACKUP_PASSPHRASE_FILE"
    return
status=0

# --- Gitea repositories ---
if [[ -d /root/gitea ]]; then
  tar czf "${BACKUP_DIR}/gitea-repos.tar.gz" -C /root gitea 2>/dev/null || true
  log "Backed up Gitea repos"
fi

# --- Agent configs and state ---
for wiz in bezalel allegro ezra timmy; do
  if [[ -d "/root/wizards/${wiz}" ]]; then
    tar czf "${BACKUP_DIR}/${wiz}-home.tar.gz" -C /root/wizards "${wiz}" 2>/dev/null || true
    log "Backed up ${wiz} home"
  fi
done

  if [[ -n "${BACKUP_PASSPHRASE:-}" ]]; then
    PASSFILE_CLEANUP="${STAGE_DIR}/backup.passphrase"
    printf '%s' "$BACKUP_PASSPHRASE" > "$PASSFILE_CLEANUP"
    chmod 600 "$PASSFILE_CLEANUP"
    echo "$PASSFILE_CLEANUP"
    return
  fi
# --- System configs ---
cp /etc/crontab "${BACKUP_DIR}/crontab" 2>/dev/null || true
cp -r /etc/systemd/system "${BACKUP_DIR}/systemd" 2>/dev/null || true
log "Backed up system configs"

  fail "Set BACKUP_PASSPHRASE_FILE or BACKUP_PASSPHRASE before running the backup pipeline."
}
# --- Evennia worlds (if present) ---
if [[ -d /root/evennia ]]; then
  tar czf "${BACKUP_DIR}/evennia-worlds.tar.gz" -C /root evennia 2>/dev/null || true
  log "Backed up Evennia worlds"
fi

sha256_file() {
  local path="$1"
  if command -v shasum >/dev/null 2>&1; then
    shasum -a 256 "$path" | awk '{print $1}'
  elif command -v sha256sum >/dev/null 2>&1; then
    sha256sum "$path" | awk '{print $1}'
# --- Manifest ---
find "$BACKUP_DIR" -type f > "${BACKUP_DIR}/manifest.txt"
log "Backup manifest written"

# --- Offsite sync ---
if [[ -n "$OFFSITE_TARGET" ]]; then
  if rsync -az --delete "${BACKUP_DIR}/" "${OFFSITE_TARGET}/${DATESTAMP}/" 2>/dev/null; then
    log "Offsite sync completed"
  else
    python3 - <<'PY' "$path"
import hashlib
import pathlib
import sys
path = pathlib.Path(sys.argv[1])
h = hashlib.sha256()
with path.open('rb') as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b''):
        h.update(chunk)
print(h.hexdigest())
PY
    log "WARNING: Offsite sync failed"
    status=1
  fi
}

write_manifest() {
  python3 - <<'PY' "$1" "$2" "$3" "$4" "$5" "$6" "$7" "$8"
import json
import sys
manifest_path, source_dir, archive_name, archive_sha256, local_dir, s3_uri, nas_target, created_at = sys.argv[1:]
manifest = {
    "created_at": created_at,
    "source_dir": source_dir,
    "archive_name": archive_name,
    "archive_sha256": archive_sha256,
    "encryption": {
        "type": "openssl",
        "cipher": "aes-256-cbc",
        "pbkdf2": True,
        "iterations": 200000,
    },
    "destinations": {
        "local_dir": local_dir,
        "s3_uri": s3_uri or None,
        "nas_target": nas_target or None,
    },
}
with open(manifest_path, 'w', encoding='utf-8') as handle:
    json.dump(manifest, handle, indent=2)
    handle.write('\n')
PY
}

upload_to_nas() {
  local archive_path="$1"
  local manifest_path="$2"
  local target_root="$3"

  local target_dir="${target_root%/}/${DATESTAMP}"
  mkdir -p "$target_dir"
  cp "$archive_path" "$manifest_path" "$target_dir/"
  log "Uploaded backup to NAS target: $target_dir"
}

upload_to_s3() {
  local archive_path="$1"
  local manifest_path="$2"

  command -v aws >/dev/null 2>&1 || fail "BACKUP_S3_URI is set but aws CLI is not installed."

  local args=()
  if [[ -n "$AWS_ENDPOINT_URL" ]]; then
    args+=(--endpoint-url "$AWS_ENDPOINT_URL")
  fi

  aws "${args[@]}" s3 cp "$archive_path" "${BACKUP_S3_URI%/}/$(basename "$archive_path")"
  aws "${args[@]}" s3 cp "$manifest_path" "${BACKUP_S3_URI%/}/$(basename "$manifest_path")"
  log "Uploaded backup to S3 target: $BACKUP_S3_URI"
}

[[ -d "$BACKUP_SOURCE_DIR" ]] || fail "BACKUP_SOURCE_DIR does not exist: $BACKUP_SOURCE_DIR"
[[ -n "$BACKUP_NAS_TARGET" || -n "$BACKUP_S3_URI" ]] || fail "Set BACKUP_NAS_TARGET or BACKUP_S3_URI for remote backup storage."

PASSFILE="$(resolve_passphrase_file)"
mkdir -p "$LOCAL_BACKUP_DIR"

log "Creating archive from $BACKUP_SOURCE_DIR"
tar -czf "$PLAINTEXT_ARCHIVE" -C "$(dirname "$BACKUP_SOURCE_DIR")" "$(basename "$BACKUP_SOURCE_DIR")"

log "Encrypting archive"
openssl enc -aes-256-cbc -salt -pbkdf2 -iter 200000 \
  -pass "file:${PASSFILE}" \
  -in "$PLAINTEXT_ARCHIVE" \
  -out "$ENCRYPTED_ARCHIVE"

ARCHIVE_SHA256="$(sha256_file "$ENCRYPTED_ARCHIVE")"
CREATED_AT="$(date -u '+%Y-%m-%dT%H:%M:%SZ')"
write_manifest "$MANIFEST_PATH" "$BACKUP_SOURCE_DIR" "$(basename "$ENCRYPTED_ARCHIVE")" "$ARCHIVE_SHA256" "$LOCAL_BACKUP_DIR" "$BACKUP_S3_URI" "$BACKUP_NAS_TARGET" "$CREATED_AT"

cp "$ENCRYPTED_ARCHIVE" "$MANIFEST_PATH" "$LOCAL_BACKUP_DIR/"
rm -f "$PLAINTEXT_ARCHIVE"
log "Encrypted backup stored locally: ${LOCAL_BACKUP_DIR}/$(basename "$ENCRYPTED_ARCHIVE")"

if [[ -n "$BACKUP_NAS_TARGET" ]]; then
  upload_to_nas "$ENCRYPTED_ARCHIVE" "$MANIFEST_PATH" "$BACKUP_NAS_TARGET"
fi

if [[ -n "$BACKUP_S3_URI" ]]; then
  upload_to_s3 "$ENCRYPTED_ARCHIVE" "$MANIFEST_PATH"
# --- Retention: keep last 7 days ---
find "$BACKUP_ROOT" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} + 2>/dev/null || true
log "Retention applied (7 days)"

if [[ "$status" -eq 0 ]]; then
  log "Backup pipeline completed: ${BACKUP_DIR}"
  send_telegram "✅ Daily backup completed: ${DATESTAMP}"
else
  log "Backup pipeline completed with WARNINGS: ${BACKUP_DIR}"
  send_telegram "⚠️ Daily backup completed with warnings: ${DATESTAMP}"
fi

find "$BACKUP_ROOT" -mindepth 1 -maxdepth 1 -type d -name '20*' -mtime "+${BACKUP_RETENTION_DAYS}" -exec rm -rf {} + 2>/dev/null || true
log "Retention applied (${BACKUP_RETENTION_DAYS} days)"
log "Backup pipeline completed successfully"
exit "$status"
@@ -1,97 +0,0 @@
#!/usr/bin/env bash
# restore_backup.sh — Restore an encrypted Hermes backup archive
# Usage: restore_backup.sh /path/to/hermes-backup-YYYYmmdd-HHMMSS.tar.gz.enc /restore/root
set -euo pipefail

ARCHIVE_PATH="${1:-}"
RESTORE_ROOT="${2:-}"
STAGE_DIR="$(mktemp -d "${TMPDIR:-/tmp}/timmy-restore.XXXXXX")"
PLAINTEXT_ARCHIVE="${STAGE_DIR}/restore.tar.gz"
PASSFILE_CLEANUP=""

cleanup() {
  rm -f "$PLAINTEXT_ARCHIVE"
  rm -rf "$STAGE_DIR"
  if [[ -n "$PASSFILE_CLEANUP" && -f "$PASSFILE_CLEANUP" ]]; then
    rm -f "$PASSFILE_CLEANUP"
  fi
}
trap cleanup EXIT

fail() {
  echo "ERROR: $1" >&2
  exit 1
}

resolve_passphrase_file() {
  if [[ -n "${BACKUP_PASSPHRASE_FILE:-}" ]]; then
    [[ -f "$BACKUP_PASSPHRASE_FILE" ]] || fail "BACKUP_PASSPHRASE_FILE does not exist: $BACKUP_PASSPHRASE_FILE"
    echo "$BACKUP_PASSPHRASE_FILE"
    return
  fi

  if [[ -n "${BACKUP_PASSPHRASE:-}" ]]; then
    PASSFILE_CLEANUP="${STAGE_DIR}/backup.passphrase"
    printf '%s' "$BACKUP_PASSPHRASE" > "$PASSFILE_CLEANUP"
    chmod 600 "$PASSFILE_CLEANUP"
    echo "$PASSFILE_CLEANUP"
    return
  fi

  fail "Set BACKUP_PASSPHRASE_FILE or BACKUP_PASSPHRASE before restoring a backup."
}

sha256_file() {
  local path="$1"
  if command -v shasum >/dev/null 2>&1; then
    shasum -a 256 "$path" | awk '{print $1}'
  elif command -v sha256sum >/dev/null 2>&1; then
    sha256sum "$path" | awk '{print $1}'
  else
    python3 - <<'PY' "$path"
import hashlib
import pathlib
import sys
path = pathlib.Path(sys.argv[1])
h = hashlib.sha256()
with path.open('rb') as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b''):
        h.update(chunk)
print(h.hexdigest())
PY
  fi
}

[[ -n "$ARCHIVE_PATH" ]] || fail "Usage: restore_backup.sh /path/to/archive.tar.gz.enc /restore/root"
[[ -n "$RESTORE_ROOT" ]] || fail "Usage: restore_backup.sh /path/to/archive.tar.gz.enc /restore/root"
[[ -f "$ARCHIVE_PATH" ]] || fail "Archive not found: $ARCHIVE_PATH"

if [[ "$ARCHIVE_PATH" == *.tar.gz.enc ]]; then
  MANIFEST_PATH="${ARCHIVE_PATH%.tar.gz.enc}.json"
else
  MANIFEST_PATH=""
fi

if [[ -n "$MANIFEST_PATH" && -f "$MANIFEST_PATH" ]]; then
  EXPECTED_SHA="$(python3 - <<'PY' "$MANIFEST_PATH"
import json
import sys
with open(sys.argv[1], 'r', encoding='utf-8') as handle:
    manifest = json.load(handle)
print(manifest['archive_sha256'])
PY
)"
  ACTUAL_SHA="$(sha256_file "$ARCHIVE_PATH")"
  [[ "$EXPECTED_SHA" == "$ACTUAL_SHA" ]] || fail "Archive SHA256 mismatch: expected $EXPECTED_SHA got $ACTUAL_SHA"
fi

PASSFILE="$(resolve_passphrase_file)"
mkdir -p "$RESTORE_ROOT"

openssl enc -d -aes-256-cbc -salt -pbkdf2 -iter 200000 \
  -pass "file:${PASSFILE}" \
  -in "$ARCHIVE_PATH" \
  -out "$PLAINTEXT_ARCHIVE"

tar -xzf "$PLAINTEXT_ARCHIVE" -C "$RESTORE_ROOT"
echo "Restored backup into $RESTORE_ROOT"
@@ -1,103 +0,0 @@
#!/usr/bin/env python3
import os
import subprocess
import tempfile
import unittest
from pathlib import Path


ROOT = Path(__file__).resolve().parents[1]
BACKUP_SCRIPT = ROOT / "scripts" / "backup_pipeline.sh"
RESTORE_SCRIPT = ROOT / "scripts" / "restore_backup.sh"


class TestBackupPipeline(unittest.TestCase):
    def setUp(self) -> None:
        self.tempdir = tempfile.TemporaryDirectory()
        self.base = Path(self.tempdir.name)
        self.home = self.base / "home"
        self.source_dir = self.home / ".hermes"
        self.source_dir.mkdir(parents=True)
        (self.source_dir / "sessions").mkdir()
        (self.source_dir / "cron").mkdir()
        (self.source_dir / "config.yaml").write_text("model: local-first\n")
        (self.source_dir / "sessions" / "session.jsonl").write_text('{"role":"assistant","content":"hello"}\n')
        (self.source_dir / "cron" / "jobs.json").write_text('{"jobs": 1}\n')
        (self.source_dir / "state.db").write_bytes(b"sqlite-state")

        self.backup_root = self.base / "backup-root"
        self.nas_target = self.base / "nas-target"
        self.restore_root = self.base / "restore-root"
        self.log_dir = self.base / "logs"
        self.passphrase_file = self.base / "backup.passphrase"
        self.passphrase_file.write_text("correct horse battery staple\n")

    def tearDown(self) -> None:
        self.tempdir.cleanup()

    def _env(self, *, include_remote: bool = True) -> dict[str, str]:
        env = os.environ.copy()
        env.update(
            {
                "HOME": str(self.home),
                "BACKUP_SOURCE_DIR": str(self.source_dir),
                "BACKUP_ROOT": str(self.backup_root),
                "BACKUP_LOG_DIR": str(self.log_dir),
                "BACKUP_PASSPHRASE_FILE": str(self.passphrase_file),
            }
        )
        if include_remote:
            env["BACKUP_NAS_TARGET"] = str(self.nas_target)
        return env

    def test_backup_encrypts_and_restore_round_trips(self) -> None:
        backup = subprocess.run(
            ["bash", str(BACKUP_SCRIPT)],
            capture_output=True,
            text=True,
            env=self._env(),
            cwd=ROOT,
        )
        self.assertEqual(backup.returncode, 0, msg=backup.stdout + backup.stderr)

        encrypted_archives = sorted(self.nas_target.rglob("*.tar.gz.enc"))
        self.assertEqual(len(encrypted_archives), 1, msg=f"expected one encrypted archive, found: {encrypted_archives}")
        archive_path = encrypted_archives[0]
        self.assertNotIn(b"model: local-first", archive_path.read_bytes())

        manifests = sorted(self.nas_target.rglob("*.json"))
        self.assertEqual(len(manifests), 1, msg=f"expected one manifest, found: {manifests}")

        plaintext_archives = sorted(self.backup_root.rglob("*.tar.gz")) + sorted(self.nas_target.rglob("*.tar.gz"))
        self.assertEqual(plaintext_archives, [], msg=f"plaintext archives leaked: {plaintext_archives}")

        restore = subprocess.run(
            ["bash", str(RESTORE_SCRIPT), str(archive_path), str(self.restore_root)],
            capture_output=True,
            text=True,
            env=self._env(),
            cwd=ROOT,
        )
        self.assertEqual(restore.returncode, 0, msg=restore.stdout + restore.stderr)

        restored_hermes = self.restore_root / ".hermes"
        self.assertTrue(restored_hermes.exists())
        self.assertEqual((restored_hermes / "config.yaml").read_text(), "model: local-first\n")
        self.assertEqual((restored_hermes / "sessions" / "session.jsonl").read_text(), '{"role":"assistant","content":"hello"}\n')
        self.assertEqual((restored_hermes / "cron" / "jobs.json").read_text(), '{"jobs": 1}\n')
        self.assertEqual((restored_hermes / "state.db").read_bytes(), b"sqlite-state")

    def test_backup_requires_remote_target(self) -> None:
        backup = subprocess.run(
            ["bash", str(BACKUP_SCRIPT)],
            capture_output=True,
            text=True,
            env=self._env(include_remote=False),
            cwd=ROOT,
        )
        self.assertNotEqual(backup.returncode, 0)
        self.assertIn("BACKUP_NAS_TARGET or BACKUP_S3_URI", backup.stdout + backup.stderr)


if __name__ == "__main__":
    unittest.main(verbosity=2)
45
tests/test_failover_monitor.py
Normal file
@@ -0,0 +1,45 @@
import json
import subprocess

from scripts import failover_monitor as monitor


def test_check_health_reports_online(monkeypatch):
    def fake_check_call(cmd, stdout=None):
        assert cmd[:4] == ["ping", "-c", "1", "-W"]
        return 0

    monkeypatch.setattr(monitor.subprocess, "check_call", fake_check_call)
    assert monitor.check_health("1.2.3.4") == "ONLINE"


def test_check_health_reports_offline(monkeypatch):
    def fake_check_call(cmd, stdout=None):
        raise subprocess.CalledProcessError(returncode=1, cmd=cmd)

    monkeypatch.setattr(monitor.subprocess, "check_call", fake_check_call)
    assert monitor.check_health("1.2.3.4") == "OFFLINE"


def test_main_writes_status_file_and_prints(tmp_path, monkeypatch, capsys):
    monkeypatch.setattr(monitor, "STATUS_FILE", tmp_path / "failover_status.json")
    monkeypatch.setattr(monitor, "FLEET", {"ezra": "1.1.1.1", "bezalel": "2.2.2.2"})
    monkeypatch.setattr(monitor.time, "time", lambda: 1713148800.0)
    monkeypatch.setattr(
        monitor,
        "check_health",
        lambda host: "ONLINE" if host == "1.1.1.1" else "OFFLINE",
    )

    monitor.main()

    payload = json.loads(monitor.STATUS_FILE.read_text())
    assert payload == {
        "timestamp": 1713148800.0,
        "fleet": {"ezra": "ONLINE", "bezalel": "OFFLINE"},
    }

    captured = capsys.readouterr()
    assert "ALLEGRO FAILOVER MONITOR" in captured.out.upper()
    assert "EZRA: ONLINE" in captured.out
    assert "BEZALEL: OFFLINE" in captured.out
82
tests/test_fleet_milestones.py
Normal file
@@ -0,0 +1,82 @@
import json
from datetime import datetime

import pytest

from scripts import fleet_milestones as fm


class FixedDateTime:
    @classmethod
    def utcnow(cls):
        return datetime(2026, 4, 15, 1, 2, 3)


def test_trigger_persists_state_and_log(tmp_path, monkeypatch, capsys):
    state_file = tmp_path / "milestones.json"
    log_file = tmp_path / "fleet_milestones.log"

    monkeypatch.setattr(fm, "STATE_FILE", state_file)
    monkeypatch.setattr(fm, "LOG_FILE", log_file)
    monkeypatch.setattr(fm, "datetime", FixedDateTime)

    fm.trigger("health_check_first_run")

    saved = json.loads(state_file.read_text())
    assert saved["health_check_first_run"] == {
        "triggered_at": "2026-04-15T01:02:03Z",
        "phase": 1,
    }

    log_lines = log_file.read_text().strip().splitlines()
    assert len(log_lines) == 1
    assert "First automated health check ran" in log_lines[0]

    captured = capsys.readouterr()
    assert "MILESTONE" in captured.out


def test_trigger_dry_run_logs_without_persisting_state(tmp_path, monkeypatch):
    state_file = tmp_path / "milestones.json"
    log_file = tmp_path / "fleet_milestones.log"

    monkeypatch.setattr(fm, "STATE_FILE", state_file)
    monkeypatch.setattr(fm, "LOG_FILE", log_file)
    monkeypatch.setattr(fm, "datetime", FixedDateTime)

    fm.trigger("backup_first_success", dry_run=True)

    assert not state_file.exists()
    assert "First automated backup completed" in log_file.read_text()


def test_trigger_unknown_key_exits(monkeypatch):
    monkeypatch.setattr(fm, "datetime", FixedDateTime)
    with pytest.raises(SystemExit) as exc:
        fm.trigger("not-a-real-milestone")
    assert exc.value.code == 1


def test_trigger_is_idempotent_once_recorded(tmp_path, monkeypatch, capsys):
    state_file = tmp_path / "milestones.json"
    log_file = tmp_path / "fleet_milestones.log"
    state_file.write_text(
        json.dumps(
            {
                "health_check_first_run": {
                    "triggered_at": "2026-04-01T00:00:00Z",
                    "phase": 1,
                }
            }
        )
    )

    monkeypatch.setattr(fm, "STATE_FILE", state_file)
    monkeypatch.setattr(fm, "LOG_FILE", log_file)
    monkeypatch.setattr(fm, "datetime", FixedDateTime)

    fm.trigger("health_check_first_run")

    assert not log_file.exists()
    captured = capsys.readouterr()
    assert "already triggered" in captured.out