Compare commits — 2 commits: `aad1b0e652` → `e47e6506b4`

Changed file: `GENOME.md` (299)
@@ -1,209 +1,144 @@
# GENOME.md — the-nexus
# GENOME.md — Timmy_Foundation/timmy-home

Generated by `pipelines/codebase_genome.py`.

## Project Overview

`the-nexus` is a hybrid repo that combines three layers in one codebase:
Timmy Foundation's home repository for development operations and configurations.

1. A browser-facing world shell rooted in `index.html`, `boot.js`, `bootstrap.mjs`, `app.js`, `style.css`, `portals.json`, `vision.json`, `manifest.json`, and `gofai_worker.js`
2. A Python realtime bridge centered on `server.py` plus harness code under `nexus/`
3. A memory / fleet / operator layer spanning `mempalace/`, `mcp_servers/`, `multi_user_bridge.py`, and supporting scripts

- Text files indexed: 3133
- Source and script files: 219
- Test files: 73
- Documentation files: 743

The repo is not a clean single-purpose frontend and not just a backend harness. It is a mixed world/runtime/ops repository where browser rendering, WebSocket telemetry, MCP-driven game harnesses, and fleet memory tooling coexist.

Grounded repo facts from this checkout:

- Browser shell files exist at repo root: `index.html`, `app.js`, `style.css`, `manifest.json`, `gofai_worker.js`
- Data/config files also live at repo root: `portals.json`, `vision.json`
- Realtime bridge exists in `server.py`
- Game harnesses exist in `nexus/morrowind_harness.py` and `nexus/bannerlord_harness.py`
- Memory/fleet sync exists in `mempalace/tunnel_sync.py`
- Desktop/game automation MCP servers exist in `mcp_servers/desktop_control_server.py` and `mcp_servers/steam_info_server.py`
- Validation exists in `tests/test_browser_smoke.py`, `tests/test_portals_json.py`, `tests/test_index_html_integrity.py`, and `tests/test_repo_truth.py`

The current architecture is best understood as a sovereign world shell plus operator/game-harness backend, with accumulated documentation drift from multiple restoration and migration efforts.
## Architecture Diagram
## Architecture

```mermaid
graph TD
    browser[Index HTML Shell\nindex.html -> boot.js -> bootstrap.mjs -> app.js]
    assets[Root Assets\nstyle.css\nmanifest.json\ngofai_worker.js]
    data[World Data\nportals.json\nvision.json]
    ws[Realtime Bridge\nserver.py\nWebSocket broadcast hub]
    gofai[In-browser GOFAI\nSymbolicEngine\nNeuroSymbolicBridge\nsetupGOFAI/updateGOFAI]
    harnesses[Python Harnesses\nnexus/morrowind_harness.py\nnexus/bannerlord_harness.py]
    mcp[MCP Adapters\nmcp_servers/desktop_control_server.py\nmcp_servers/steam_info_server.py]
    memory[Memory + Fleet\nmempalace/tunnel_sync.py\nmempalace.js]
    bridge[Operator / MUD Bridge\nmulti_user_bridge.py\ncommands/timmy_commands.py]
    tests[Verification\ntests/test_browser_smoke.py\ntests/test_portals_json.py\ntests/test_repo_truth.py]
    docs[Contracts + Drift Docs\nBROWSER_CONTRACT.md\nREADME.md\nCLAUDE.md\nINVESTIGATION_ISSUE_1145.md]

    browser --> assets
    browser --> data
    browser --> gofai
    browser --> ws
    harnesses --> mcp
    harnesses --> ws
    bridge --> ws
    memory --> ws
    tests --> browser
    tests --> data
    tests --> docs
    docs --> browser

    repo_root["repo"]
    angband["angband"]
    ansible["ansible"]
    briefings["briefings"]
    codebase_genome["codebase_genome"]
    config["config"]
    configs["configs"]
    conftest["conftest"]
    dns_records["dns-records"]
    evennia["evennia"]
    evennia_tools["evennia_tools"]

    repo_root --> angband
    repo_root --> ansible
    repo_root --> briefings
    repo_root --> codebase_genome
    repo_root --> config
    repo_root --> configs
```
## Entry Points and Data Flow
## Entry Points

### Primary entry points

- `codebase_genome.py` — python main guard (`python3 codebase_genome.py`)
- `gemini-fallback-setup.sh` — operational script (`bash gemini-fallback-setup.sh`)
- `morrowind/hud.sh` — operational script (`bash morrowind/hud.sh`)
- `pipelines/codebase_genome.py` — python main guard (`python3 pipelines/codebase_genome.py`)
- `scripts/agent_pr_gate.py` — operational script (`python3 scripts/agent_pr_gate.py`)
- `scripts/auto_restart_agent.sh` — operational script (`bash scripts/auto_restart_agent.sh`)
- `scripts/autonomous_issue_creator.py` — operational script (`python3 scripts/autonomous_issue_creator.py`)
- `scripts/backlog_cleanup.py` — operational script (`python3 scripts/backlog_cleanup.py`)
- `scripts/backlog_triage.py` — operational script (`python3 scripts/backlog_triage.py`)
- `scripts/backlog_triage_cron.sh` — operational script (`bash scripts/backlog_triage_cron.sh`)
- `scripts/backup_pipeline.sh` — operational script (`bash scripts/backup_pipeline.sh`)
- `scripts/bezalel_gemma4_vps.py` — operational script (`python3 scripts/bezalel_gemma4_vps.py`)

- `index.html` — root browser entry point
- `boot.js` — startup selector; `tests/boot.test.js` shows it chooses file-mode vs HTTP/module-mode and injects `bootstrap.mjs` when served over HTTP
- `bootstrap.mjs` — module bootstrap for the browser shell
- `app.js` — main browser runtime; owns world state, GOFAI wiring, metrics polling, and portal/UI logic
- `server.py` — WebSocket broadcast bridge on `ws://0.0.0.0:8765`
- `nexus/morrowind_harness.py` — GamePortal/MCP harness for OpenMW Morrowind
- `nexus/bannerlord_harness.py` — GamePortal/MCP harness for Bannerlord
- `mempalace/tunnel_sync.py` — pulls remote fleet closets into the local palace over HTTP
- `multi_user_bridge.py` — HTTP bridge for multi-user chat/session integration
- `mcp_servers/desktop_control_server.py` — stdio MCP server exposing screenshot/mouse/keyboard control
## Data Flow

### Data flow

1. Browser startup begins at `index.html`
2. `boot.js` decides whether the page is being served correctly; in HTTP mode it injects `bootstrap.mjs`
3. `bootstrap.mjs` hands off to `app.js`
4. `app.js` loads world configuration from `portals.json` and `vision.json`
5. `app.js` constructs the Three.js scene and the in-browser reasoning components, including `SymbolicEngine`, `NeuroSymbolicBridge`, `setupGOFAI()`, and `updateGOFAI()`
6. Browser state and external runtimes connect through `server.py`, which broadcasts messages between connected clients
7. Python harnesses (`nexus/morrowind_harness.py`, `nexus/bannerlord_harness.py`) spawn MCP subprocesses for desktop control / Steam metadata, capture state, execute actions, and feed telemetry into the Nexus bridge
8. Memory/fleet tools like `mempalace/tunnel_sync.py` import remote palace data into local closets, extending what the operator/runtime layers can inspect
9. Tests validate both the static browser contract and the higher-level repo-truth/memory contracts
### Important repo-specific runtime facts

- `portals.json` is a JSON array of portal/world/operator entries; examples in this checkout include `morrowind`, `bannerlord`, `workshop`, `archive`, `chapel`, and `courtyard`
- `server.py` is a plain broadcast hub: clients send messages, the server forwards them to the other connected clients
- `nexus/morrowind_harness.py` and `nexus/bannerlord_harness.py` both implement a GamePortal pattern with MCP subprocess clients over stdio and a WebSocket telemetry uplink
- `mempalace/tunnel_sync.py` is not speculative; it is a real client that discovers remote wings, searches remote rooms, and writes `.closet.json` payloads locally
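`tests/test_portals_json.py` presumably locks this array shape down. A minimal sketch of that kind of check, assuming a hypothetical `id` field; the actual entry schema is not shown in this genome:

```python
import json

# Hypothetical portals.json payload mirroring the portal names listed above.
# Field names other than the portal names themselves are illustrative.
portals_text = json.dumps([
    {"id": "morrowind"},
    {"id": "bannerlord"},
    {"id": "workshop"},
    {"id": "archive"},
    {"id": "chapel"},
    {"id": "courtyard"},
])

portals = json.loads(portals_text)
assert isinstance(portals, list), "portals.json must be a JSON array"
assert all(isinstance(entry, dict) for entry in portals)
ids = [entry["id"] for entry in portals]
assert len(ids) == len(set(ids)), "portal ids must be unique"
```

A real contract test would read the checked-in `portals.json` instead of an inline payload, but the shape assertions stay the same.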
1. Operators enter through `codebase_genome.py`, `gemini-fallback-setup.sh`, and `morrowind/hud.sh`.
2. Core logic fans into top-level components: `angband`, `ansible`, `briefings`, `codebase_genome`, `config`, `configs`.
3. Validation is incomplete around `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py`, `timmy-local/cache/agent_cache.py`, and `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py`, so changes there carry regression risk.
4. Final artifacts land as repository files, docs, or runtime side effects depending on the selected entry point.
## Key Abstractions

### Browser runtime

- `app.js`
  - Defines in-browser reasoning/state machinery, including `class SymbolicEngine`, `class NeuroSymbolicBridge`, `setupGOFAI()`, and `updateGOFAI()`
  - Couples rendering, local symbolic reasoning, metrics polling, and portal/UI logic in one very large root module
- `BROWSER_CONTRACT.md`
  - Acts like an executable architecture contract for the browser surface
  - Declares required files, DOM IDs, Three.js expectations, provenance rules, and WebSocket expectations

### Realtime bridge

- `server.py`
  - Single hub abstraction: a WebSocket broadcast server maintaining a `clients` set and forwarding messages from one client to the others
  - This is the seam between the browser shell, harnesses, and external telemetry producers
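The hub pattern described above is small enough to sketch. A minimal stand-in, with in-memory queues in place of WebSocket connections so the forward-to-others behavior is testable offline; the names here are illustrative, not the real module's:

```python
import asyncio

# Minimal sketch of a broadcast hub like the one server.py is described as:
# every message from one client is forwarded to all *other* connected clients.
clients = set()

async def handle_message(sender, message):
    """Forward `message` to every client except the sender; return recipient count."""
    recipients = [c for c in clients if c is not sender]
    for queue in recipients:
        await queue.put(message)
    return len(recipients)

async def demo():
    # Queues stand in for per-connection send channels.
    a, b, c = asyncio.Queue(), asyncio.Queue(), asyncio.Queue()
    clients.update({a, b, c})
    sent = await handle_message(a, "hello")
    assert sent == 2  # b and c receive; the sender does not
    return [b.get_nowait(), c.get_nowait()]

received = asyncio.run(demo())
```

The real server would wrap this loop around `websockets`-style connection handlers, but the fan-out rule is the whole abstraction.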
### GamePortal harness layer

- `nexus/morrowind_harness.py`
- `nexus/bannerlord_harness.py`
- Both define MCP client wrappers, `GameState`/`ActionResult`-style data classes, and an Observe-Decide-Act telemetry loop
- The harnesses are symmetric enough to be understood as reusable portal adapters with game-specific context injected on top

### Memory / fleet layer

- `mempalace/tunnel_sync.py`
  - Encodes the fleet-memory sync client contract: discover wings, pull broad room queries, write closet files, support dry-run
- `mempalace.js`
  - Minimal browser/Electron bridge to MemPalace commands via `window.electronAPI.execPython(...)`
  - Important because it shows a second memory integration surface distinct from the Python fleet sync path

### Operator / interaction bridge

- `multi_user_bridge.py`
- `commands/timmy_commands.py`
- These bridge user-facing conversations or MUD/Evennia interactions back into Timmy/Nexus services

- `codebase_genome.py` — classes `FunctionInfo`:19; functions `extract_functions()`:58, `generate_test()`:116, `scan_repo()`:191, `find_existing_tests()`:209, `main()`:231
- `evennia/timmy_world/game.py` — classes `World`:91, `ActionSystem`:421, `TimmyAI`:539, `NPCAI`:550; functions `get_narrative_phase()`:55, `get_phase_transition_event()`:65
- `evennia/timmy_world/world/game.py` — classes `World`:19, `ActionSystem`:326, `TimmyAI`:444, `NPCAI`:455; functions: none detected
- `timmy-world/game.py` — classes `World`:19, `ActionSystem`:349, `TimmyAI`:467, `NPCAI`:478; functions: none detected
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — classes: none detected; functions: none detected
- `uniwizard/self_grader.py` — classes `SessionGrade`:23, `WeeklyReport`:55, `SelfGrader`:74; functions `main()`:713
- `uni-wizard/v3/intelligence_engine.py` — classes `ExecutionPattern`:27, `ModelPerformance`:44, `AdaptationEvent`:58, `PatternDatabase`:69; functions: none detected
- `scripts/know_thy_father/crossref_audit.py` — classes `ThemeCategory`:30, `Principle`:160, `MeaningKernel`:169, `CrossRefFinding`:178; functions `extract_themes_from_text()`:192, `parse_soul_md()`:206, `parse_kernels()`:264, `cross_reference()`:296, `generate_report()`:440, `main()`:561
## API Surface

### Browser / static surface

- CLI: `python3 codebase_genome.py` — python main guard (`codebase_genome.py`)
- CLI: `bash gemini-fallback-setup.sh` — operational script (`gemini-fallback-setup.sh`)
- CLI: `bash morrowind/hud.sh` — operational script (`morrowind/hud.sh`)
- CLI: `python3 pipelines/codebase_genome.py` — python main guard (`pipelines/codebase_genome.py`)
- CLI: `python3 scripts/agent_pr_gate.py` — operational script (`scripts/agent_pr_gate.py`)
- CLI: `bash scripts/auto_restart_agent.sh` — operational script (`scripts/auto_restart_agent.sh`)
- CLI: `python3 scripts/autonomous_issue_creator.py` — operational script (`scripts/autonomous_issue_creator.py`)
- CLI: `python3 scripts/backlog_cleanup.py` — operational script (`scripts/backlog_cleanup.py`)
- Python: `extract_functions()` from `codebase_genome.py:58`
- Python: `generate_test()` from `codebase_genome.py:116`
- Python: `scan_repo()` from `codebase_genome.py:191`
- Python: `find_existing_tests()` from `codebase_genome.py:209`
- Python: `main()` from `codebase_genome.py:231`
- Python: `get_narrative_phase()` from `evennia/timmy_world/game.py:55`

- `index.html` served over HTTP
- `boot.js` exports `bootPage()`; verified by `node --test tests/boot.test.js`
- Data APIs are file-based inside the repo: `portals.json`, `vision.json`, `manifest.json`
## Test Coverage Report

### Network/runtime surface

- Source and script files inspected: 219
- Test files inspected: 73
- Coverage gaps:
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — no matching test reference detected
  - `timmy-local/cache/agent_cache.py` — no matching test reference detected
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py` — no matching test reference detected
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/godmode_race.py` — no matching test reference detected
  - `skills/productivity/google-workspace/scripts/google_api.py` — no matching test reference detected
  - `wizards/allegro/home/skills/productivity/google-workspace/scripts/google_api.py` — no matching test reference detected
  - `morrowind/pilot.py` — no matching test reference detected
  - `skills/research/domain-intel/scripts/domain_intel.py` — no matching test reference detected
  - `wizards/allegro/home/skills/research/domain-intel/scripts/domain_intel.py` — no matching test reference detected
  - `timmy-local/scripts/ingest.py` — no matching test reference detected
  - `uni-wizard/scripts/generate_scorecard.py` — no matching test reference detected
  - `morrowind/local_brain.py` — no matching test reference detected

- `python3 server.py`
  - Starts the WebSocket bridge on port `8765`
- `python3 l402_server.py`
  - Local HTTP microservice for cost-estimate style responses
- `python3 multi_user_bridge.py`
  - Multi-user HTTP/chat bridge
## Security Audit Findings

### Harness / operator CLI surfaces

- [medium] `briefings/briefing_20260325.json:37` — hardcoded HTTP endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `"gitea_error": "Gitea 404: {\"errors\":null,\"message\":\"not found\",\"url\":\"http://143.198.27.163:3000/api/swagger\"}\n [http://143.198.27.163:3000/api/v1/repos/Timmy_Foundation/sovereign-orchestration/issues?state=open&type=issues&sort=created&direction=desc&limit=1&page=1]",`
- [medium] `briefings/briefing_20260328.json:11` — hardcoded HTTP endpoint. Evidence: `"provider_base_url": "http://localhost:8081/v1",`
- [medium] `briefings/briefing_20260329.json:11` — hardcoded HTTP endpoint. Evidence: `"provider_base_url": "http://localhost:8081/v1",`
- [medium] `config.yaml:37` — hardcoded HTTP endpoint. Evidence: `summary_base_url: http://localhost:11434/v1`
- [medium] `config.yaml:47,52,57,62,67,77,82` — hardcoded HTTP endpoint, repeated once per model block. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:174` — hardcoded HTTP endpoint. Evidence: `base_url: http://localhost:11434/v1`

- `python3 nexus/morrowind_harness.py`
- `python3 nexus/bannerlord_harness.py`
- `python3 mempalace/tunnel_sync.py --peer <url> [--dry-run] [--n N]`
- `python3 mcp_servers/desktop_control_server.py`
- `python3 mcp_servers/steam_info_server.py`
## Dead Code Candidates

### Validation surface

- `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — not imported by indexed Python modules and not referenced by tests
- `timmy-local/cache/agent_cache.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/godmode_race.py` — not imported by indexed Python modules and not referenced by tests
- `skills/productivity/google-workspace/scripts/google_api.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/productivity/google-workspace/scripts/google_api.py` — not imported by indexed Python modules and not referenced by tests
- `morrowind/pilot.py` — not imported by indexed Python modules and not referenced by tests
- `skills/research/domain-intel/scripts/domain_intel.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/research/domain-intel/scripts/domain_intel.py` — not imported by indexed Python modules and not referenced by tests
- `timmy-local/scripts/ingest.py` — not imported by indexed Python modules and not referenced by tests

- `python3 -m pytest tests/test_portals_json.py tests/test_index_html_integrity.py tests/test_repo_truth.py -q`
- `node --test tests/boot.test.js`
- `python3 -m py_compile server.py nexus/morrowind_harness.py nexus/bannerlord_harness.py mempalace/tunnel_sync.py mcp_servers/desktop_control_server.py`
- `tests/test_browser_smoke.py` defines the higher-cost Playwright smoke contract for the world shell
## Performance Bottleneck Analysis

## Test Coverage Gaps

Strongly covered in this checkout:

- `tests/test_portals_json.py` validates `portals.json`
- `tests/test_index_html_integrity.py` checks for merge-marker/DOM-integrity regressions in `index.html`
- `tests/boot.test.js` verifies `boot.js` startup behavior
- `tests/test_repo_truth.py` validates the repo-truth documents
- Multiple `tests/test_mempalace_*.py` files cover the palace layer
- `tests/test_bannerlord_harness.py` exists for the Bannerlord harness

Notable gaps or weak seams:

- `nexus/morrowind_harness.py` is large and operationally critical, but the generated baseline still flags it as a gap relative to its size/complexity
- `mcp_servers/desktop_control_server.py` exposes high-power automation but has no obvious dedicated test file in the root `tests/` suite
- `app.js` is the dominant browser runtime file and mixes rendering, GOFAI, metrics, and integration logic in one place; browser smoke coverage exists, but there is limited unit-level decomposition around those subsystems
- `mempalace.js` appears minimally bridged and stale relative to the richer Python MemPalace layer
- `multi_user_bridge.py` is a large integration surface and should be treated as high regression risk precisely because it is central to operator/chat flow
## Security Considerations

- `server.py` binds `HOST = "0.0.0.0"`, exposing the broadcast bridge beyond localhost unless network controls limit it
- The WebSocket bridge is a broadcast hub without visible authentication in `server.py`; connected clients are trusted to send messages onto the bus
- `mcp_servers/desktop_control_server.py` exposes mouse/keyboard/screenshot control through a stdio MCP server. In any non-local or poorly isolated runtime, this is a privileged automation surface
- `app.js` contains hardcoded local/network endpoints such as `http://localhost:${L402_PORT}/api/cost-estimate` and `http://localhost:8082/metrics`; these are convenient for local development but create environment drift and deployment assumptions
- `app.js` also embeds explicit endpoint/status references like `ws://143.198.27.163:8765`, which is operationally brittle and exactly the kind of hardcoded location data that drifts across environments
- `mempalace.js` shells out through `window.electronAPI.execPython(...)`; this is powerful and useful, but it is a clear trust boundary between UI and host execution
- `INVESTIGATION_ISSUE_1145.md` documents an earlier integrity hazard: agents writing to `public/nexus/` instead of canonical root paths. That path confusion is both an operational and a security concern because it makes provenance harder to reason about
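The unauthenticated broadcast hub noted above admits one cheap mitigation worth sketching. Nothing like this exists in `server.py` as described; the token handshake below is purely an assumption: require a shared secret in each client's first frame before adding it to the broadcast set.

```python
import hmac

# Hypothetical shared secret; in practice this would come from an env var,
# not a literal. Purely an illustration of the admission check.
EXPECTED_TOKEN = "change-me"

def admit(first_message: str) -> bool:
    """Admit a client only if its first frame carries the shared token."""
    # compare_digest is constant-time, avoiding a timing side channel
    return hmac.compare_digest(first_message.strip(), EXPECTED_TOKEN)

accepted = admit("change-me\n")
refused = admit("guess")
```

A server would call this on the first received frame and close the connection on failure, keeping the rest of the broadcast loop unchanged.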
## Runtime Truth and Docs Drift

The most important architecture finding in this repo is not a class or subsystem. It is a truth mismatch.

- `README.md` says current `main` does not ship a browser 3D world
- `CLAUDE.md` declares root `app.js` and `index.html` as canonical frontend paths
- Tests and the browser contract now assume the root frontend exists

All three statements are simultaneously present in this checkout.

Grounded evidence:

- `README.md` still says the repo does not contain an active root frontend such as `index.html`, `app.js`, or `style.css`
- The current checkout does contain `index.html`, `app.js`, `style.css`, `manifest.json`, and `gofai_worker.js`
- `BROWSER_CONTRACT.md` explicitly treats those root files as required browser assets
- `tests/test_browser_smoke.py` serves those exact files and validates DOM/WebGL contracts against them
- `tests/test_index_html_integrity.py` assumes `index.html` is canonical and production-relevant
- `CLAUDE.md` says frontend code lives at repo root and explicitly warns against `public/nexus/`
- `INVESTIGATION_ISSUE_1145.md` explains why `public/nexus/` is a bad/corrupt duplicate path and confirms the real classical AI code lives in root `app.js`

The honest conclusion:

- The repo contains a partially restored or actively re-materialized browser surface
- The docs preserve an older migration truth while the runtime files and smoke contracts describe a newer, present-tense truth
- Any future work in `the-nexus` must choose one truth and align `README.md`, `CLAUDE.md`, the smoke tests, and the file layout around it

That drift is itself a critical architectural fact and should be treated as first-order design debt, not a side note.

- `angband/mcp_server.py` — large module (353 lines) that likely hides multiple responsibilities
- `evennia/timmy_world/game.py` — large module (1541 lines) that likely hides multiple responsibilities
- `evennia/timmy_world/world/game.py` — large module (1345 lines) that likely hides multiple responsibilities
- `morrowind/mcp_server.py` — large module (451 lines) that likely hides multiple responsibilities
- `morrowind/pilot.py` — large module (459 lines) that likely hides multiple responsibilities
- `pipelines/codebase_genome.py` — large module (557 lines) that likely hides multiple responsibilities
- `scripts/know_thy_father/crossref_audit.py` — large module (657 lines) that likely hides multiple responsibilities
- `scripts/know_thy_father/index_media.py` — large module (405 lines) that likely hides multiple responsibilities
- `scripts/know_thy_father/synthesize_kernels.py` — large module (416 lines) that likely hides multiple responsibilities
- `scripts/predictive_resource_allocator.py` — large module (410 lines) that likely hides multiple responsibilities
@@ -8,6 +8,7 @@ This pipeline gives Timmy a repeatable way to generate a deterministic `GENOME.md`

- `pipelines/codebase_genome.py` — static analyzer that writes `GENOME.md`
- `pipelines/codebase-genome.py` — thin CLI wrapper matching the expected pipeline-style entrypoint
- `templates/GENOME-template.md` — reusable review scaffold with the exact sections the generator emits
- `scripts/codebase_genome_nightly.py` — org-aware nightly runner that selects the next repo, updates a local checkout, and writes the genome artifact
- `GENOME.md` — generated analysis for `timmy-home` itself

@@ -40,6 +41,14 @@ The hyphenated wrapper also works:

```
python3 pipelines/codebase-genome.py --repo-root /path/to/repo --repo Timmy_Foundation/some-repo
```

If an agent or human wants to review or hand-edit the artifact before publishing it, start from:

```text
templates/GENOME-template.md
```

The template uses the same section names as the generator output, so issue-specific verification can lock the structure without depending on one repo's exact contents.

## Nightly org rotation

Dry-run the next selection:
@@ -1,14 +0,0 @@

```yaml
fleet_name: timmy-phase-3-config

targets:
  - host: ezra
    config_root: /root/wizards/ezra/home/.hermes
    files:
      - config.yaml
      - dispatch/rules.json

  - host: bezalel
    config_root: /root/wizards/bezalel/home/.hermes
    files:
      - config.yaml
      - dispatch/rules.json
```
@@ -1,292 +0,0 @@

```python
#!/usr/bin/env python3
"""Plan atomic fleet config sync releases.

Refs: timmy-home #550

Phase-3 orchestration slice:
- define a shared config-sync manifest for fleet hosts
- fingerprint the exact config payload into one release id
- generate per-host staging paths and atomic symlink-swap promotion metadata
- stay dry-run by default so rollout planning is safe to verify locally
"""

from __future__ import annotations

import argparse
import hashlib
import json
from pathlib import Path
from typing import Any

import yaml


DEFAULT_INVENTORY_FILE = Path(__file__).resolve().parents[1] / "ansible" / "inventory" / "hosts.ini"


def load_inventory_hosts(path: str | Path) -> dict[str, dict[str, str]]:
    hosts: dict[str, dict[str, str]] = {}
    section = None
    for raw_line in Path(path).read_text(encoding="utf-8").splitlines():
        line = raw_line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1].strip().lower()
            continue
        if section != "fleet":
            continue

        parts = line.split()
        host = parts[0]
        metadata = {"host": host}
        for token in parts[1:]:
            if "=" not in token:
                continue
            key, value = token.split("=", 1)
            metadata[key] = value
        hosts[host] = metadata

    if not hosts:
        raise ValueError("inventory defines no [fleet] hosts")
    return hosts
```
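Assuming a conventional `ansible`-style `hosts.ini`, the parser above behaves like this. The demo carries a condensed standalone copy of the same parsing rules so it runs without the original module; the host names and `ansible_*` keys are illustrative:

```python
import tempfile
from pathlib import Path

# Condensed standalone copy of the parsing rules shown above: only the
# [fleet] section is read, and key=value tokens become host metadata.
def load_inventory_hosts(path):
    hosts, section = {}, None
    for raw in Path(path).read_text(encoding="utf-8").splitlines():
        line = raw.strip()
        if not line or line.startswith(("#", ";")):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1].strip().lower()
            continue
        if section != "fleet":
            continue
        parts = line.split()
        meta = {"host": parts[0]}
        meta.update(t.split("=", 1) for t in parts[1:] if "=" in t)
        hosts[parts[0]] = meta
    if not hosts:
        raise ValueError("inventory defines no [fleet] hosts")
    return hosts

# Illustrative inventory: the [other] section must be ignored entirely.
sample = """
[other]
ignored ansible_host=10.0.0.9

[fleet]
ezra ansible_host=10.0.0.5
bezalel ansible_host=10.0.0.6 ansible_user=root
"""
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write(sample)
hosts = load_inventory_hosts(f.name)
```

Note the design choice: unknown tokens without `=` are silently skipped rather than rejected, so stray inventory annotations cannot break fleet parsing.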
```python
def load_manifest(path: str | Path) -> dict[str, Any]:
    data = yaml.safe_load(Path(path).read_text(encoding="utf-8")) or {}
    if not isinstance(data, dict):
        raise ValueError("manifest must contain a YAML object")
    data.setdefault("fleet_name", "timmy-fleet-config")
    data.setdefault("targets", [])
    if not isinstance(data["targets"], list):
        raise ValueError("targets must be a list")
    return data


def _normalize_relative_path(value: str) -> str:
    path = Path(value)
    if path.is_absolute():
        raise ValueError(f"sync file path must be relative: {value}")
    if any(part == ".." for part in path.parts):
        raise ValueError(f"sync file path may not escape source root: {value}")
    normalized = path.as_posix()
    if normalized in {"", "."}:
        raise ValueError("sync file path may not be empty")
    return normalized
```
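`_normalize_relative_path` is the path-traversal guard for the sync manifest. A standalone copy of it, exercised against the inputs it accepts and rejects:

```python
from pathlib import Path

# Standalone copy of _normalize_relative_path as shown above.
def _normalize_relative_path(value: str) -> str:
    path = Path(value)
    if path.is_absolute():
        raise ValueError(f"sync file path must be relative: {value}")
    if any(part == ".." for part in path.parts):
        raise ValueError(f"sync file path may not escape source root: {value}")
    normalized = path.as_posix()
    if normalized in {"", "."}:
        raise ValueError("sync file path may not be empty")
    return normalized

# Accepted: a plain relative path, returned in POSIX form.
ok = _normalize_relative_path("dispatch/rules.json")

# Rejected: absolute paths, parent-dir escapes, and the bare current dir.
rejected = []
for bad in ("/etc/passwd", "../outside.yaml", "."):
    try:
        _normalize_relative_path(bad)
    except ValueError:
        rejected.append(bad)
```

The `..` check runs on `path.parts`, so an escape buried mid-path (e.g. `a/../../b`) is caught just like a leading one.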
|
||||
|
||||
|
||||
|
||||
def validate_manifest(manifest: dict[str, Any], inventory_hosts: dict[str, dict[str, str]]) -> None:
|
||||
targets = manifest.get("targets", [])
|
||||
if not targets:
|
||||
raise ValueError("manifest must define at least one sync target")
|
||||
|
||||
seen_hosts: set[str] = set()
|
||||
for target in targets:
|
||||
if not isinstance(target, dict):
|
||||
raise ValueError("each target must be a mapping")
|
||||
|
||||
host = str(target.get("host", "")).strip()
|
||||
if not host:
|
||||
raise ValueError("each target must declare a host")
|
||||
if host in seen_hosts:
|
||||
raise ValueError(f"duplicate target host: {host}")
|
||||
if host not in inventory_hosts:
|
||||
raise ValueError(f"unknown inventory host: {host}")
|
||||
seen_hosts.add(host)
|
||||
|
||||
config_root = str(target.get("config_root", "")).strip()
|
||||
if not config_root:
|
||||
raise ValueError(f"target {host} missing config_root")
|
||||
|
||||
files = target.get("files")
|
||||
if not isinstance(files, list) or not files:
|
||||
raise ValueError(f"target {host} must declare at least one file")
|
||||
|
||||
normalized: list[str] = []
|
||||
for entry in files:
|
||||
normalized.append(_normalize_relative_path(str(entry)))
|
||||
if len(set(normalized)) != len(normalized):
|
||||
raise ValueError(f"target {host} declares duplicate file paths")
|
||||
|
||||
|
||||
|
||||
def _hash_file(path: Path) -> str:
|
||||
return hashlib.sha256(path.read_bytes()).hexdigest()
|
||||
|
||||
|
||||
|
def _collect_target_files(source_root: Path, rel_paths: list[str]) -> list[dict[str, Any]]:
    items: list[dict[str, Any]] = []
    for rel_path in sorted(_normalize_relative_path(path) for path in rel_paths):
        source_path = source_root / rel_path
        if not source_path.exists():
            raise FileNotFoundError(f"missing source file: {rel_path}")
        if not source_path.is_file():
            raise ValueError(f"sync source must be a file: {rel_path}")
        items.append(
            {
                "relative_path": rel_path,
                "source": str(source_path),
                "sha256": _hash_file(source_path),
                "size": source_path.stat().st_size,
            }
        )
    return items

def compute_release_id(target_payloads: list[dict[str, Any]]) -> str:
    digest = hashlib.sha256()
    for target in sorted(target_payloads, key=lambda item: item["host"]):
        digest.update(target["host"].encode("utf-8"))
        digest.update(b"\0")
        for file_item in sorted(target["files"], key=lambda item: item["relative_path"]):
            digest.update(file_item["relative_path"].encode("utf-8"))
            digest.update(b"\0")
            digest.update(file_item["sha256"].encode("utf-8"))
            digest.update(b"\0")
            digest.update(str(file_item["size"]).encode("utf-8"))
            digest.update(b"\0")
    return digest.hexdigest()[:12]

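The release id is order-insensitive by construction: both the targets and each target's file list are sorted before hashing, so the same content always produces the same 12-character id regardless of manifest ordering. A self-contained sketch of that property (the hash loop restated for illustration):

```python
import hashlib

def release_id(targets):
    # Same scheme as compute_release_id above: host, then each file's path,
    # sha256, and size, NUL-separated, truncated to 12 hex characters.
    digest = hashlib.sha256()
    for target in sorted(targets, key=lambda item: item["host"]):
        digest.update(target["host"].encode("utf-8") + b"\0")
        for f in sorted(target["files"], key=lambda item: item["relative_path"]):
            digest.update(f["relative_path"].encode("utf-8") + b"\0")
            digest.update(f["sha256"].encode("utf-8") + b"\0")
            digest.update(str(f["size"]).encode("utf-8") + b"\0")
    return digest.hexdigest()[:12]

files = [
    {"relative_path": "a.yaml", "sha256": "00" * 32, "size": 10},
    {"relative_path": "b.yaml", "sha256": "11" * 32, "size": 20},
]
one = release_id([{"host": "ezra", "files": files}])
two = release_id([{"host": "ezra", "files": list(reversed(files))}])
print(one == two, len(one))  # reordering files does not change the id
```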
def build_rollout_plan(
    manifest: dict[str, Any],
    inventory_hosts: dict[str, dict[str, str]],
    *,
    source_root: str | Path,
) -> dict[str, Any]:
    validate_manifest(manifest, inventory_hosts)

    source_root = Path(source_root)
    if not source_root.exists():
        raise FileNotFoundError(f"source root not found: {source_root}")
    if not source_root.is_dir():
        raise ValueError(f"source root must be a directory: {source_root}")

    staged_targets: list[dict[str, Any]] = []
    for target in sorted(manifest["targets"], key=lambda item: item["host"]):
        host = target["host"]
        files = _collect_target_files(source_root, target["files"])
        staged_targets.append(
            {
                "host": host,
                "inventory": inventory_hosts[host],
                "config_root": str(target["config_root"]),
                "files": files,
            }
        )

    release_id = compute_release_id(staged_targets)
    total_bytes = 0
    file_count = 0
    rendered_targets: list[dict[str, Any]] = []
    for target in staged_targets:
        config_root = target["config_root"].rstrip("/")
        stage_root = f"{config_root}/.releases/{release_id}"
        live_symlink = f"{config_root}/current"
        previous_symlink = f"{config_root}/previous"
        file_count += len(target["files"])
        total_bytes += sum(item["size"] for item in target["files"])
        rendered_targets.append(
            {
                "host": target["host"],
                "ansible_host": target["inventory"].get("ansible_host", ""),
                "ansible_user": target["inventory"].get("ansible_user", ""),
                "config_root": config_root,
                "stage_root": stage_root,
                "live_symlink": live_symlink,
                "previous_symlink": previous_symlink,
                "files": target["files"],
                "promote": {
                    "mode": "symlink_swap",
                    "release_id": release_id,
                    "from": stage_root,
                    "to": live_symlink,
                    "backup_link": previous_symlink,
                },
            }
        )

    return {
        "fleet_name": manifest.get("fleet_name", "timmy-fleet-config"),
        "source_root": str(source_root),
        "release_id": release_id,
        "target_count": len(rendered_targets),
        "file_count": file_count,
        "total_bytes": total_bytes,
        "targets": rendered_targets,
    }

def render_markdown(plan: dict[str, Any]) -> str:
    lines = [
        "# Fleet Config Sync Plan",
        "",
        f"Fleet: {plan['fleet_name']}",
        f"Release ID: `{plan['release_id']}`",
        f"Source root: `{plan['source_root']}`",
        f"Target count: {plan['target_count']}",
        f"File count: {plan['file_count']}",
        f"Total bytes: {plan['total_bytes']}",
        "",
        "Atomic promote via symlink swap keeps every host on one named release boundary.",
        "",
        "| Host | Address | Stage root | Live symlink | Files |",
        "|---|---|---|---|---:|",
    ]

    for target in plan["targets"]:
        lines.append(
            f"| {target['host']} | {target['ansible_host'] or 'n/a'} | `{target['stage_root']}` | `{target['live_symlink']}` | {len(target['files'])} |"
        )

    lines.extend(["", "## Target file manifests", ""])
    for target in plan["targets"]:
        lines.extend(
            [
                f"### {target['host']}",
                "",
                f"- Promote: `{target['promote']['from']}` -> `{target['promote']['to']}`",
                f"- Backup link: `{target['promote']['backup_link']}`",
                "",
                "| Relative path | Bytes | SHA256 |",
                "|---|---:|---|",
            ]
        )
        for file_item in target["files"]:
            lines.append(
                f"| `{file_item['relative_path']}` | {file_item['size']} | `{file_item['sha256'][:16]}…` |"
            )
        lines.append("")

    return "\n".join(lines).rstrip() + "\n"

def main() -> int:
    parser = argparse.ArgumentParser(description="Plan a dry-run atomic config sync release across fleet hosts")
    parser.add_argument("manifest", help="Path to fleet config sync manifest YAML")
    parser.add_argument("--inventory", default=str(DEFAULT_INVENTORY_FILE), help="Path to Ansible fleet inventory")
    parser.add_argument("--source-root", default=".", help="Local source root containing files listed in the manifest")
    parser.add_argument("--markdown", action="store_true", help="Render markdown instead of JSON")
    args = parser.parse_args()

    inventory = load_inventory_hosts(args.inventory)
    manifest = load_manifest(args.manifest)
    plan = build_rollout_plan(manifest, inventory, source_root=args.source_root)

    if args.markdown:
        print(render_markdown(plan))
    else:
        print(json.dumps(plan, indent=2))
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
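The plan only describes the promote step (`"mode": "symlink_swap"`); this script is a dry run and never touches the hosts. A minimal sketch of what a deploy-side helper could do with that `promote` block, assuming POSIX atomic-rename semantics (the helper name and shape are hypothetical, not part of this diff):

```python
import os
from pathlib import Path

def promote_release(stage_root: str, live_symlink: str, backup_link: str) -> None:
    """Point `live_symlink` at `stage_root` atomically, keeping the old target."""
    live = Path(live_symlink)
    if live.is_symlink():
        # Preserve the outgoing release behind the backup link first.
        old_target = os.readlink(live)
        tmp_prev = live.parent / f".{Path(backup_link).name}.swap"
        tmp_prev.symlink_to(old_target)
        os.replace(tmp_prev, backup_link)
    # Build the new link beside the live one, then swap it in with a single
    # rename, so readers never observe a missing `current` link.
    tmp_live = live.parent / f".{live.name}.swap"
    tmp_live.symlink_to(stage_root)
    os.replace(tmp_live, live)
```

Calling this twice with two staged releases leaves `current` on the newer release and `previous` on the older one, matching the `from`/`to`/`backup_link` fields the plan emits.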
67
templates/GENOME-template.md
Normal file
@@ -0,0 +1,67 @@
# GENOME.md — [org/repo]

Generated by `pipelines/codebase_genome.py` or used as a manual review scaffold when a human is curating the final artifact.

## Project Overview

[One paragraph: what the repo does, why it exists, and what outcome it creates.]

- Text files indexed: [count]
- Source and script files: [count]
- Test files: [count]
- Documentation files: [count]

## Architecture

```mermaid
graph TD
    repo_root["repo"] --> component_a["component-a"]
    repo_root --> component_b["component-b"]
    component_a --> component_b
```

## Entry Points

- `[path/to/entrypoint]` — [why it matters] (`python3 path/to/entrypoint.py`)
- `[path/to/other-entrypoint]` — [why it matters] (`bash path/to/script.sh`)

## Data Flow

1. [How operators or callers enter the system.]
2. [Which modules or directories fan out from the entrypoint.]
3. [Where validation or test gaps create risk.]
4. [What artifact, state change, or runtime side effect is produced.]

## Key Abstractions

- `[module.py]` — classes `[ClassName]:line`; functions `[function_name()]:line`
- `[another_module.py]` — classes `[AnotherClass]:line`; functions `[run()]:line`

## API Surface

- CLI: `python3 [entrypoint] --help` — [what it exposes]
- Python: `[public_function]()` from `[module.py:line]`
- HTTP/WebSocket/other: `[surface]` — [contract summary]

## Test Coverage Report

- Source and script files inspected: [count]
- Test files inspected: [count]
- Coverage gaps:
  - `[path/to/file]` — [missing coverage detail]
  - `[path/to/other]` — [missing coverage detail]

## Security Audit Findings

- `[severity]` `[path:line]` — [risk category]: [detail]. Evidence: `[snippet]`
- `[severity]` `[path:line]` — [risk category]: [detail]. Evidence: `[snippet]`

## Dead Code Candidates

- `[path/to/file]` — [why it appears unreferenced]
- `[path/to/other]` — [why it appears unreferenced]

## Performance Bottleneck Analysis

- `[path/to/file]` — [why runtime or scale could degrade here]
- `[path/to/other]` — [filesystem scan / network / large module / hot path detail]
@@ -1,122 +0,0 @@
from __future__ import annotations

import importlib.util
from pathlib import Path


ROOT = Path(__file__).resolve().parents[1]
SCRIPT_PATH = ROOT / "scripts" / "fleet_config_sync.py"
HOSTS_FILE = ROOT / "ansible" / "inventory" / "hosts.ini"
EXAMPLE_MANIFEST = ROOT / "docs" / "fleet-config-sync.example.yaml"


def _load_module(path: Path, name: str):
    assert path.exists(), f"missing {path.relative_to(ROOT)}"
    spec = importlib.util.spec_from_file_location(name, path)
    assert spec and spec.loader
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


def _write_source_tree(tmp_path: Path) -> Path:
    source_root = tmp_path / "source"
    (source_root / "dispatch").mkdir(parents=True)
    (source_root / "config.yaml").write_text("model: local\nroute: hybrid\n", encoding="utf-8")
    (source_root / "dispatch" / "rules.json").write_text('{"lane":"allegro"}\n', encoding="utf-8")
    return source_root


def test_example_manifest_targets_known_fleet_hosts() -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    assert EXAMPLE_MANIFEST.exists(), "missing docs/fleet-config-sync.example.yaml"

    inventory = mod.load_inventory_hosts(HOSTS_FILE)
    manifest = mod.load_manifest(EXAMPLE_MANIFEST)
    mod.validate_manifest(manifest, inventory)

    assert [target["host"] for target in manifest["targets"]] == ["ezra", "bezalel"]


def test_build_rollout_plan_stages_one_release_for_all_hosts(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    inventory = {
        "ezra": {"host": "ezra", "ansible_host": "143.198.27.163"},
        "bezalel": {"host": "bezalel", "ansible_host": "67.205.155.108"},
    }
    manifest = {
        "fleet_name": "phase-3-config-sync",
        "targets": [
            {
                "host": "ezra",
                "config_root": "/root/wizards/ezra/home/.hermes",
                "files": ["config.yaml", "dispatch/rules.json"],
            },
            {
                "host": "bezalel",
                "config_root": "/root/wizards/bezalel/home/.hermes",
                "files": ["config.yaml", "dispatch/rules.json"],
            },
        ],
    }

    plan = mod.build_rollout_plan(manifest, inventory, source_root=source_root)

    assert plan["fleet_name"] == "phase-3-config-sync"
    assert len(plan["release_id"]) == 12
    assert plan["target_count"] == 2
    assert plan["file_count"] == 4
    assert plan["total_bytes"] > 0

    for target in plan["targets"]:
        assert target["stage_root"].endswith(f"/.releases/{plan['release_id']}")
        assert target["live_symlink"].endswith("/current")
        assert target["promote"]["release_id"] == plan["release_id"]
        assert {item["relative_path"] for item in target["files"]} == {"config.yaml", "dispatch/rules.json"}


def test_validate_manifest_rejects_unknown_inventory_host(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    manifest = {
        "targets": [
            {
                "host": "unknown-wizard",
                "config_root": "/srv/wizard/config",
                "files": ["config.yaml"],
            }
        ]
    }

    try:
        mod.build_rollout_plan(manifest, {"ezra": {"host": "ezra"}}, source_root=source_root)
    except ValueError as exc:
        assert "unknown inventory host" in str(exc)
        assert "unknown-wizard" in str(exc)
    else:
        raise AssertionError("build_rollout_plan should reject unknown inventory hosts")


def test_render_markdown_mentions_atomic_promote_and_targets(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    manifest = {
        "fleet_name": "phase-3-config-sync",
        "targets": [
            {
                "host": "ezra",
                "config_root": "/root/wizards/ezra/home/.hermes",
                "files": ["config.yaml"],
            }
        ],
    }
    inventory = {"ezra": {"host": "ezra", "ansible_host": "143.198.27.163"}}

    plan = mod.build_rollout_plan(manifest, inventory, source_root=source_root)
    report = mod.render_markdown(plan)

    assert plan["release_id"] in report
    assert "Atomic promote via symlink swap" in report
    assert "ezra" in report
    assert "/root/wizards/ezra/home/.hermes/current" in report
37
tests/test_issue_666_genome_template.py
Normal file
@@ -0,0 +1,37 @@
from __future__ import annotations

from pathlib import Path


ROOT = Path(__file__).resolve().parents[1]
TEMPLATE_PATH = ROOT / "templates" / "GENOME-template.md"
DOC_PATH = ROOT / "docs" / "CODEBASE_GENOME_PIPELINE.md"


REQUIRED_HEADINGS = (
    "# GENOME.md — [org/repo]",
    "## Project Overview",
    "## Architecture",
    "## Entry Points",
    "## Data Flow",
    "## Key Abstractions",
    "## API Surface",
    "## Test Coverage Report",
    "## Security Audit Findings",
    "## Dead Code Candidates",
    "## Performance Bottleneck Analysis",
)


def test_issue_666_template_exists_and_covers_required_sections() -> None:
    assert TEMPLATE_PATH.exists(), "missing templates/GENOME-template.md"
    text = TEMPLATE_PATH.read_text(encoding="utf-8")
    for heading in REQUIRED_HEADINGS:
        assert heading in text


def test_issue_666_docs_reference_template_and_single_repo_entrypoint() -> None:
    text = DOC_PATH.read_text(encoding="utf-8")
    assert "templates/GENOME-template.md" in text
    # Single canonical entrypoint: the underscore spelling must appear, and
    # the hyphenated misspelling must not.
    assert "python3 pipelines/codebase_genome.py" in text
    assert "python3 pipelines/codebase-genome.py" not in text