Compare commits: fix/665...step35/459 (1 commit, `682d39ee15`)

GENOME.md (299 lines changed)
@@ -1,144 +1,209 @@
# GENOME.md — Timmy_Foundation/timmy-home
Generated by `pipelines/codebase_genome.py`.

# GENOME.md — the-nexus

## Project Overview

Timmy Foundation's home repository for development operations and configurations.

`the-nexus` is a hybrid repo that combines three layers in one codebase:

- Text files indexed: 3181
- Source and script files: 231
- Test files: 95
- Documentation files: 755
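The counts above come from a repo-wide scan. A minimal sketch of how such an indexer might tally categories (the extension sets and test heuristic are assumptions for illustration, not the actual `codebase_genome.py` rules):

```python
from pathlib import Path

# Hypothetical category heuristics; the real genome pipeline's rules may differ.
SOURCE_EXT = {".py", ".js", ".mjs", ".sh"}
DOC_EXT = {".md", ".rst", ".txt"}

def tally(repo_root: str) -> dict[str, int]:
    """Count source, test, and doc files under a repo root."""
    counts = {"source": 0, "test": 0, "doc": 0}
    for path in Path(repo_root).rglob("*"):
        if not path.is_file() or ".git" in path.parts:
            continue
        if path.suffix in DOC_EXT:
            counts["doc"] += 1
        elif path.suffix in SOURCE_EXT:
            # Files under tests/ or named test_* count as tests
            if "tests" in path.parts or path.name.startswith("test_"):
                counts["test"] += 1
            else:
                counts["source"] += 1
    return counts
```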
1. A browser-facing world shell rooted in `index.html`, `boot.js`, `bootstrap.mjs`, `app.js`, `style.css`, `portals.json`, `vision.json`, `manifest.json`, and `gofai_worker.js`
2. A Python realtime bridge centered on `server.py` plus harness code under `nexus/`
3. A memory / fleet / operator layer spanning `mempalace/`, `mcp_servers/`, `multi_user_bridge.py`, and supporting scripts

## Architecture

The repo is not a clean single-purpose frontend and not just a backend harness. It is a mixed world/runtime/ops repository where browser rendering, WebSocket telemetry, MCP-driven game harnesses, and fleet memory tooling coexist.

Grounded repo facts from this checkout:

- Browser shell files exist at repo root: `index.html`, `app.js`, `style.css`, `manifest.json`, `gofai_worker.js`
- Data/config files also live at repo root: `portals.json`, `vision.json`
- Realtime bridge exists in `server.py`
- Game harnesses exist in `nexus/morrowind_harness.py` and `nexus/bannerlord_harness.py`
- Memory/fleet sync exists in `mempalace/tunnel_sync.py`
- Desktop/game automation MCP servers exist in `mcp_servers/desktop_control_server.py` and `mcp_servers/steam_info_server.py`
- Validation exists in `tests/test_browser_smoke.py`, `tests/test_portals_json.py`, `tests/test_index_html_integrity.py`, and `tests/test_repo_truth.py`

The current architecture is best understood as a sovereign world shell plus an operator/game-harness backend, with accumulated documentation drift from multiple restoration and migration efforts.
## Architecture Diagram

```mermaid
graph TD
    repo_root["repo"]
    angband["angband"]
    ansible["ansible"]
    briefings["briefings"]
    codebase_genome["codebase_genome"]
    config["config"]
    configs["configs"]
    conftest["conftest"]
    dns_records["dns-records"]
    evennia["evennia"]
    evennia_tools["evennia_tools"]
    repo_root --> angband
    repo_root --> ansible
    repo_root --> briefings
    repo_root --> codebase_genome
    repo_root --> config
    repo_root --> configs
    browser[Index HTML Shell\nindex.html -> boot.js -> bootstrap.mjs -> app.js]
    assets[Root Assets\nstyle.css\nmanifest.json\ngofai_worker.js]
    data[World Data\nportals.json\nvision.json]
    ws[Realtime Bridge\nserver.py\nWebSocket broadcast hub]
    gofai[In-browser GOFAI\nSymbolicEngine\nNeuroSymbolicBridge\nsetupGOFAI/updateGOFAI]
    harnesses[Python Harnesses\nnexus/morrowind_harness.py\nnexus/bannerlord_harness.py]
    mcp[MCP Adapters\nmcp_servers/desktop_control_server.py\nmcp_servers/steam_info_server.py]
    memory[Memory + Fleet\nmempalace/tunnel_sync.py\nmempalace.js]
    bridge[Operator / MUD Bridge\nmulti_user_bridge.py\ncommands/timmy_commands.py]
    tests[Verification\ntests/test_browser_smoke.py\ntests/test_portals_json.py\ntests/test_repo_truth.py]
    docs[Contracts + Drift Docs\nBROWSER_CONTRACT.md\nREADME.md\nCLAUDE.md\nINVESTIGATION_ISSUE_1145.md]

    browser --> assets
    browser --> data
    browser --> gofai
    browser --> ws
    harnesses --> mcp
    harnesses --> ws
    bridge --> ws
    memory --> ws
    tests --> browser
    tests --> data
    tests --> docs
    docs --> browser
```
## Entry Points

## Entry Points and Data Flow

- `codebase_genome.py` — python main guard (`python3 codebase_genome.py`)
- `gemini-fallback-setup.sh` — operational script (`bash gemini-fallback-setup.sh`)
- `morrowind/hud.sh` — operational script (`bash morrowind/hud.sh`)
- `pipelines/codebase_genome.py` — python main guard (`python3 pipelines/codebase_genome.py`)
- `scripts/agent_pr_gate.py` — operational script (`python3 scripts/agent_pr_gate.py`)
- `scripts/audit_trail.py` — operational script (`python3 scripts/audit_trail.py`)
- `scripts/auto_restart_agent.sh` — operational script (`bash scripts/auto_restart_agent.sh`)
- `scripts/autonomous_issue_creator.py` — operational script (`python3 scripts/autonomous_issue_creator.py`)
- `scripts/backlog_cleanup.py` — operational script (`python3 scripts/backlog_cleanup.py`)
- `scripts/backlog_triage.py` — operational script (`python3 scripts/backlog_triage.py`)
- `scripts/backlog_triage_cron.sh` — operational script (`bash scripts/backlog_triage_cron.sh`)
- `scripts/backup_pipeline.sh` — operational script (`bash scripts/backup_pipeline.sh`)

### Primary entry points

## Data Flow

- `index.html` — root browser entry point
- `boot.js` — startup selector; `tests/boot.test.js` shows it chooses file-mode vs HTTP/module-mode and injects `bootstrap.mjs` when served over HTTP
- `bootstrap.mjs` — module bootstrap for the browser shell
- `app.js` — main browser runtime; owns world state, GOFAI wiring, metrics polling, and portal/UI logic
- `server.py` — WebSocket broadcast bridge on `ws://0.0.0.0:8765`
- `nexus/morrowind_harness.py` — GamePortal/MCP harness for OpenMW Morrowind
- `nexus/bannerlord_harness.py` — GamePortal/MCP harness for Bannerlord
- `mempalace/tunnel_sync.py` — pulls remote fleet closets into the local palace over HTTP
- `multi_user_bridge.py` — HTTP bridge for multi-user chat/session integration
- `mcp_servers/desktop_control_server.py` — stdio MCP server exposing screenshot/mouse/keyboard control

1. Operators enter through `codebase_genome.py`, `gemini-fallback-setup.sh`, and `morrowind/hud.sh`.
2. Core logic fans into top-level components: `angband`, `ansible`, `briefings`, `codebase_genome`, `config`, `configs`.
3. Validation is incomplete around `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py`, `timmy-local/cache/agent_cache.py`, and `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py`, so changes there carry regression risk.
4. Final artifacts land as repository files, docs, or runtime side effects, depending on the selected entry point.
### Data flow

1. Browser startup begins at `index.html`
2. `boot.js` decides whether the page is being served correctly; in HTTP mode it injects `bootstrap.mjs`
3. `bootstrap.mjs` hands off to `app.js`
4. `app.js` loads world configuration from `portals.json` and `vision.json`
5. `app.js` constructs the Three.js scene and in-browser reasoning components, including `SymbolicEngine`, `NeuroSymbolicBridge`, `setupGOFAI()`, and `updateGOFAI()`
6. Browser state and external runtimes connect through `server.py`, which broadcasts messages between connected clients
7. Python harnesses (`nexus/morrowind_harness.py`, `nexus/bannerlord_harness.py`) spawn MCP subprocesses for desktop control / Steam metadata, capture state, execute actions, and feed telemetry into the Nexus bridge
8. Memory/fleet tools like `mempalace/tunnel_sync.py` import remote palace data into local closets, extending what the operator/runtime layers can inspect
9. Tests validate both the static browser contract and the higher-level repo-truth/memory contracts

### Important repo-specific runtime facts

- `portals.json` is a JSON array of portal/world/operator entries; examples in this checkout include `morrowind`, `bannerlord`, `workshop`, `archive`, `chapel`, and `courtyard`
- `server.py` is a plain broadcast hub: clients send messages, and the server forwards them to the other connected clients
- `nexus/morrowind_harness.py` and `nexus/bannerlord_harness.py` both implement a GamePortal pattern with MCP subprocess clients over stdio and a WebSocket telemetry uplink
- `mempalace/tunnel_sync.py` is not speculative; it is a real client that discovers remote wings, searches remote rooms, and writes `.closet.json` payloads locally
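Because `portals.json` is a flat JSON array, it is cheap to sanity-check. A hedged sketch of the kind of validation `tests/test_portals_json.py` might perform (the `id`/`name` field names are assumptions, not confirmed schema):

```python
import json

def validate_portals(text: str) -> list[str]:
    """Return a list of problems; an empty list means the payload looks sane."""
    problems = []
    data = json.loads(text)
    if not isinstance(data, list):
        return ["portals.json must be a JSON array"]
    seen = set()
    for i, entry in enumerate(data):
        if not isinstance(entry, dict):
            problems.append(f"entry {i} is not an object")
            continue
        name = entry.get("id") or entry.get("name")  # assumed key names
        if not name:
            problems.append(f"entry {i} has no id/name")
        elif name in seen:
            problems.append(f"duplicate portal: {name}")
        else:
            seen.add(name)
    return problems
```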
## Key Abstractions

- `codebase_genome.py` — classes `FunctionInfo`:19; functions `extract_functions()`:58, `generate_test()`:116, `scan_repo()`:191, `find_existing_tests()`:209, `main()`:231
- `evennia/timmy_world/game.py` — classes `World`:91, `ActionSystem`:421, `TimmyAI`:539, `NPCAI`:550; functions `get_narrative_phase()`:55, `get_phase_transition_event()`:65
- `evennia/timmy_world/world/game.py` — classes `World`:19, `ActionSystem`:326, `TimmyAI`:444, `NPCAI`:455; functions none detected
- `timmy-world/game.py` — classes `World`:19, `ActionSystem`:349, `TimmyAI`:467, `NPCAI`:478; functions none detected
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — classes none detected; functions none detected
- `uniwizard/self_grader.py` — classes `SessionGrade`:23, `WeeklyReport`:55, `SelfGrader`:74; functions `main()`:713
- `uni-wizard/v3/intelligence_engine.py` — classes `ExecutionPattern`:27, `ModelPerformance`:44, `AdaptationEvent`:58, `PatternDatabase`:69; functions none detected
- `scripts/know_thy_father/crossref_audit.py` — classes `ThemeCategory`:30, `Principle`:160, `MeaningKernel`:169, `CrossRefFinding`:178; functions `extract_themes_from_text()`:192, `parse_soul_md()`:206, `parse_kernels()`:264, `cross_reference()`:296, `generate_report()`:440, `main()`:561
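The `classes ...; functions ...` inventory above is the kind of listing an AST walk produces. A minimal sketch of such an extractor (this is an illustrative stand-in, not the actual `extract_functions()` from `codebase_genome.py`):

```python
import ast

def extract_defs(source: str) -> dict[str, list[tuple[str, int]]]:
    """List (name, line) pairs for top-level classes and functions in a module."""
    tree = ast.parse(source)
    out = {"classes": [], "functions": []}
    for node in tree.body:  # top-level only, matching the report's granularity
        if isinstance(node, ast.ClassDef):
            out["classes"].append((node.name, node.lineno))
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            out["functions"].append((node.name, node.lineno))
    return out
```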
### Browser runtime

- `app.js`
  - Defines in-browser reasoning/state machinery, including `class SymbolicEngine`, `class NeuroSymbolicBridge`, `setupGOFAI()`, and `updateGOFAI()`
  - Couples rendering, local symbolic reasoning, metrics polling, and portal/UI logic in one very large root module
- `BROWSER_CONTRACT.md`
  - Acts as an executable architecture contract for the browser surface
  - Declares required files, DOM IDs, Three.js expectations, provenance rules, and WebSocket expectations

### Realtime bridge

- `server.py`
  - Single hub abstraction: a WebSocket broadcast server maintaining a `clients` set and forwarding messages from one client to the others
  - This is the seam between the browser shell, the harnesses, and external telemetry producers
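The hub pattern described above can be reduced to a few lines. A generic sketch of the broadcast core (this mirrors the described semantics of `server.py`, not its actual code; in practice each client would be a `websockets` connection object with an async `send`):

```python
import asyncio

class Hub:
    """Minimal broadcast hub: track clients, forward each message to the others."""

    def __init__(self):
        self.clients = set()

    def register(self, client):
        self.clients.add(client)

    def unregister(self, client):
        self.clients.discard(client)

    async def broadcast(self, sender, message):
        # Forward to every connected client except the sender
        for peer in list(self.clients):
            if peer is not sender:
                await peer.send(message)
```

Wired into a real server, `register`/`broadcast`/`unregister` would bracket the per-connection receive loop.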
### GamePortal harness layer

- `nexus/morrowind_harness.py`
- `nexus/bannerlord_harness.py`
- Both define MCP client wrappers, `GameState` / `ActionResult`-style data classes, and an Observe-Decide-Act telemetry loop
- The harnesses are symmetric enough to be understood as reusable portal adapters with game-specific context injected on top
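The shared Observe-Decide-Act shape can be sketched as a loop over injected callables. All names here (`GameState`, `ActionResult`, the field layout) are illustrative approximations of the harnesses' style, not their real signatures:

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class GameState:
    # Hypothetical field; the real harness dataclasses differ per game
    observations: dict = field(default_factory=dict)

@dataclass
class ActionResult:
    ok: bool
    detail: str = ""

def run_oda_loop(observe: Callable[[], GameState],
                 decide: Callable[[GameState], Any],
                 act: Callable[[Any], ActionResult],
                 emit: Callable[[dict], None],
                 steps: int = 3) -> list[ActionResult]:
    """One Observe-Decide-Act pass per step, emitting telemetry each cycle."""
    results = []
    for step in range(steps):
        state = observe()          # capture game state (e.g. via MCP screenshot)
        action = decide(state)     # game-specific policy
        result = act(action)       # execute via the MCP subprocess
        emit({"step": step, "action": action, "ok": result.ok})  # uplink
        results.append(result)
    return results
```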
### Memory / fleet layer

- `mempalace/tunnel_sync.py`
  - Encodes the fleet-memory sync client contract: discover wings, pull broad room queries, write closet files, support dry-run
- `mempalace.js`
  - Minimal browser/Electron bridge to MemPalace commands via `window.electronAPI.execPython(...)`
  - Important because it shows a second memory integration surface distinct from the Python fleet sync path
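The tunnel-sync contract (discover wings, pull rooms, write `.closet.json`, support dry-run) can be sketched generically. The `/wings` and `/rooms` endpoint paths and payload shapes below are assumptions for illustration; the real `tunnel_sync.py` API may differ, and `fetch` stands in for an HTTP GET such as `requests.get(url).json()`:

```python
import json
from pathlib import Path
from typing import Callable

def sync_palace(fetch: Callable[[str], dict], peer: str, dest: Path,
                dry_run: bool = False) -> list[str]:
    """Pull each remote wing's rooms and write them as local .closet.json files."""
    written = []
    wings = fetch(f"{peer}/wings")  # assumed discovery endpoint
    for wing in wings.get("wings", []):
        rooms = fetch(f"{peer}/wings/{wing}/rooms")  # assumed room query
        target = dest / f"{wing}.closet.json"
        if not dry_run:
            target.write_text(json.dumps(rooms, indent=2))
        written.append(str(target))
    return written
```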
### Operator / interaction bridge

- `multi_user_bridge.py`
- `commands/timmy_commands.py`
- These bridge user-facing conversations and MUD/Evennia interactions back into Timmy/Nexus services

## API Surface

- CLI: `python3 codebase_genome.py` — python main guard (`codebase_genome.py`)
- CLI: `bash gemini-fallback-setup.sh` — operational script (`gemini-fallback-setup.sh`)
- CLI: `bash morrowind/hud.sh` — operational script (`morrowind/hud.sh`)
- CLI: `python3 pipelines/codebase_genome.py` — python main guard (`pipelines/codebase_genome.py`)
- CLI: `python3 scripts/agent_pr_gate.py` — operational script (`scripts/agent_pr_gate.py`)
- CLI: `python3 scripts/audit_trail.py` — operational script (`scripts/audit_trail.py`)
- CLI: `bash scripts/auto_restart_agent.sh` — operational script (`scripts/auto_restart_agent.sh`)
- CLI: `python3 scripts/autonomous_issue_creator.py` — operational script (`scripts/autonomous_issue_creator.py`)
- Python: `extract_functions()` from `codebase_genome.py:58`
- Python: `generate_test()` from `codebase_genome.py:116`
- Python: `scan_repo()` from `codebase_genome.py:191`
- Python: `find_existing_tests()` from `codebase_genome.py:209`
- Python: `main()` from `codebase_genome.py:231`
- Python: `get_narrative_phase()` from `evennia/timmy_world/game.py:55`
### Browser / static surface

## Test Coverage Report

- `index.html` served over HTTP
- `boot.js` exports `bootPage()`; verified by `node --test tests/boot.test.js`
- Data APIs are file-based inside the repo: `portals.json`, `vision.json`, `manifest.json`

- Source and script files inspected: 231
- Test files inspected: 95
- Coverage gaps:
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — no matching test reference detected
  - `timmy-local/cache/agent_cache.py` — no matching test reference detected
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py` — no matching test reference detected
  - `wizards/allegro/home/skills/red-teaming/godmode/scripts/godmode_race.py` — no matching test reference detected
  - `skills/productivity/google-workspace/scripts/google_api.py` — no matching test reference detected
  - `wizards/allegro/home/skills/productivity/google-workspace/scripts/google_api.py` — no matching test reference detected
  - `morrowind/pilot.py` — no matching test reference detected
  - `scripts/sovereignty_audit.py` — no matching test reference detected
  - `skills/research/domain-intel/scripts/domain_intel.py` — no matching test reference detected
  - `wizards/allegro/home/skills/research/domain-intel/scripts/domain_intel.py` — no matching test reference detected
  - `timmy-local/scripts/ingest.py` — no matching test reference detected
  - `uni-wizard/scripts/generate_scorecard.py` — no matching test reference detected
### Network/runtime surface

## Security Audit Findings

- `python3 server.py`
  - Starts the WebSocket bridge on port `8765`
- `python3 l402_server.py`
  - Local HTTP microservice for cost-estimate style responses
- `python3 multi_user_bridge.py`
  - Multi-user HTTP/chat bridge

- [medium] `briefings/briefing_20260325.json:37` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `"gitea_error": "Gitea 404: {\"errors\":null,\"message\":\"not found\",\"url\":\"http://143.198.27.163:3000/api/swagger\"}\n [http://143.198.27.163:3000/api/v1/repos/Timmy_Foundation/sovereign-orchestration/issues?state=open&type=issues&sort=created&direction=desc&limit=1&page=1]",`
- [medium] `briefings/briefing_20260328.json:11` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `"provider_base_url": "http://localhost:8081/v1",`
- [medium] `briefings/briefing_20260329.json:11` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `"provider_base_url": "http://localhost:8081/v1",`
- [medium] `config.yaml:37` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `summary_base_url: http://localhost:11434/v1`
- [medium] `config.yaml:47` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:52` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:57` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:62` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:67` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:77` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:82` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: 'http://localhost:11434/v1'`
- [medium] `config.yaml:174` — hardcoded http endpoint: plaintext or fixed HTTP endpoints can drift or leak across environments. Evidence: `base_url: http://localhost:11434/v1`
### Harness / operator CLI surfaces

## Dead Code Candidates

- `python3 nexus/morrowind_harness.py`
- `python3 nexus/bannerlord_harness.py`
- `python3 mempalace/tunnel_sync.py --peer <url> [--dry-run] [--n N]`
- `python3 mcp_servers/desktop_control_server.py`
- `python3 mcp_servers/steam_info_server.py`

- `wizards/allegro/home/skills/red-teaming/godmode/scripts/auto_jailbreak.py` — not imported by indexed Python modules and not referenced by tests
- `timmy-local/cache/agent_cache.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/parseltongue.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/red-teaming/godmode/scripts/godmode_race.py` — not imported by indexed Python modules and not referenced by tests
- `skills/productivity/google-workspace/scripts/google_api.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/productivity/google-workspace/scripts/google_api.py` — not imported by indexed Python modules and not referenced by tests
- `morrowind/pilot.py` — not imported by indexed Python modules and not referenced by tests
- `scripts/sovereignty_audit.py` — not imported by indexed Python modules and not referenced by tests
- `skills/research/domain-intel/scripts/domain_intel.py` — not imported by indexed Python modules and not referenced by tests
- `wizards/allegro/home/skills/research/domain-intel/scripts/domain_intel.py` — not imported by indexed Python modules and not referenced by tests
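The "not imported and not referenced by tests" flag is cheap to approximate with a textual scan. A crude sketch, similar in spirit to the genome report's check but not the real tool; note the known weakness that standalone entry points also get flagged, which is why these are "candidates" and not verdicts:

```python
import re
from pathlib import Path

def unreferenced_modules(repo: Path) -> list[str]:
    """Flag Python files whose module name never appears in any other file."""
    py_files = [p for p in repo.rglob("*.py") if ".git" not in p.parts]
    corpus = {p: p.read_text(errors="ignore") for p in py_files}
    flagged = []
    for p in py_files:
        stem = p.stem
        if stem in ("__init__", "conftest"):
            continue  # structural files, never "dead"
        pattern = re.compile(rf"\b{re.escape(stem)}\b")
        if not any(pattern.search(text) for q, text in corpus.items() if q != p):
            flagged.append(str(p.relative_to(repo)))
    return sorted(flagged)
```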
### Validation surface

## Performance Bottleneck Analysis

- `python3 -m pytest tests/test_portals_json.py tests/test_index_html_integrity.py tests/test_repo_truth.py -q`
- `node --test tests/boot.test.js`
- `python3 -m py_compile server.py nexus/morrowind_harness.py nexus/bannerlord_harness.py mempalace/tunnel_sync.py mcp_servers/desktop_control_server.py`
- `tests/test_browser_smoke.py` defines the higher-cost Playwright smoke contract for the world shell

- `angband/mcp_server.py` — large module (353 lines) likely hides multiple responsibilities
- `evennia/timmy_world/game.py` — large module (1541 lines) likely hides multiple responsibilities
- `evennia/timmy_world/world/game.py` — large module (1345 lines) likely hides multiple responsibilities
- `morrowind/mcp_server.py` — large module (451 lines) likely hides multiple responsibilities
- `morrowind/pilot.py` — large module (459 lines) likely hides multiple responsibilities
- `pipelines/codebase_genome.py` — large module (557 lines) likely hides multiple responsibilities
- `scripts/fleet_progression.py` — large module (361 lines) likely hides multiple responsibilities
- `scripts/know_thy_father/crossref_audit.py` — large module (657 lines) likely hides multiple responsibilities
- `scripts/know_thy_father/index_media.py` — large module (405 lines) likely hides multiple responsibilities
- `scripts/know_thy_father/synthesize_kernels.py` — large module (416 lines) likely hides multiple responsibilities
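A "large module" pass like the one above is just a line count against a threshold. A minimal sketch (the 350-line cutoff is an assumption inferred from the smallest flagged module, not the tool's documented threshold):

```python
from pathlib import Path

def large_modules(repo: Path, threshold: int = 350) -> list[tuple[str, int]]:
    """Report Python modules whose line count exceeds the threshold, largest first."""
    hits = []
    for p in repo.rglob("*.py"):
        if ".git" in p.parts:
            continue
        lines = p.read_text(errors="ignore").count("\n") + 1
        if lines > threshold:
            hits.append((str(p.relative_to(repo)), lines))
    return sorted(hits, key=lambda t: -t[1])
```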
## Test Coverage Gaps

Strongly covered in this checkout:

- `tests/test_portals_json.py` validates `portals.json`
- `tests/test_index_html_integrity.py` checks merge-marker/DOM-integrity regressions in `index.html`
- `tests/boot.test.js` verifies `boot.js` startup behavior
- `tests/test_repo_truth.py` validates the repo-truth documents
- Multiple `tests/test_mempalace_*.py` files cover the palace layer
- `tests/test_bannerlord_harness.py` exists for the Bannerlord harness

Notable gaps or weak seams:

- `nexus/morrowind_harness.py` is large and operationally critical, but the generated baseline still flags it as a gap relative to its size and complexity
- `mcp_servers/desktop_control_server.py` exposes high-power automation but has no obvious dedicated test file in the root `tests/` suite
- `app.js` is the dominant browser runtime file and mixes rendering, GOFAI, metrics, and integration logic in one place; browser smoke coverage exists, but there is limited unit-level decomposition around those subsystems
- `mempalace.js` appears minimally bridged and stale relative to the richer Python MemPalace layer
- `multi_user_bridge.py` is a large integration surface; because it is central to operator/chat flow, it should be treated as high regression risk
## Security Considerations

- `server.py` binds `HOST = "0.0.0.0"`, exposing the broadcast bridge beyond localhost unless network controls limit it
- The WebSocket bridge is a broadcast hub with no visible authentication in `server.py`; connected clients are trusted to send messages into the bus
- `mcp_servers/desktop_control_server.py` exposes mouse/keyboard/screenshot control through a stdio MCP server. In any non-local or poorly isolated runtime, this is a privileged automation surface
- `app.js` contains hardcoded local/network endpoints such as `http://localhost:${L402_PORT}/api/cost-estimate` and `http://localhost:8082/metrics`; these are convenient for local development but create environment drift and deployment assumptions
- `app.js` also embeds explicit endpoint/status references like `ws://143.198.27.163:8765`, which is operationally brittle and exactly the kind of hardcoded location data that drifts across environments
- `mempalace.js` shells out through `window.electronAPI.execPython(...)`; this is powerful and useful, but it is a clear trust boundary between UI and host execution
- `INVESTIGATION_ISSUE_1145.md` documents an earlier integrity hazard: agents writing to `public/nexus/` instead of canonical root paths. That path confusion is both an operational and a security concern because it makes provenance harder to reason about
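The unauthenticated broadcast hub could be hardened with even a simple shared-token check at connect time. A hedged sketch; the `NEXUS_WS_TOKEN` variable and `X-Nexus-Token` header names are hypothetical, not anything in the repo:

```python
import hmac
import os

def authorize(headers: dict[str, str]) -> bool:
    """Constant-time check of a shared token before admitting a client.

    Assumes the operator exports NEXUS_WS_TOKEN and clients send it in an
    X-Nexus-Token header (both names are illustrative).
    """
    expected = os.environ.get("NEXUS_WS_TOKEN", "")
    supplied = headers.get("X-Nexus-Token", "")
    if not expected:
        return False  # fail closed when no token is configured
    return hmac.compare_digest(expected, supplied)
```

With the `websockets` library, a check like this could run in a connection-setup hook that rejects the request before the protocol upgrade, so unauthenticated clients never join the broadcast set.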
## Runtime Truth and Docs Drift

The most important architecture finding in this repo is not a class or subsystem. It is a truth mismatch.

- `README.md` says the current `main` does not ship a browser 3D world
- `CLAUDE.md` declares root `app.js` and `index.html` as canonical frontend paths
- The tests and the browser contract now assume the root frontend exists

All three statements are simultaneously present in this checkout.

Grounded evidence:

- `README.md` still says the repo does not contain an active root frontend such as `index.html`, `app.js`, or `style.css`
- The current checkout does contain `index.html`, `app.js`, `style.css`, `manifest.json`, and `gofai_worker.js`
- `BROWSER_CONTRACT.md` explicitly treats those root files as required browser assets
- `tests/test_browser_smoke.py` serves those exact files and validates DOM/WebGL contracts against them
- `tests/test_index_html_integrity.py` assumes `index.html` is canonical and production-relevant
- `CLAUDE.md` says frontend code lives at repo root and explicitly warns against `public/nexus/`
- `INVESTIGATION_ISSUE_1145.md` explains why `public/nexus/` is a bad/corrupt duplicate path and confirms the real classical AI code lives in root `app.js`

The honest conclusion:

- The repo contains a partially restored or actively re-materialized browser surface
- The docs preserve an older migration truth, while the runtime files and smoke contracts describe a newer present-tense truth
- Any future work in `the-nexus` must choose one truth and align `README.md`, `CLAUDE.md`, the smoke tests, and the file layout around it

That drift is itself a critical architectural fact and should be treated as first-order design debt, not a side note.
@@ -1,14 +0,0 @@
---
- name: Codebase Genome Nightly
  schedule: '30 2 * * *'  # Daily at 02:30 local time
  tasks:
    - name: Ensure output and log directories exist
      shell: "mkdir -p ~/.timmy/codebase-genomes ~/.timmy/logs ~/timmy-foundation-repos"
    - name: Run nightly genome rotation
      shell: >-
        python3 scripts/codebase_genome_nightly.py
        --org Timmy_Foundation
        --workspace-root ~/timmy-foundation-repos
        --output-root ~/.timmy/codebase-genomes
        --state-path ~/.timmy/codebase_genome_state.json
        >> ~/.timmy/logs/codebase_genome_nightly.log 2>&1
@@ -169,6 +169,14 @@ _config_version: 9
session_reset:
  mode: none
  idle_minutes: 0
blackboard:
  enabled: true
  redis:
    url: redis://localhost:6379/0
    password: ""
  keyspace_prefix: timmy
  ttl_seconds: 3600
  fallback_to_memory: true
custom_providers:
  - name: Local Ollama
    base_url: http://localhost:11434/v1
@@ -10,8 +10,6 @@ This pipeline gives Timmy a repeatable way to generate a deterministic `GENOME.m
- `pipelines/codebase-genome.py` — thin CLI wrapper matching the expected pipeline-style entrypoint
- `scripts/codebase_genome_nightly.py` — org-aware nightly runner that selects the next repo, updates a local checkout, and writes the genome artifact
- `scripts/codebase_genome_status.py` — rollup/status reporter for artifact coverage, duplicate paths, and the next uncovered repo
- `scripts/codebase_test_generator.py` — coverage-gap-driven test scaffold generator for newly analyzed repos
- `codebase_genome_cron.yml` — checked-in nightly cron spec for the rotating genome pass
- `GENOME.md` — generated analysis for `timmy-home` itself

## Genome output
infrastructure/redis/README.md (new file, 19 lines)
@@ -0,0 +1,19 @@
# Local Redis Blackboard for Agent Coordination

This directory contains the Redis deployment for the Timmy Home "Blackboard" — a
shared coordination layer for multi-agent orchestration.

## Quick Start

```bash
docker-compose up -d
```

Redis will be available at `redis://localhost:6379` with persistence enabled.

## Stop

```bash
docker-compose down     # Stop, keep data
docker-compose down -v  # Stop and delete data
```
infrastructure/redis/docker-compose.yml (new file, 18 lines)
@@ -0,0 +1,18 @@
version: '3.8'

services:
  redis:
    image: redis:7-alpine
    container_name: timmy-redis
    restart: unless-stopped
    ports:
      - "6379:6379"
    volumes:
      - ./data:/data
    command: ["redis-server", "--appendonly", "yes"]
    networks:
      - timmy-network

networks:
  timmy-network:
    driver: bridge
src/timmy/blackboard.py (new file, 311 lines)
@@ -0,0 +1,311 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Blackboard — Redis-backed shared coordination layer.
|
||||
|
||||
Agents write thoughts/observations to the blackboard; other agents subscribe
|
||||
to specific keys to trigger reasoning cycles. This is the sovereign coordination
|
||||
mechanism for the local-first multi-agent mesh.
|
||||
|
||||
Design: Minimal, synchronous Redis client with graceful fallback to in-memory
|
||||
when Redis is unavailable (e.g., during local dev without Docker).
|
||||
|
||||
SOUL.md: "Sovereignty and service always." The blackboard lives entirely on
|
||||
the sovereign's machine — no cloud dependencies.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import logging
|
||||
import os
|
||||
import time
|
||||
from dataclasses import dataclass, asdict
|
||||
from datetime import datetime, timezone
|
||||
from pathlib import Path
|
||||
from typing import Any, Callable, Iterable, Optional
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
# Lazy import to keep redis optional
_redis = None
_redis_import_error = None

try:
    import redis
    _redis = redis
except ImportError as e:
    _redis_import_error = e


@dataclass
class BlackboardConfig:
    """Configuration for the Blackboard."""
    enabled: bool = True
    redis_url: str = "redis://localhost:6379/0"
    redis_password: str | None = None
    keyspace_prefix: str = "timmy"
    ttl_seconds: int | None = None  # None = no expiration
    fallback_to_memory: bool = True  # Use dict if Redis unavailable


class _MemoryBackend:
    """Simple in-memory fallback when Redis is not available."""

    def __init__(self):
        self._store: dict[str, str] = {}
        self._subscribers: dict[str, list[Callable[[str, Any], None]]] = {}

    def get(self, key: str) -> str | None:
        return self._store.get(key)

    def set(self, key: str, value: str, ttl: int | None = None) -> bool:
        # TTL is accepted for interface parity but not enforced in memory
        self._store[key] = value
        return True

    def publish(self, channel: str, message: Any) -> int:
        count = 0
        for cb in self._subscribers.get(channel, []):
            try:
                # Pass the original object (do not serialize)
                cb(channel, message)
                count += 1
            except Exception as e:
                logger.warning("MemoryBackend subscriber error: %s", e)
        return count

    def subscribe(self, channel: str, callback: Callable[[str, Any], None]) -> None:
        self._subscribers.setdefault(channel, []).append(callback)

    def unsubscribe(self, channel: str, callback: Callable[[str, Any], None]) -> None:
        # Removing an unregistered callback is a no-op rather than a ValueError
        if callback in self._subscribers.get(channel, []):
            self._subscribers[channel].remove(callback)

    def keys(self, pattern: str = "*") -> list[str]:
        # Simple fnmatch-style pattern matching
        import fnmatch
        return fnmatch.filter(list(self._store.keys()), pattern)

class Blackboard:
    """
    Shared coordination layer backed by Redis (with in-memory fallback).

    Usage:
        bb = Blackboard()
        bb.set("agent:timmy:thought", "checking queue...")
        value = bb.get("agent:timmy:thought")

        def on_event(channel, message):
            print(f"Event on {channel}: {message}")

        bb.subscribe("dispatch:new", on_event)
        bb.publish("dispatch:new", {"issue": 123, "action": "comment"})
    """

    def __init__(self, config: BlackboardConfig | None = None):
        cfg = config or BlackboardConfig()
        self.enabled = cfg.enabled
        self.prefix = cfg.keyspace_prefix
        self.ttl = cfg.ttl_seconds
        self._backend: _MemoryBackend | Any

        if not _redis:
            if cfg.fallback_to_memory:
                logger.warning(
                    "redis-py not installed; using in-memory fallback. "
                    "Install with: pip install redis"
                )
                self._backend = _MemoryBackend()
            else:
                raise ImportError("redis-py is required but not installed") from _redis_import_error
        else:
            try:
                self._backend = _redis.from_url(
                    cfg.redis_url,
                    password=cfg.redis_password,
                    decode_responses=True,
                )
                # Test connection
                self._backend.ping()
                logger.info("Blackboard connected to Redis at %s", cfg.redis_url)
            except Exception as e:
                if cfg.fallback_to_memory:
                    logger.warning("Redis connection failed (%s); falling back to in-memory", e)
                    self._backend = _MemoryBackend()
                else:
                    raise

    # ─────────────────────────────────────────────
    # Key-value operations
    # ─────────────────────────────────────────────

    def _prefixed(self, key: str) -> str:
        """Apply keyspace prefix to a key."""
        return f"{self.prefix}:{key}" if self.prefix else key

    def get(self, key: str) -> str | None:
        """Get a value from the blackboard."""
        return self._backend.get(self._prefixed(key))

    def set(self, key: str, value: str | dict, ttl: int | None = None) -> bool:
        """
        Set a value on the blackboard.

        Args:
            key: Key without prefix (prefix is added automatically)
            value: String or JSON-serializable dict
            ttl: Override default TTL (seconds); None = use default

        Returns:
            True on success
        """
        if isinstance(value, dict):
            value = json.dumps(value, sort_keys=True)
        elif not isinstance(value, str):
            value = str(value)

        expire = ttl if ttl is not None else self.ttl
        result = self._backend.set(self._prefixed(key), value, expire)
        return bool(result)

    def delete(self, key: str) -> bool:
        """Delete a key."""
        try:
            return bool(self._backend.delete(self._prefixed(key)))
        except AttributeError:
            # MemoryBackend has no delete(); fall back to the dict directly
            k = self._prefixed(key)
            if k in self._backend._store:
                del self._backend._store[k]
                return True
            return False

    def keys(self, pattern: str = "*") -> list[str]:
        """List keys matching a pattern (without prefix)."""
        full_pattern = self._prefixed(pattern)
        raw_keys = self._backend.keys(full_pattern)
        # Strip prefix
        prefix_len = len(self.prefix) + 1 if self.prefix else 0
        return [k[prefix_len:] if k.startswith(f"{self.prefix}:") else k for k in raw_keys]

    def exists(self, key: str) -> bool:
        """Check if a key exists."""
        try:
            return bool(self._backend.exists(self._prefixed(key)))
        except AttributeError:
            # MemoryBackend has no exists(); check the dict directly
            return self._prefixed(key) in self._backend._store

    # ─────────────────────────────────────────────
    # Pub/sub operations
    # ─────────────────────────────────────────────

    def publish(self, channel: str, message: Any) -> int:
        """
        Publish a message to a channel.

        Args:
            channel: Channel name (without prefix)
            message: JSON-serializable object or string

        Returns:
            Number of subscribers that received the message
        """
        # Redis requires str/bytes payloads; MemoryBackend takes the object as-is
        if isinstance(self._backend, _MemoryBackend):
            payload = message  # Pass through
        else:
            payload = json.dumps(message, sort_keys=True) if not isinstance(message, str) else message

        return self._backend.publish(self._prefixed(channel), payload)

    def subscribe(
        self,
        channel: str,
        callback: Callable[[str, Any], None],
        *,
        block: bool = False,
        timeout: float | None = None,
    ) -> None:
        """
        Subscribe to a channel.

        Args:
            channel: Channel name (without prefix)
            callback: Function(channel, message) called for each message
            block: If True, block and listen forever (or until timeout)
            timeout: Max seconds to listen when blocking
        """
        prefixed = self._prefixed(channel)
        # A real Redis client exposes pubsub(); MemoryBackend does not
        if callable(getattr(self._backend, "pubsub", None)):
            # Real Redis pub/sub: listen on a daemon thread
            import threading

            pubsub = self._backend.pubsub()
            pubsub.subscribe(prefixed)

            def listener():
                for msg in pubsub.listen():
                    if msg["type"] == "message":
                        try:
                            data = json.loads(msg["data"])
                        except (json.JSONDecodeError, TypeError):
                            data = msg["data"]
                        callback(channel, data)

            t = threading.Thread(target=listener, daemon=True)
            t.start()
            if block:
                t.join(timeout)  # timeout=None blocks until the thread exits
        else:
            # MemoryBackend — synchronous registration. Wrap the callback so it
            # receives the unprefixed channel name, matching the Redis path.
            def _wrapped(_ch: str, msg: Any) -> None:
                callback(channel, msg)

            _wrapped.__wrapped__ = callback  # lets unsubscribe() find the original
            self._backend.subscribe(prefixed, _wrapped)

    def unsubscribe(self, channel: str, callback: Callable[[str, Any], None]) -> None:
        """Unsubscribe a previously registered callback."""
        subs = getattr(self._backend, "_subscribers", None)
        if subs is None:
            # Real Redis subscriptions are owned by their pubsub objects; nothing to detach here
            return
        prefixed = self._prefixed(channel)
        subs[prefixed] = [
            cb
            for cb in subs.get(prefixed, [])
            if cb is not callback and getattr(cb, "__wrapped__", None) is not callback
        ]

    # ─────────────────────────────────────────────
    # Helpers
    # ─────────────────────────────────────────────

    def clear_namespace(self, pattern: str = "*") -> int:
        """Delete all keys matching pattern in this namespace."""
        full = self._prefixed(pattern)
        try:
            keys = self._backend.keys(full)
            if keys:
                return self._backend.delete(*keys)
            return 0
        except AttributeError:
            # MemoryBackend has no delete(); filter the dict directly
            import fnmatch
            store_keys = list(self._backend._store.keys())
            matched = fnmatch.filter(store_keys, full)
            for k in matched:
                del self._backend._store[k]
            return len(matched)

    def __repr__(self) -> str:
        return f"<Blackboard prefix={self.prefix!r} backend={type(self._backend).__name__}>"


# ─────────────────────────────────────────────
# Convenience singleton for global use
# ─────────────────────────────────────────────

_default_blackboard: Blackboard | None = None


def get_blackboard(config: BlackboardConfig | None = None) -> Blackboard:
    """Get or create the global Blackboard singleton."""
    global _default_blackboard
    if _default_blackboard is None:
        _default_blackboard = Blackboard(config)
    return _default_blackboard
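As a standalone sketch, the coordination pattern the module above implements reduces to a prefixed key-value dict plus synchronous pub/sub fan-out. The helper names below (`bb_set`, `bb_get`, `bb_subscribe`, `bb_publish`) are illustrative only, not part of the module's API:

```python
import json

# Hypothetical standalone sketch of the Blackboard pattern: a prefixed
# key-value dict plus synchronous pub/sub fan-out (names are illustrative).
PREFIX = "timmy"
store: dict[str, str] = {}
subscribers: dict[str, list] = {}

def bb_set(key, value):
    # Dicts are serialized deterministically, mirroring Blackboard.set
    if isinstance(value, dict):
        value = json.dumps(value, sort_keys=True)
    store[f"{PREFIX}:{key}"] = value

def bb_get(key):
    return store.get(f"{PREFIX}:{key}")

def bb_subscribe(channel, cb):
    subscribers.setdefault(f"{PREFIX}:{channel}", []).append(cb)

def bb_publish(channel, message):
    # Returns the number of callbacks invoked, like Redis PUBLISH
    cbs = subscribers.get(f"{PREFIX}:{channel}", [])
    for cb in cbs:
        cb(channel, message)
    return len(cbs)

events = []
bb_subscribe("dispatch:new", lambda ch, msg: events.append(msg))
bb_set("agent:timmy:thought", {"state": "ready"})
print(bb_get("agent:timmy:thought"))               # prints {"state": "ready"}
print(bb_publish("dispatch:new", {"issue": 123}))  # prints 1
```

The real class adds what this sketch omits: a Redis backend, TTLs, and threaded listeners for blocking subscriptions.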
194 tests/test_blackboard.py (Normal file)
@@ -0,0 +1,194 @@
"""
Smoke tests for Blackboard — ensures the Redis-backed coordination layer
works with both real Redis and in-memory fallback.
"""

import json

from src.timmy.blackboard import Blackboard, BlackboardConfig


class TestBlackboardBasics:
    """Test core key-value operations."""

    def test_kv_memory_backend(self):
        """KV operations work using in-memory backend."""
        bb = Blackboard(BlackboardConfig(fallback_to_memory=True, enabled=True))

        # Set and get
        assert bb.set("test:key", "hello") is True
        assert bb.get("test:key") == "hello"

        # Dict serialization
        assert bb.set("test:obj", {"a": 1, "b": 2}) is True
        val = bb.get("test:obj")
        assert json.loads(val) == {"a": 1, "b": 2}

        # Exists
        assert bb.exists("test:key") is True
        assert bb.exists("missing") is False

        # Delete
        assert bb.delete("test:key") is True
        assert bb.get("test:key") is None

        # Keys with prefix
        bb.set("agent:timmy:state", "ready")
        bb.set("agent:ezra:state", "idle")
        keys = bb.keys("agent:*:state")
        assert len(keys) == 2
        assert "timmy" in keys[0] or "ezra" in keys[0]

        # Clear namespace
        assert bb.clear_namespace("agent:*") == 2
        assert bb.keys("agent:*") == []


class TestBlackboardPubSub:
    """Test pub/sub coordination patterns."""

    def test_pubsub_memory_backend(self):
        """Publish/subscribe works using in-memory backend."""
        bb = Blackboard(BlackboardConfig(fallback_to_memory=True, enabled=True))

        received = []

        def callback(channel, message):
            received.append((channel, message))

        bb.subscribe("dispatch:new", callback)

        # Publish
        count = bb.publish("dispatch:new", {"issue": 123, "action": "comment"})
        assert count == 1
        assert len(received) == 1
        ch, msg = received[0]
        assert ch == "dispatch:new"
        assert msg == {"issue": 123, "action": "comment"}

        bb.unsubscribe("dispatch:new", callback)
        bb.publish("dispatch:new", {"should": "not arrive"})
        assert len(received) == 1  # no new messages

    def test_publish_without_subscribers(self):
        """Publish returns 0 when no subscribers."""
        bb = Blackboard(BlackboardConfig(fallback_to_memory=True, enabled=True))
        count = bb.publish("empty:channel", {"msg": 1})
        assert count == 0


class TestBlackboardConfig:
    """Test configuration parsing and validation."""

    def test_default_config(self):
        cfg = BlackboardConfig()
        assert cfg.enabled is True
        assert cfg.redis_url == "redis://localhost:6379/0"
        assert cfg.keyspace_prefix == "timmy"
        assert cfg.ttl_seconds is None  # default is no expiration
        assert cfg.fallback_to_memory is True

    def test_custom_config(self):
        cfg = BlackboardConfig(
            enabled=False,
            redis_url="redis://192.168.1.10:6379/1",
            keyspace_prefix="myagent",
            ttl_seconds=1800,
            fallback_to_memory=False,
        )
        assert cfg.enabled is False
        assert cfg.redis_url == "redis://192.168.1.10:6379/1"
        assert cfg.keyspace_prefix == "myagent"
        assert cfg.ttl_seconds == 1800
        assert cfg.fallback_to_memory is False


class TestKeyspacePrefix:
    """Test that keys are correctly prefixed."""

    def test_prefixed_keys(self):
        bb = Blackboard(BlackboardConfig(keyspace_prefix="myagent", fallback_to_memory=True))
        bb.set("thought", "test")
        # Internally the key is stored as "myagent:thought" ...
        assert "myagent:thought" in bb._backend._store
        # ... while keys() reports it with the prefix stripped
        assert bb.keys("*") == ["thought"]

class TestBlackboardIntegration:
    """Integration pattern: agent thought cycle."""

    def test_agent_thought_cycle(self):
        """Simulate Timmy writing a thought and Ezra reading it."""
        bb = Blackboard(BlackboardConfig(fallback_to_memory=True, enabled=True))

        # Agent A writes observation
        bb.set("agent:timmy:observation", "Gitea queue has 12 open issues")

        # Agent B reads
        obs = bb.get("agent:timmy:observation")
        assert obs == "Gitea queue has 12 open issues"

        # Agent B writes analysis
        bb.set("agent:ezra:analysis", "Prioritize critical bugs first")

        # Event-driven pattern
        events = []

        def on_plan(channel, message):
            events.append(message)

        bb.subscribe("fleet:plan", on_plan)
        bb.publish("fleet:plan", {"phase": "triaging", "lead": "ezra"})

        assert len(events) == 1
        assert events[0]["phase"] == "triaging"


class TestTTL:
    """Test TTL handling (where supported)."""

    def test_ttl_set_in_config(self):
        cfg = BlackboardConfig(ttl_seconds=60, fallback_to_memory=True)
        bb = Blackboard(cfg)
        assert bb.ttl == 60
        # Setting a value uses TTL from config
        bb.set("temp:key", "expiring value")
        # The in-memory backend ignores TTL, but the value is set
        assert bb.get("temp:key") == "expiring value"


# ─────────────────────────────────────────────
# CLI smoke — can be called directly: python -m tests.test_blackboard
# ─────────────────────────────────────────────

if __name__ == "__main__":
    import sys

    print("Running Blackboard smoke tests...")

    suite = [
        TestBlackboardBasics().test_kv_memory_backend,
        TestBlackboardPubSub().test_pubsub_memory_backend,
        TestBlackboardConfig().test_default_config,
        TestBlackboardIntegration().test_agent_thought_cycle,
    ]

    failures = 0
    for test in suite:
        name = test.__name__
        try:
            test()
            print(f"  ✓ {name}")
        except AssertionError as e:
            print(f"  ✗ {name}: {e}")
            failures += 1
        except Exception as e:
            print(f"  ✗ {name}: ERROR — {e}")
            failures += 1

    print(f"\nRan {len(suite)} tests, {failures} failures")
    sys.exit(failures)
@@ -8,7 +8,6 @@ ROOT = Path(__file__).resolve().parents[1]
PIPELINE_PATH = ROOT / "pipelines" / "codebase_genome.py"
NIGHTLY_PATH = ROOT / "scripts" / "codebase_genome_nightly.py"
GENOME_PATH = ROOT / "GENOME.md"
CRON_PATH = ROOT / "codebase_genome_cron.yml"


def _load_module(path: Path, name: str):
@@ -114,17 +113,3 @@ def test_repo_contains_generated_timmy_home_genome() -> None:
        "## Performance Bottleneck Analysis",
    ):
        assert snippet in text


def test_repo_contains_nightly_cron_spec_for_genome_rotation() -> None:
    assert CRON_PATH.exists(), "missing codebase_genome_cron.yml"
    text = CRON_PATH.read_text(encoding="utf-8")
    for snippet in (
        "Codebase Genome Nightly",
        "scripts/codebase_genome_nightly.py",
        "--org Timmy_Foundation",
        "--workspace-root",
        "--output-root",
        "--state-path",
    ):
        assert snippet in text