Compare commits

..

10 Commits

Author SHA1 Message Date
Alexander Whitestone
6f1264f6c6 WIP: Browser smoke tests (issue #686)
Some checks failed
CI / test (pull_request) Failing after 9s
CI / validate (pull_request) Failing after 12s
Review Approval Gate / verify-review (pull_request) Failing after 4s
2026-04-10 21:17:44 -04:00
d408d2c365 Merge pull request '[Mnemosyne] Ambient particle system — memory activity visualization (#1173)' (#1205) from feat/mnemosyne-ambient-particles into main
Some checks failed
Deploy Nexus / deploy (push) Failing after 5s
Staging Verification Gate / verify-staging (push) Failing after 7s
2026-04-11 01:10:23 +00:00
dc88f1b834 feat(mnemosyne): integrate ambient particle system into Nexus
Some checks failed
CI / test (pull_request) Failing after 8s
CI / validate (pull_request) Failing after 12s
Review Approval Gate / verify-review (pull_request) Failing after 2s
- Import MemoryParticles component
- Init after SpatialMemory, wire onMemoryPlaced callback
- Update in animation loop
- Spawn burst on memory placement (via callback)
- Access trail on crystal click and navigate
- Category colors for all particles
2026-04-11 00:50:43 +00:00
0bf810f1e8 feat: add onMemoryPlaced callback for particle system integration 2026-04-11 00:50:18 +00:00
9561488f8a feat(mnemosyne): ambient particle system for memory activity visualization
Issue #1173
- Spawn burst (20 particles, 2s fade) on new fact stored
- Access trail (10 particles) streaming to crystal on fact access
- Ambient cosmic dust (200 particles, slow drift)
- Category colors for all particles
- Total budget < 500 particles at any time
2026-04-11 00:49:13 +00:00
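The commit above fixes a hard particle budget (total < 500 particles at any time). A minimal sketch of such a spawn guard, written in Python purely for illustration (the real logic lives in `memory-particles.js` and is not shown in this diff):

```python
MAX_PARTICLES = 500  # global cap from the commit message

class ParticleBudget:
    """Track live particle counts so spawns never exceed the global cap."""

    def __init__(self, cap: int = MAX_PARTICLES):
        self.cap = cap
        self.live = 0

    def try_spawn(self, count: int) -> int:
        """Request `count` particles; return how many the budget grants."""
        granted = max(0, min(count, self.cap - self.live))
        self.live += granted
        return granted

    def release(self, count: int) -> None:
        """Return faded-out particles to the budget."""
        self.live = max(0, self.live - count)
```

Under this scheme an ambient layer of 200 leaves 300 for bursts and trails; any spawn that would exceed the cap is clipped rather than rejected outright.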
63435753e2 [claude] Fix mimo swarm worker tool access — add -t terminal,code_execution (#1203) (#1204)
Some checks failed
Deploy Nexus / deploy (push) Failing after 3s
Staging Verification Gate / verify-staging (push) Failing after 4s
2026-04-11 00:40:46 +00:00
c736540fc2 merge: Mnemosyne spatial search
Some checks failed
Deploy Nexus / deploy (push) Failing after 3s
Staging Verification Gate / verify-staging (push) Failing after 3s
Co-authored-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
Co-committed-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
2026-04-11 00:35:29 +00:00
d00adbf6cc merge: Mnemosyne timeline scrubber
Some checks failed
Deploy Nexus / deploy (push) Failing after 3s
Staging Verification Gate / verify-staging (push) Failing after 3s
Co-authored-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
Co-committed-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
2026-04-11 00:35:06 +00:00
7ed9eb75ba merge: Mnemosyne crystal rendering
Some checks failed
Deploy Nexus / deploy (push) Has been cancelled
Staging Verification Gate / verify-staging (push) Has been cancelled
Co-authored-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
Co-committed-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
2026-04-11 00:34:50 +00:00
3886ce8988 fix: remove auto-merge stub
Some checks failed
Deploy Nexus / deploy (push) Failing after 3s
Staging Verification Gate / verify-staging (push) Failing after 4s
2026-04-11 00:32:17 +00:00
19 changed files with 3022 additions and 7 deletions

1
.gitignore vendored
View File

@@ -7,3 +7,4 @@ mempalace/__pycache__/
# Prevent agents from writing to wrong path (see issue #1145)
public/nexus/
test-screenshots/

83
BROWSER_CONTRACT.md Normal file
View File

@@ -0,0 +1,83 @@
# Browser Contract — The Nexus
The minimal set of guarantees a working Nexus browser surface must satisfy.
This is the target the smoke suite validates against.
## 1. Static Assets
The following files MUST exist at the repo root and be serveable:
| File | Purpose |
|-------------------|----------------------------------|
| `index.html` | Entry point HTML shell |
| `app.js` | Main Three.js application |
| `style.css` | Visual styling |
| `portals.json` | Portal registry data |
| `vision.json` | Vision points data |
| `manifest.json` | PWA manifest |
| `gofai_worker.js` | GOFAI web worker |
| `server.py` | WebSocket bridge |
## 2. DOM Contract
The following elements MUST exist after the page loads:
| ID | Type | Purpose |
|-----------------------|----------|------------------------------------|
| `nexus-canvas` | canvas | Three.js render target |
| `loading-screen` | div | Initial loading overlay |
| `hud` | div | Main HUD container |
| `chat-panel` | div | Chat interface panel |
| `chat-input` | input | Chat text input |
| `chat-messages` | div | Chat message history |
| `chat-send` | button | Send message button |
| `chat-toggle` | button | Collapse/expand chat |
| `debug-overlay` | div | Debug info overlay |
| `nav-mode-label` | span | Current navigation mode display |
| `ws-status-dot` | span | Hermes WS connection indicator |
| `hud-location-text` | span | Current location label |
| `portal-hint` | div | Portal proximity hint |
| `spatial-search` | div | Spatial memory search overlay |
| `enter-prompt` | div | Click-to-enter overlay (transient) |
## 3. Three.js Contract
After initialization completes:
- `window` has a THREE renderer created from `#nexus-canvas`
- The canvas has a WebGL rendering context
- `scene` is a `THREE.Scene` with fog
- `camera` is a `THREE.PerspectiveCamera`
- `portals` array is populated from `portals.json`
- At least one portal mesh exists in the scene
- The render loop is running (`requestAnimationFrame` active)
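One way a smoke suite could validate these runtime guarantees is to snapshot the relevant window state (e.g. via Playwright's `page.evaluate`) and check it with a pure function. The snapshot keys below are illustrative assumptions, not the actual `app.js` globals:

```python
def check_three_contract(snapshot: dict) -> list[str]:
    """Return a list of contract violations; empty list means pass.

    `snapshot` is assumed to be a dict of booleans/counts probed from
    the page; the key names here are hypothetical.
    """
    failures = []
    if not snapshot.get("has_renderer"):
        failures.append("no THREE renderer on window")
    if not snapshot.get("has_webgl_context"):
        failures.append("canvas has no WebGL context")
    if not snapshot.get("scene_has_fog"):
        failures.append("scene missing fog")
    if snapshot.get("portal_count", 0) < 1:
        failures.append("no portal meshes in scene")
    if not snapshot.get("raf_active"):
        failures.append("render loop not running")
    return failures
```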
## 4. Loading Contract
1. Page loads → loading screen visible
2. Progress bar fills to 100%
3. Loading screen fades out
4. Enter prompt appears
5. User clicks → enter prompt fades → HUD appears
## 5. Provenance Contract
A validation run MUST prove:
- The served files match a known hash manifest from `Timmy_Foundation/the-nexus` main
- No file is served from `/Users/apayne/the-matrix` or other stale source
- The hash manifest is generated from a clean git checkout
- Screenshot evidence is captured and timestamped
## 6. Data Contract
- `portals.json` MUST parse as valid JSON array
- Each portal MUST have: `id`, `name`, `status`, `destination`
- `vision.json` MUST parse as valid JSON
- `manifest.json` MUST have `name`, `start_url`, `theme_color`
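A minimal checker for the `portals.json` rules above might look like this (a sketch only; the actual assertions live in the smoke suite, which is not shown in this diff):

```python
import json

# Required keys per portal, from the Data Contract
REQUIRED_PORTAL_KEYS = {"id", "name", "status", "destination"}

def validate_portals(raw: str) -> list[str]:
    """Parse portals.json text and return a list of contract violations."""
    try:
        portals = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"portals.json is not valid JSON: {e}"]
    if not isinstance(portals, list):
        return ["portals.json must be a JSON array"]
    errors = []
    for i, portal in enumerate(portals):
        missing = REQUIRED_PORTAL_KEYS - set(portal)
        if missing:
            errors.append(f"portal[{i}] missing: {sorted(missing)}")
    return errors
```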
## 7. WebSocket Contract
- `server.py` starts without error on port 8765
- A browser client can connect to `ws://localhost:8765`
- The connection status indicator reflects connected state
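The listening-port half of this contract can be probed with a plain TCP connect. Note this is only a reachability sketch: it verifies `server.py` is accepting connections on the port but does not complete the WebSocket handshake:

```python
import socket

def ws_port_open(host: str = "localhost", port: int = 8765,
                 timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the bridge port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```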

139
app.js
View File

@@ -7,6 +7,8 @@ import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js'
import { SMAAPass } from 'three/addons/postprocessing/SMAAPass.js';
import { SpatialMemory } from './nexus/components/spatial-memory.js';
import { SessionRooms } from './nexus/components/session-rooms.js';
import { TimelineScrubber } from './nexus/components/timeline-scrubber.js';
import { MemoryParticles } from './nexus/components/memory-particles.js';
// ═══════════════════════════════════════════
// NEXUS v1.1 — Portal System Update
@@ -708,6 +710,9 @@ async function init() {
createWorkshopTerminal();
createAshStorm();
SpatialMemory.init(scene);
MemoryParticles.init(scene);
SpatialMemory.setOnMemoryPlaced(MemoryParticles.onMemoryPlaced);
TimelineScrubber.init(SpatialMemory);
SessionRooms.init(scene, camera, null);
updateLoad(90);
@@ -1913,6 +1918,10 @@ function setupControls() {
const memInfo = SpatialMemory.getMemoryFromMesh(hitMesh);
if (memInfo) {
SpatialMemory.highlightMemory(memInfo.data.id);
// Memory access trail particles
if (camera) {
MemoryParticles.onMemoryAccessed(camera.position, hitMesh.position, memInfo.data.category || memInfo.region || 'working');
}
showMemoryPanel(memInfo, e.clientX, e.clientY);
return;
}
@@ -2768,6 +2777,18 @@ function _positionPanel(panel, clickX, clickY) {
function _navigateToMemory(memId) {
SpatialMemory.highlightMemory(memId);
addChatMessage('system', `Focus: ${memId.replace(/_/g, ' ')}`);
// Access trail particles
const meshes = SpatialMemory.getCrystalMeshes();
for (const mesh of meshes) {
if (mesh.userData && mesh.userData.memId === memId) {
const memInfo = SpatialMemory.getMemoryFromMesh(mesh);
if (memInfo && camera) {
MemoryParticles.onMemoryAccessed(camera.position, mesh.position, memInfo.data.category || memInfo.region || 'working');
}
break;
}
}
@@ -2973,6 +2994,8 @@ function gameLoop() {
// Project Mnemosyne - Memory Orb Animation
if (typeof animateMemoryOrbs === 'function') {
SpatialMemory.update(delta);
MemoryParticles.update(delta);
TimelineScrubber.update();
animateMemoryOrbs(delta);
}
@@ -3517,6 +3540,122 @@ init().then(() => {
// Gravity well clustering — attract related crystals, bake positions (issue #1175)
SpatialMemory.runGravityLayout();
// ═══ SPATIAL SEARCH (Mnemosyne #1170) ═══
(() => {
const input = document.getElementById('spatial-search-input');
const resultsDiv = document.getElementById('spatial-search-results');
if (!input || !resultsDiv) return;
let searchTimeout = null;
let currentMatches = [];
function runSearch(query) {
if (!query.trim()) {
SpatialMemory.clearSearch();
resultsDiv.classList.remove('visible');
resultsDiv.innerHTML = '';
currentMatches = [];
return;
}
const matches = SpatialMemory.searchContent(query);
currentMatches = matches;
if (matches.length === 0) {
SpatialMemory.clearSearch();
resultsDiv.innerHTML = '<div class="spatial-search-count">No matches</div>';
resultsDiv.classList.add('visible');
return;
}
SpatialMemory.highlightSearchResults(matches);
// Build results list
const allMems = SpatialMemory.getAllMemories();
let html = `<div class="spatial-search-count">${matches.length} match${matches.length > 1 ? 'es' : ''}</div>`;
matches.forEach(id => {
const mem = allMems.find(m => m.id === id);
if (mem) {
const label = (mem.content || id).slice(0, 60);
const region = mem.category || '?';
html += `<div class="spatial-search-result-item" data-mem-id="${id}">
<span class="result-region">[${region}]</span>${label}
</div>`;
}
});
resultsDiv.innerHTML = html;
resultsDiv.classList.add('visible');
// Click handler for result items
resultsDiv.querySelectorAll('.spatial-search-result-item').forEach(el => {
el.addEventListener('click', () => {
const memId = el.getAttribute('data-mem-id');
flyToMemory(memId);
});
});
// Fly camera to first match
if (matches.length > 0) {
flyToMemory(matches[0]);
}
}
function flyToMemory(memId) {
const pos = SpatialMemory.getSearchMatchPosition(memId);
if (!pos) return;
// Smooth camera fly-to: place camera above and in front of crystal
const targetPos = new THREE.Vector3(pos.x, pos.y + 4, pos.z + 6);
// Use simple lerp animation over ~800ms
const startPos = playerPos.clone();
const startTime = performance.now();
const duration = 800;
function animateCamera(now) {
const elapsed = now - startTime;
const t = Math.min(1, elapsed / duration);
// Ease out cubic
const ease = 1 - Math.pow(1 - t, 3);
playerPos.lerpVectors(startPos, targetPos, ease);
camera.position.copy(playerPos);
// Look at crystal
const lookTarget = pos.clone();
lookTarget.y += 1.5;
camera.lookAt(lookTarget);
if (t < 1) {
requestAnimationFrame(animateCamera);
} else {
SpatialMemory.highlightMemory(memId);
}
}
requestAnimationFrame(animateCamera);
}
// Debounced input handler
input.addEventListener('input', () => {
clearTimeout(searchTimeout);
searchTimeout = setTimeout(() => runSearch(input.value), 200);
});
// Escape clears search
input.addEventListener('keydown', (e) => {
if (e.key === 'Escape') {
input.value = '';
SpatialMemory.clearSearch();
resultsDiv.classList.remove('visible');
resultsDiv.innerHTML = '';
currentMatches = [];
input.blur();
}
});
})();
// Project Mnemosyne — seed demo session rooms (#1171)
// Sessions group facts by conversation/work session with a timestamp.
const demoSessions = [

Binary file not shown.

69
bin/browser_smoke.sh Executable file
View File

@@ -0,0 +1,69 @@
#!/usr/bin/env bash
# Browser smoke validation runner for The Nexus.
# Runs provenance checks + Playwright browser tests + screenshot capture.
#
# Usage: bash bin/browser_smoke.sh
# Env: NEXUS_TEST_PORT=9876 (default)
set -euo pipefail
REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
cd "$REPO_ROOT"
PORT="${NEXUS_TEST_PORT:-9876}"
SCREENSHOT_DIR="$REPO_ROOT/test-screenshots"
mkdir -p "$SCREENSHOT_DIR"
echo "═══════════════════════════════════════════"
echo " Nexus Browser Smoke Validation"
echo "═══════════════════════════════════════════"
# Step 1: Provenance check
echo ""
echo "[1/4] Provenance check..."
if python3 bin/generate_provenance.py --check; then
echo " ✓ Provenance verified"
else
echo " ✗ Provenance mismatch — files have changed since manifest was generated"
echo " Run: python3 bin/generate_provenance.py to regenerate"
exit 1
fi
# Step 2: Static file contract
echo ""
echo "[2/4] Static file contract..."
MISSING=0
for f in index.html app.js style.css portals.json vision.json manifest.json gofai_worker.js; do
if [ -f "$f" ]; then
echo " ✓ $f"
else
echo " ✗ $f MISSING"
MISSING=1
fi
done
if [ "$MISSING" -eq 1 ]; then
echo " Static file contract FAILED"
exit 1
fi
# Step 3: Browser tests via pytest + Playwright
echo ""
echo "[3/4] Browser tests (Playwright)..."
NEXUS_TEST_PORT=$PORT python3 -m pytest tests/test_browser_smoke.py \
-v --tb=short -x \
-k "not test_screenshot" \
2>&1 | tail -30
# Step 4: Screenshot capture
echo ""
echo "[4/4] Screenshot capture..."
NEXUS_TEST_PORT=$PORT python3 -m pytest tests/test_browser_smoke.py \
-v --tb=short \
-k "test_screenshot" \
2>&1 | tail -15
echo ""
echo "═══════════════════════════════════════════"
echo " Screenshots saved to: $SCREENSHOT_DIR/"
ls -la "$SCREENSHOT_DIR/" 2>/dev/null || echo " (none captured)"
echo "═══════════════════════════════════════════"
echo " Smoke validation complete."

131
bin/generate_provenance.py Executable file
View File

@@ -0,0 +1,131 @@
#!/usr/bin/env python3
"""
Generate a provenance manifest for the Nexus browser surface.
Hashes all frontend files so smoke tests can verify the app comes
from a clean Timmy_Foundation/the-nexus checkout, not stale sources.
Usage:
python bin/generate_provenance.py # writes provenance.json
python bin/generate_provenance.py --check # verify existing manifest matches
"""
import hashlib
import json
import subprocess
import sys
import os
from datetime import datetime, timezone
from pathlib import Path
# Files that constitute the browser-facing contract
CONTRACT_FILES = [
"index.html",
"app.js",
"style.css",
"gofai_worker.js",
"server.py",
"portals.json",
"vision.json",
"manifest.json",
]
# Component files imported by app.js
COMPONENT_FILES = [
"nexus/components/spatial-memory.js",
"nexus/components/session-rooms.js",
"nexus/components/timeline-scrubber.js",
"nexus/components/memory-particles.js",
]
ALL_FILES = CONTRACT_FILES + COMPONENT_FILES
def sha256_file(path: Path) -> str:
h = hashlib.sha256()
h.update(path.read_bytes())
return h.hexdigest()
def get_git_info(repo_root: Path) -> dict:
"""Capture git state for provenance."""
def git(*args):
try:
r = subprocess.run(
["git", *args],
cwd=repo_root,
capture_output=True, text=True, timeout=10,
)
return r.stdout.strip() if r.returncode == 0 else None
except Exception:
return None
return {
"commit": git("rev-parse", "HEAD"),
"branch": git("rev-parse", "--abbrev-ref", "HEAD"),
"remote": git("remote", "get-url", "origin"),
"dirty": git("status", "--porcelain") != "",
}
def generate_manifest(repo_root: Path) -> dict:
files = {}
missing = []
for rel in ALL_FILES:
p = repo_root / rel
if p.exists():
files[rel] = {
"sha256": sha256_file(p),
"size": p.stat().st_size,
}
else:
missing.append(rel)
return {
"generated_at": datetime.now(timezone.utc).isoformat(),
"repo": "Timmy_Foundation/the-nexus",
"git": get_git_info(repo_root),
"files": files,
"missing": missing,
"file_count": len(files),
}
def check_manifest(repo_root: Path, existing: dict) -> tuple[bool, list[str]]:
"""Check if current files match the stored manifest. Returns (ok, mismatches)."""
mismatches = []
for rel, expected in existing.get("files", {}).items():
p = repo_root / rel
if not p.exists():
mismatches.append(f"MISSING: {rel}")
elif sha256_file(p) != expected["sha256"]:
mismatches.append(f"CHANGED: {rel}")
return (len(mismatches) == 0, mismatches)
def main():
repo_root = Path(__file__).resolve().parent.parent
manifest_path = repo_root / "provenance.json"
if "--check" in sys.argv:
if not manifest_path.exists():
print("FAIL: provenance.json does not exist")
sys.exit(1)
existing = json.loads(manifest_path.read_text())
ok, mismatches = check_manifest(repo_root, existing)
if ok:
print(f"OK: All {len(existing['files'])} files match provenance manifest")
sys.exit(0)
else:
print(f"FAIL: {len(mismatches)} file(s) differ:")
for m in mismatches:
print(f" {m}")
sys.exit(1)
manifest = generate_manifest(repo_root)
manifest_path.write_text(json.dumps(manifest, indent=2) + "\n")
print(f"Wrote provenance.json: {manifest['file_count']} files hashed")
if manifest["missing"]:
print(f" Missing (not yet created): {', '.join(manifest['missing'])}")
if __name__ == "__main__":
main()

View File

@@ -66,6 +66,14 @@
</div>
</div>
<!-- Spatial Search Overlay (Mnemosyne #1170) -->
<div id="spatial-search" class="spatial-search-overlay">
<input type="text" id="spatial-search-input" class="spatial-search-input"
placeholder="🔍 Search memories..." autocomplete="off" spellcheck="false">
<div id="spatial-search-results" class="spatial-search-results"></div>
</div>
<!-- HUD Overlay -->
<div id="hud" class="game-ui" style="display:none;">
<!-- GOFAI HUD Panels -->

142
mimo-swarm/scripts/auto-merger.py Executable file
View File

@@ -0,0 +1,142 @@
#!/usr/bin/env python3
"""
Auto-Merger — merges approved PRs via squash merge.
Checks:
1. PR has at least 1 approval review
2. PR is mergeable
3. No pending change requests
4. From mimo swarm (safety: only auto-merge mimo PRs)
Squash merges, closes issue, cleans up branch.
"""
import json
import os
import urllib.request
import urllib.error
from datetime import datetime, timezone
GITEA_URL = "https://forge.alexanderwhitestone.com"
TOKEN_FILE = os.path.expanduser("~/.config/gitea/token")
LOG_DIR = os.path.expanduser("~/.hermes/mimo-swarm/logs")
REPO = "Timmy_Foundation/the-nexus"
def load_token():
with open(TOKEN_FILE) as f:
return f.read().strip()
def api_get(path, token):
url = f"{GITEA_URL}/api/v1{path}"
req = urllib.request.Request(url, headers={
"Authorization": f"token {token}",
"Accept": "application/json",
})
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read())
except Exception:
return None
def api_post(path, token, data=None):
url = f"{GITEA_URL}/api/v1{path}"
body = json.dumps(data or {}).encode()
req = urllib.request.Request(url, data=body, headers={
"Authorization": f"token {token}",
"Content-Type": "application/json",
}, method="POST")
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return resp.status, resp.read().decode()
except urllib.error.HTTPError as e:
return e.code, e.read().decode() if e.fp else ""
def api_delete(path, token):
url = f"{GITEA_URL}/api/v1{path}"
req = urllib.request.Request(url, headers={
"Authorization": f"token {token}",
}, method="DELETE")
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return resp.status
except Exception:
return 500
def log(msg):
ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(f"[{ts}] {msg}")
log_file = os.path.join(LOG_DIR, f"merger-{datetime.now().strftime('%Y%m%d')}.log")
with open(log_file, "a") as f:
f.write(f"[{ts}] {msg}\n")
def main():
token = load_token()
log("=" * 50)
log("AUTO-MERGER — checking approved PRs")
prs = api_get(f"/repos/{REPO}/pulls?state=open&limit=20", token)
if not prs:
log("No open PRs")
return
merged = 0
skipped = 0
for pr in prs:
pr_num = pr["number"]
head_ref = pr.get("head", {}).get("ref", "")
body = pr.get("body", "") or ""
mergeable = pr.get("mergeable", False)
# Only auto-merge mimo PRs
is_mimo = "mimo" in head_ref.lower() or "Automated by mimo" in body
if not is_mimo:
continue
# Check reviews
reviews = api_get(f"/repos/{REPO}/pulls/{pr_num}/reviews", token) or []
approvals = [r for r in reviews if r.get("state") == "APPROVED"]
changes_requested = [r for r in reviews if r.get("state") == "CHANGES_REQUESTED"]
if changes_requested:
log(f" SKIP #{pr_num}: has change requests")
skipped += 1
continue
if not approvals:
log(f" SKIP #{pr_num}: no approvals yet")
skipped += 1
continue
# Attempt squash merge
merge_title = pr["title"]
status, response = api_post(f"/repos/{REPO}/pulls/{pr_num}/merge", token, {
"Do": "squash",
"MergeTitleField": merge_title,
"MergeMessageField": f"Closes #{pr_num}\n\nAutomated merge by mimo swarm.",
})
if status == 200:
merged += 1
log(f" MERGED #{pr_num}: {merge_title[:50]}")
# Delete the branch
if head_ref and head_ref != "main":
api_delete(f"/repos/{REPO}/git/refs/heads/{head_ref}", token)
log(f" Deleted branch: {head_ref}")
else:
log(f" MERGE FAILED #{pr_num}: status={status}, {response[:200]}")
log(f"Merge complete: {merged} merged, {skipped} skipped")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,232 @@
#!/usr/bin/env python3
"""
Auto-Reviewer — reviews open PRs, approves clean ones, rejects bad ones.
Checks:
1. Diff size (not too big, not empty)
2. No merge conflicts
3. No secrets
4. References the linked issue
5. Has meaningful changes (not just whitespace)
6. Files changed are in expected locations
Approves clean PRs via Gitea API.
Comments on bad PRs with specific feedback.
"""
import json
import os
import re
import urllib.request
import urllib.error
import base64
import subprocess
from datetime import datetime, timezone
GITEA_URL = "https://forge.alexanderwhitestone.com"
TOKEN_FILE = os.path.expanduser("~/.config/gitea/token")
STATE_DIR = os.path.expanduser("~/.hermes/mimo-swarm/state")
LOG_DIR = os.path.expanduser("~/.hermes/mimo-swarm/logs")
REPO = "Timmy_Foundation/the-nexus"
# Review thresholds
MAX_DIFF_LINES = 500
MIN_DIFF_LINES = 1
def load_token():
with open(TOKEN_FILE) as f:
return f.read().strip()
def api_get(path, token):
url = f"{GITEA_URL}/api/v1{path}"
req = urllib.request.Request(url, headers={
"Authorization": f"token {token}",
"Accept": "application/json",
})
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read())
except Exception:
return None
def api_post(path, token, data):
url = f"{GITEA_URL}/api/v1{path}"
body = json.dumps(data).encode()
req = urllib.request.Request(url, data=body, headers={
"Authorization": f"token {token}",
"Content-Type": "application/json",
}, method="POST")
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read())
except Exception as e:
return {"error": str(e)}
def log(msg):
ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(f"[{ts}] {msg}")
log_file = os.path.join(LOG_DIR, f"reviewer-{datetime.now().strftime('%Y%m%d')}.log")
with open(log_file, "a") as f:
f.write(f"[{ts}] {msg}\n")
def get_pr_diff(repo, pr_num, token):
"""Get PR diff content."""
url = f"{GITEA_URL}/api/v1/repos/{repo}/pulls/{pr_num}.diff"
req = urllib.request.Request(url, headers={"Authorization": f"token {token}"})
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return resp.read().decode()
except Exception:
return ""
def get_pr_files(repo, pr_num, token):
"""Get list of files changed in PR."""
files = []
page = 1
while True:
data = api_get(f"/repos/{repo}/pulls/{pr_num}/files?limit=50&page={page}", token)
if not data:
break
files.extend(data)
if len(data) < 50:
break
page += 1
return files
def get_pr_reviews(repo, pr_num, token):
"""Get existing reviews on PR."""
return api_get(f"/repos/{repo}/pulls/{pr_num}/reviews", token) or []
def review_pr(pr, token):
"""Review a single PR. Returns (approved: bool, comment: str)."""
pr_num = pr["number"]
title = pr.get("title", "")
body = pr.get("body", "") or ""
head_ref = pr.get("head", {}).get("ref", "")
issues = []
# 1. Check diff
diff = get_pr_diff(REPO, pr_num, token)
diff_lines = len([l for l in diff.split("\n") if l.startswith("+") and not l.startswith("+++")])
if diff_lines == 0:
issues.append("Empty diff — no actual changes")
elif diff_lines > MAX_DIFF_LINES:
issues.append(f"Diff too large ({diff_lines} lines) — may be too complex for automated review")
# 2. Check for merge conflicts
if "<<<<<<<<" in diff or "========" in diff.split("@@")[-1] if "@@" in diff else False:
issues.append("Merge conflict markers detected")
# 3. Check for secrets
secret_patterns = [
(r'sk-[a-zA-Z0-9]{20,}', "API key"),
(r'api_key\s*=\s*["\'][a-zA-Z0-9]{10,}', "API key assignment"),
(r'password\s*=\s*["\'][^\s"\']{8,}', "Hardcoded password"),
]
for pattern, name in secret_patterns:
if re.search(pattern, diff):
issues.append(f"Potential {name} leaked in diff")
# 4. Check issue reference
if f"#{pr_num}" not in body and "Closes #" not in body and "Fixes #" not in body:
# Check if the branch name references an issue
if not re.search(r'issue-\d+', head_ref):
issues.append("PR does not reference an issue number")
# 5. Check files changed
files = get_pr_files(REPO, pr_num, token)
if not files:
issues.append("No files changed")
# 6. Check if it's from a mimo worker
is_mimo = "mimo" in head_ref.lower() or "Automated by mimo" in body
# 7. Check for destructive changes
for f in files:
if f.get("status") == "removed" and f.get("filename", "").endswith((".js", ".html", ".py")):
issues.append(f"File deleted: {f['filename']} — verify this is intentional")
# Decision
if issues:
comment = f"## Auto-Review: CHANGES REQUESTED\n\n"
comment += f"**Diff:** {diff_lines} lines across {len(files)} files\n\n"
comment += "**Issues found:**\n"
for issue in issues:
comment += f"- {issue}\n"
comment += "\nPlease address these issues and update the PR."
return False, comment
else:
comment = f"## Auto-Review: APPROVED\n\n"
comment += f"**Diff:** {diff_lines} lines across {len(files)} files\n"
comment += f"**Checks passed:** syntax, security, issue reference, diff size\n"
comment += f"**Source:** {'mimo-v2-pro swarm' if is_mimo else 'manual'}\n"
return True, comment
def main():
token = load_token()
log("=" * 50)
log("AUTO-REVIEWER — scanning open PRs")
# Get open PRs
prs = api_get(f"/repos/{REPO}/pulls?state=open&limit=20", token)
if not prs:
log("No open PRs")
return
approved = 0
rejected = 0
for pr in prs:
pr_num = pr["number"]
author = pr["user"]["login"]
# Skip PRs by humans (only auto-review mimo PRs)
head_ref = pr.get("head", {}).get("ref", "")
body = pr.get("body", "") or ""
is_mimo = "mimo" in head_ref.lower() or "Automated by mimo" in body
if not is_mimo:
log(f" SKIP #{pr_num} (human PR by {author})")
continue
# Check if already reviewed
reviews = get_pr_reviews(REPO, pr_num, token)
already_reviewed = any(r.get("user", {}).get("login") == "Rockachopa" for r in reviews)
if already_reviewed:
log(f" SKIP #{pr_num} (already reviewed)")
continue
# Review
is_approved, comment = review_pr(pr, token)
# Post review
review_event = "APPROVE" if is_approved else "REQUEST_CHANGES"
result = api_post(f"/repos/{REPO}/pulls/{pr_num}/reviews", token, {
"event": review_event,
"body": comment,
})
if is_approved:
approved += 1
log(f" APPROVED #{pr_num}: {pr['title'][:50]}")
else:
rejected += 1
log(f" REJECTED #{pr_num}: {pr['title'][:50]}")
log(f"Review complete: {approved} approved, {rejected} rejected, {len(prs)} total")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,533 @@
#!/usr/bin/env python3
"""
Mimo Swarm Dispatcher — The Brain
Scans Gitea for open issues, claims them atomically via labels,
routes to lanes, and spawns one-shot mimo-v2-pro workers.
No new issues created. No duplicate claims. No bloat.
"""
import json
import os
import sys
import time
import subprocess
import urllib.request
import urllib.error
from datetime import datetime, timezone, timedelta
# ── Config ──────────────────────────────────────────────────────────────
GITEA_URL = "https://forge.alexanderwhitestone.com"
TOKEN_FILE = os.path.expanduser("~/.config/gitea/token")
STATE_DIR = os.path.expanduser("~/.hermes/mimo-swarm/state")
LOG_DIR = os.path.expanduser("~/.hermes/mimo-swarm/logs")
WORKER_SCRIPT = os.path.expanduser("~/.hermes/mimo-swarm/scripts/mimo-worker.sh")
# FOCUS MODE: all workers on ONE repo, deep polish
FOCUS_MODE = True
FOCUS_REPO = "Timmy_Foundation/the-nexus"
FOCUS_BUILD_CMD = "npm run build" # validation command before PR
FOCUS_BUILD_DIR = None # set to repo root after clone, auto-detected
# Lane caps (in focus mode, all lanes get more)
if FOCUS_MODE:
MAX_WORKERS_PER_LANE = {"CODE": 15, "BUILD": 8, "RESEARCH": 5, "CREATE": 7}
else:
MAX_WORKERS_PER_LANE = {"CODE": 10, "BUILD": 5, "RESEARCH": 5, "CREATE": 5}
CLAIM_TIMEOUT_MINUTES = 30
CLAIM_LABEL = "mimo-claimed"
CLAIM_COMMENT = "/claim"
DONE_COMMENT = "/done"
ABANDON_COMMENT = "/abandon"
# Lane detection from issue labels
LANE_MAP = {
"CODE": ["bug", "fix", "defect", "error", "harness", "config", "ci", "devops",
"critical", "p0", "p1", "backend", "api", "integration", "refactor"],
"BUILD": ["feature", "enhancement", "build", "ui", "frontend", "game", "tool",
"project", "deploy", "infrastructure"],
"RESEARCH": ["research", "investigate", "spike", "audit", "analysis", "study",
"benchmark", "evaluate", "explore"],
"CREATE": ["content", "creative", "write", "docs", "documentation", "story",
"narrative", "design", "art", "media"],
}
# Priority repos (serve first) — ordered by backlog richness
PRIORITY_REPOS = [
"Timmy_Foundation/the-nexus",
"Timmy_Foundation/hermes-agent",
"Timmy_Foundation/timmy-home",
"Timmy_Foundation/timmy-config",
"Timmy_Foundation/the-beacon",
"Timmy_Foundation/the-testament",
"Rockachopa/hermes-config",
"Timmy/claw-agent",
"replit/timmy-tower",
"Timmy_Foundation/fleet-ops",
"Timmy_Foundation/forge-log",
]
# Priority tags — issues with these labels get served FIRST regardless of lane
PRIORITY_TAGS = ["mnemosyne", "p0", "p1", "critical"]
# ── Helpers ─────────────────────────────────────────────────────────────
def load_token():
with open(TOKEN_FILE) as f:
return f.read().strip()
def api_get(path, token):
"""GET request to Gitea API."""
url = f"{GITEA_URL}/api/v1{path}"
req = urllib.request.Request(url, headers={
"Authorization": f"token {token}",
"Accept": "application/json",
})
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read())
except urllib.error.HTTPError as e:
if e.code == 404:
return None
raise
def api_post(path, token, data):
"""POST request to Gitea API."""
url = f"{GITEA_URL}/api/v1{path}"
body = json.dumps(data).encode()
req = urllib.request.Request(url, data=body, headers={
"Authorization": f"token {token}",
"Content-Type": "application/json",
}, method="POST")
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read())
except urllib.error.HTTPError as e:
body = e.read().decode() if e.fp else ""
log(f" API error {e.code}: {body[:200]}")
return None
def api_delete(path, token):
"""DELETE request to Gitea API."""
url = f"{GITEA_URL}/api/v1{path}"
req = urllib.request.Request(url, headers={
"Authorization": f"token {token}",
}, method="DELETE")
try:
with urllib.request.urlopen(req, timeout=30) as resp:
return resp.status
except urllib.error.HTTPError as e:
return e.code
def log(msg):
ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
line = f"[{ts}] {msg}"
print(line)
log_file = os.path.join(LOG_DIR, f"dispatcher-{datetime.now().strftime('%Y%m%d')}.log")
with open(log_file, "a") as f:
f.write(line + "\n")
def load_state():
"""Load dispatcher state (active claims)."""
state_file = os.path.join(STATE_DIR, "dispatcher.json")
if os.path.exists(state_file):
with open(state_file) as f:
return json.load(f)
return {"active_claims": {}, "stats": {"total_dispatched": 0, "total_released": 0, "total_prs": 0}}
def save_state(state):
state_file = os.path.join(STATE_DIR, "dispatcher.json")
with open(state_file, "w") as f:
json.dump(state, f, indent=2)
# ── Issue Analysis ──────────────────────────────────────────────────────
def get_repos(token):
"""Get all accessible repos (excluding archived)."""
repos = []
page = 1
while True:
data = api_get(f"/repos/search?limit=50&page={page}&sort=updated", token)
if not data or not data.get("data"):
break
# Filter out archived repos
active = [r for r in data["data"] if not r.get("archived", False)]
repos.extend(active)
page += 1
if len(data["data"]) < 50:
break
return repos
def get_open_issues(repo_full_name, token):
"""Get open issues for a repo (not PRs)."""
issues = []
page = 1
while True:
data = api_get(f"/repos/{repo_full_name}/issues?state=open&limit=50&page={page}", token)
if not data:
break
# Filter out pull requests
real_issues = [i for i in data if not i.get("pull_request")]
issues.extend(real_issues)
page += 1
if len(data) < 50:
break
return issues
# Pre-fetched PR references (set by dispatch function before loop)
_PR_REFS = set()
_CLAIMED_COMMENTS = set()
def prefetch_pr_refs(repo_name, token):
"""Fetch all open PRs once and build a set of issue numbers they reference."""
global _PR_REFS
_PR_REFS = set()
prs = api_get(f"/repos/{repo_name}/pulls?state=open&limit=100", token)
if prs:
import re
for pr in prs:
body = pr.get("body", "") or ""
head = pr.get("head", {}).get("ref", "")
# Extract issue numbers from body (Closes #NNN) and branch (issue-NNN)
for match in re.finditer(r'#(\d+)', body):
_PR_REFS.add(int(match.group(1)))
for match in re.finditer(r'issue-(\d+)', head):
_PR_REFS.add(int(match.group(1)))
def is_claimed(issue, repo_name, token):
"""Check if issue is claimed (has mimo-claimed label or existing PR). NO extra API calls."""
labels = [l["name"] for l in issue.get("labels", [])]
if CLAIM_LABEL in labels:
return True
# Check pre-fetched PR refs (no API call)
if issue["number"] in _PR_REFS:
return True
# Skip comment check for speed — label is the primary mechanism
return False
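The prefetch builds its set from two regexes: issue numbers in the PR body (`#NNN`) and in the head branch name (`issue-NNN`). A standalone sketch of that extraction (sample body and branch are made up):

```python
import re

def extract_refs(body, head):
    """Collect the issue numbers a PR references, mirroring prefetch_pr_refs()."""
    refs = set()
    for match in re.finditer(r'#(\d+)', body or ""):
        refs.add(int(match.group(1)))
    for match in re.finditer(r'issue-(\d+)', head or ""):
        refs.add(int(match.group(1)))
    return refs

refs = extract_refs("Closes #42\n\nAlso touches #7", "mimo/code/issue-13")
```

Because `#NNN` matches any issue mention, a PR that merely discusses an issue will mark it claimed; that trade-off favors never double-dispatching over occasionally skipping work.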
def priority_score(issue):
"""Score an issue's priority. Higher = serve first."""
score = 0
labels = [l["name"].lower() for l in issue.get("labels", [])]
title = issue.get("title", "").lower()
# Mnemosyne gets absolute priority — check title AND labels
if "mnemosyne" in title or any("mnemosyne" in l for l in labels):
score += 300
# Priority tags boost
for tag in PRIORITY_TAGS:
if tag in labels or f"[{tag}]" in title:
score += 100
# Older issues get slight boost (clear backlog)
created = issue.get("created_at", "")
if created:
try:
created_dt = datetime.fromisoformat(created.replace("Z", "+00:00"))
age_days = (datetime.now(timezone.utc) - created_dt).days
score += min(age_days, 30) # Cap at 30 days
except (ValueError, TypeError):
pass
return score
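The scoring rules above are additive: a hypothetical Mnemosyne-titled issue carrying one priority tag, opened 10 days ago, scores 300 + 100 + 10 = 410. A reimplementation of the same rules with `PRIORITY_TAGS` stubbed to a one-element list (the real constant is defined elsewhere in the file):

```python
from datetime import datetime, timedelta, timezone

PRIORITY_TAGS = ["urgent"]  # stand-in for the module-level constant

def priority_score(issue, now=None):
    # Same rules as the dispatcher: Mnemosyne +300, priority tag +100, age up to +30
    now = now or datetime.now(timezone.utc)
    score = 0
    labels = [l["name"].lower() for l in issue.get("labels", [])]
    title = issue.get("title", "").lower()
    if "mnemosyne" in title or any("mnemosyne" in l for l in labels):
        score += 300
    for tag in PRIORITY_TAGS:
        if tag in labels or f"[{tag}]" in title:
            score += 100
    created = issue.get("created_at", "")
    if created:
        created_dt = datetime.fromisoformat(created.replace("Z", "+00:00"))
        score += min((now - created_dt).days, 30)  # backlog boost, capped at 30
    return score

now = datetime(2026, 4, 11, tzinfo=timezone.utc)
issue = {
    "title": "[Mnemosyne] spatial search",
    "labels": [{"name": "urgent"}],
    "created_at": (now - timedelta(days=10)).isoformat().replace("+00:00", "Z"),
}
score = priority_score(issue, now=now)
```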
def detect_lane(issue):
"""Detect which lane an issue belongs to based on labels."""
labels = [l["name"].lower() for l in issue.get("labels", [])]
for lane, keywords in LANE_MAP.items():
for label in labels:
if label in keywords:
return lane
# Check title for keywords
title = issue.get("title", "").lower()
for lane, keywords in LANE_MAP.items():
for kw in keywords:
if kw in title:
return lane
return "CODE" # Default
def count_active_in_lane(state, lane):
"""Count currently active workers in a lane."""
count = 0
for claim in state["active_claims"].values():
if claim.get("lane") == lane:
count += 1
return count
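Lane gating in the dispatch loop is just this count compared against `MAX_WORKERS_PER_LANE`. A sketch under assumed capacities (the real map is defined elsewhere in the file; the values here are illustrative):

```python
MAX_WORKERS_PER_LANE = {"CODE": 2, "BUILD": 2, "RESEARCH": 1, "CREATE": 1}  # illustrative caps

def count_active_in_lane(state, lane):
    # Same logic as the dispatcher, written as a comprehension
    return sum(1 for claim in state["active_claims"].values() if claim.get("lane") == lane)

def lane_has_capacity(state, lane):
    # Unknown lanes default to a cap of 1, matching the dispatch loop
    return count_active_in_lane(state, lane) < MAX_WORKERS_PER_LANE.get(lane, 1)

state = {"active_claims": {
    "repo#1": {"lane": "CODE"},
    "repo#2": {"lane": "CODE"},
    "repo#3": {"lane": "RESEARCH"},
}}
```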
# ── Claiming ────────────────────────────────────────────────────────────
def claim_issue(issue, repo_name, lane, token):
"""Claim an issue: add label + comment."""
repo = repo_name
num = issue["number"]
# Add mimo-claimed label
api_post(f"/repos/{repo}/issues/{num}/labels", token, {"labels": [CLAIM_LABEL]})
# Add /claim comment
comment_body = f"/claim — mimo-v2-pro [{lane}] lane. Branch: `mimo/{lane.lower()}/issue-{num}`"
api_post(f"/repos/{repo}/issues/{num}/comments", token, {"body": comment_body})
log(f" CLAIMED #{num} in {repo} [{lane}]")
def release_issue(issue, repo_name, reason, token):
"""Release a claim: remove label, add /done or /abandon comment."""
repo = repo_name
num = issue["number"]
# Remove mimo-claimed label
labels = [l["name"] for l in issue.get("labels", [])]
if CLAIM_LABEL in labels:
api_delete(f"/repos/{repo}/issues/{num}/labels/{CLAIM_LABEL}", token)
# Add completion comment
comment = f"{ABANDON_COMMENT}{reason}" if reason != "done" else f"{DONE_COMMENT} — completed by mimo-v2-pro"
api_post(f"/repos/{repo}/issues/{num}/comments", token, {"body": comment})
log(f" RELEASED #{num} in {repo}: {reason}")
# ── Worker Spawning ─────────────────────────────────────────────────────
def spawn_worker(issue, repo_name, lane, token):
"""Spawn a one-shot mimo worker for an issue."""
repo = repo_name
num = issue["number"]
title = issue["title"]
body = (issue.get("body") or "")[:2000] # Truncate long bodies; body can be null in the API response
labels = [l["name"] for l in issue.get("labels", [])]
# Build worker prompt
worker_id = f"mimo-{lane.lower()}-{num}-{int(time.time())}"
prompt = build_worker_prompt(repo, num, title, body, labels, lane, worker_id)
# Write prompt to temp file for the cron job to pick up
prompt_file = os.path.join(STATE_DIR, f"prompt-{worker_id}.txt")
with open(prompt_file, "w") as f:
f.write(prompt)
log(f" SPAWNING worker {worker_id} for #{num} [{lane}]")
return worker_id
def build_worker_prompt(repo, num, title, body, labels, lane, worker_id):
"""Build the prompt for a mimo worker. Focus-mode aware with build validation."""
lane_instructions = {
"CODE": """You are a coding worker. Fix bugs, implement features, refactor code.
- Read existing code BEFORE writing anything
- Match the code style of the file you're editing
- If Three.js code: use the existing patterns in the codebase
- If config/infra: be precise, check existing values first""",
"BUILD": """You are a builder. Create new functionality, UI components, tools.
- Study the existing architecture before building
- Create complete, working implementations — no stubs
- For UI: match the existing visual style
- For APIs: follow the existing route patterns""",
"RESEARCH": """You are a researcher. Investigate the issue thoroughly.
- Read all relevant code and documentation
- Document findings in a markdown file: FINDINGS-issue-{num}.md
- Include: what you found, what's broken, recommended fix, effort estimate
- Create a summary PR with the findings document""",
"CREATE": """You are a creative worker. Write content, documentation, design.
- Quality over quantity — one excellent asset beats five mediocre ones
- Match the existing tone and style of the project
- For docs: include code examples where relevant""",
}
clone_url = f"{GITEA_URL}/{repo}.git"
branch = f"mimo/{lane.lower()}/issue-{num}"
focus_section = ""
if FOCUS_MODE and repo == FOCUS_REPO:
focus_section = f"""
## FOCUS MODE — THIS IS THE NEXUS
The Nexus is a Three.js 3D world — Timmy's sovereign home on the web.
Tech stack: vanilla JS, Three.js, WebSocket, HTML/CSS.
Entry point: app.js (root) or public/nexus/app.js
The world features: nebula skybox, portals, memory crystals, batcave terminal.
IMPORTANT: After implementing, you MUST validate:
1. cd /tmp/{worker_id}
2. Check for syntax errors: node --check *.js (if JS files changed)
3. If package.json exists: npm install --legacy-peer-deps && npm run build
4. If build fails: FIX IT before pushing. No broken builds.
5. If no build command exists: just validate syntax on changed files
"""
return f"""You are a mimo-v2-pro swarm worker. {lane_instructions.get(lane, lane_instructions["CODE"])}
## ISSUE
Repository: {repo}
Issue: #{num}
Title: {title}
Labels: {', '.join(labels)}
Description:
{body}
{focus_section}
## WORKFLOW
1. Clone: git clone {clone_url} /tmp/{worker_id} 2>/dev/null || (cd /tmp/{worker_id} && git fetch origin && git checkout main && git pull)
2. cd /tmp/{worker_id}
3. Create branch: git checkout -b {branch}
4. READ THE CODE. Understand the architecture before writing anything.
5. Implement the fix/feature/solution.
6. BUILD VALIDATION:
- Syntax check: node --check <file>.js for any JS changed
- If package.json exists: npm install --legacy-peer-deps 2>/dev/null && npm run build 2>&1
- If build fails: FIX THE BUILD. No broken PRs.
- Ensure git diff shows meaningful changes (>0 lines)
7. Commit: git add -A && git commit -m "fix: {title} (closes #{num})"
8. Push: git push origin {branch}
9. Create PR via API:
curl -s -X POST '{GITEA_URL}/api/v1/repos/{repo}/pulls' \\
-H "Authorization: token $(cat ~/.config/gitea/token)" \\
-H 'Content-Type: application/json' \\
-d '{{"title":"fix: {title}","head":"{branch}","base":"main","body":"Closes #{num}\\n\\nAutomated by mimo-v2-pro swarm.\\n\\n## Changes\\n- [describe what you changed]\\n\\n## Validation\\n- [x] Syntax check passed\\n- [x] Build passes (if applicable)"}}'
## HARD RULES
- NEVER exit without committing. Even partial progress must be committed.
- NEVER create new issues. Only work on issue #{num}.
- NEVER push to main. Only push to your branch.
- NEVER push a broken build. Fix it or abandon with clear notes.
- If too complex: commit WIP, push, PR body says "WIP — needs human review"
- If build fails and you can't fix: commit anyway, push, PR body says "Build failed — needs human fix"
Worker: {worker_id}
"""
# ── Main ────────────────────────────────────────────────────────────────
def dispatch(token):
"""Main dispatch loop."""
state = load_state()
dispatched = 0
log("=" * 60)
log("MIMO DISPATCHER — scanning for work")
# Clean stale claims first
stale = []
for claim_id, claim in list(state["active_claims"].items()):
started = datetime.fromisoformat(claim["started"])
age = datetime.now(timezone.utc) - started
if age > timedelta(minutes=CLAIM_TIMEOUT_MINUTES):
stale.append(claim_id)
for claim_id in stale:
claim = state["active_claims"].pop(claim_id)
log(f" EXPIRED claim: {claim['repo']}#{claim['issue']} [{claim['lane']}]")
state["stats"]["total_released"] += 1
# Prefetch PR refs once (avoids N API calls in is_claimed).
# Note: only the target repo's PRs are fetched; in firehose mode, issues
# in other repos are screened by the claim label alone.
target_repo = FOCUS_REPO if FOCUS_MODE else PRIORITY_REPOS[0]
prefetch_pr_refs(target_repo, token)
log(f" Prefetched {len(_PR_REFS)} PR references")
# FOCUS MODE: scan only the focus repo. FIREHOSE: scan all.
if FOCUS_MODE:
ordered = [FOCUS_REPO]
log(f" FOCUS MODE: targeting {FOCUS_REPO} only")
else:
repos = get_repos(token)
repo_names = [r["full_name"] for r in repos]
ordered = []
for pr in PRIORITY_REPOS:
if pr in repo_names:
ordered.append(pr)
for rn in repo_names:
if rn not in ordered:
ordered.append(rn)
# Scan each repo and collect all issues for priority sorting
all_issues = []
for repo_name in ordered[:20 if not FOCUS_MODE else 1]:
issues = get_open_issues(repo_name, token)
for issue in issues:
issue["_repo_name"] = repo_name # Tag with repo
all_issues.append(issue)
# Sort by priority score (highest first)
all_issues.sort(key=priority_score, reverse=True)
for issue in all_issues:
repo_name = issue["_repo_name"]
# Skip if already claimed in state
claim_key = f"{repo_name}#{issue['number']}"
if claim_key in state["active_claims"]:
continue
# Skip if claimed in Gitea
if is_claimed(issue, repo_name, token):
continue
# Detect lane
lane = detect_lane(issue)
# Check lane capacity
active_in_lane = count_active_in_lane(state, lane)
max_in_lane = MAX_WORKERS_PER_LANE.get(lane, 1)
if active_in_lane >= max_in_lane:
continue # Lane full, skip
# Claim and spawn
claim_issue(issue, repo_name, lane, token)
worker_id = spawn_worker(issue, repo_name, lane, token)
state["active_claims"][claim_key] = {
"repo": repo_name,
"issue": issue["number"],
"lane": lane,
"worker_id": worker_id,
"started": datetime.now(timezone.utc).isoformat(),
}
state["stats"]["total_dispatched"] += 1
dispatched += 1
max_dispatch = 35 if FOCUS_MODE else 25
if dispatched >= max_dispatch:
break
save_state(state)
# Summary
active = len(state["active_claims"])
log(f"Dispatch complete: {dispatched} new, {active} active, {state['stats']['total_dispatched']} total dispatched")
log(f"Active by lane: CODE={count_active_in_lane(state,'CODE')}, BUILD={count_active_in_lane(state,'BUILD')}, RESEARCH={count_active_in_lane(state,'RESEARCH')}, CREATE={count_active_in_lane(state,'CREATE')}")
return dispatched
if __name__ == "__main__":
token = load_token()
dispatched = dispatch(token)
sys.exit(0 if dispatched >= 0 else 1)

157
mimo-swarm/scripts/mimo-worker.sh Executable file

@@ -0,0 +1,157 @@
#!/bin/bash
# Mimo Swarm Worker — One-shot execution
# Receives a prompt file, runs mimo-v2-pro via hermes, handles the git workflow.
#
# Usage: mimo-worker.sh <prompt_file>
# The prompt file contains all instructions for the worker.
set -euo pipefail
PROMPT_FILE="${1:?Usage: mimo-worker.sh <prompt_file>}"
WORKER_ID=$(basename "$PROMPT_FILE" .txt | sed 's/prompt-//')
LOG_DIR="$HOME/.hermes/mimo-swarm/logs"
LOG_FILE="$LOG_DIR/worker-${WORKER_ID}.log"
STATE_DIR="$HOME/.hermes/mimo-swarm/state"
GITEA_URL="https://forge.alexanderwhitestone.com"
TOKEN=$(cat "$HOME/.config/gitea/token")
log() {
echo "[$(date -u +%Y-%m-%dT%H:%M:%SZ)] $*" | tee -a "$LOG_FILE"
}
# Read the prompt
if [ ! -f "$PROMPT_FILE" ]; then
log "ERROR: Prompt file not found: $PROMPT_FILE"
exit 1
fi
PROMPT=$(cat "$PROMPT_FILE")
log "WORKER START: $WORKER_ID"
# Extract repo and issue from prompt
REPO=$(echo "$PROMPT" | grep "^Repository:" | head -1 | awk '{print $2}')
ISSUE_NUM=$(echo "$PROMPT" | grep "^Issue:" | head -1 | awk '{print $2}' | tr -d '#')
LANE=$(echo "$WORKER_ID" | cut -d- -f2)
BRANCH="mimo/${LANE}/issue-${ISSUE_NUM}"
WORK_DIR="/tmp/${WORKER_ID}"
log " Repo: $REPO | Issue: #$ISSUE_NUM | Branch: $BRANCH"
# Clone the repo
mkdir -p "$(dirname "$WORK_DIR")"
if [ -d "$WORK_DIR" ]; then
log " Pulling existing clone..."
cd "$WORK_DIR"
git fetch origin main 2>/dev/null || true
git checkout main 2>/dev/null || git checkout master 2>/dev/null || true
git pull 2>/dev/null || true
else
log " Cloning..."
CLONE_URL="${GITEA_URL}/${REPO}.git"
git clone "$CLONE_URL" "$WORK_DIR" 2>>"$LOG_FILE"
cd "$WORK_DIR"
fi
# Create branch
git checkout -b "$BRANCH" 2>/dev/null || git checkout "$BRANCH"
log " On branch: $BRANCH"
# Run mimo via hermes
log " Dispatching to mimo-v2-pro..."
# Guard the exit-code capture against `set -e` (a bare $? on the next line would never run on failure)
MIMO_EXIT=0
hermes chat -q "$PROMPT" --provider nous -m xiaomi/mimo-v2-pro --yolo -t terminal,code_execution -Q >>"$LOG_FILE" 2>&1 || MIMO_EXIT=$?
log " Mimo exited with code: $MIMO_EXIT"
# Quality gate
log " Running quality gate..."
# Check if there are changes
CHANGES=$(git diff --stat 2>/dev/null || echo "")
STAGED=$(git status --porcelain 2>/dev/null || echo "")
if [ -z "$CHANGES" ] && [ -z "$STAGED" ]; then
log " QUALITY GATE: No changes detected. Worker produced nothing."
# Try to salvage - maybe changes were committed already
COMMITS=$(git log main..HEAD --oneline 2>/dev/null | wc -l | tr -d ' ')
if [ "$COMMITS" -gt 0 ]; then
log " SALVAGE: Found $COMMITS commit(s) on branch. Proceeding to push."
else
log " ABANDON: No commits, no changes. Nothing to salvage."
cd /tmp
rm -rf "$WORK_DIR"
# Write release state
echo "{\"status\":\"abandoned\",\"reason\":\"no_changes\",\"worker\":\"$WORKER_ID\",\"issue\":$ISSUE_NUM}" > "$STATE_DIR/result-${WORKER_ID}.json"
exit 0
fi
else
# Syntax check for Python files
PY_FILES=$(find . -name "*.py" -newer .git/HEAD 2>/dev/null | head -20)
for pyf in $PY_FILES; do
if ! python3 -m py_compile "$pyf" 2>>"$LOG_FILE"; then
log " SYNTAX ERROR in $pyf — attempting fix or committing anyway"
fi
done
# Syntax check for JS files
JS_FILES=$(find . -name "*.js" -newer .git/HEAD 2>/dev/null | head -20)
for jsf in $JS_FILES; do
if ! node --check "$jsf" 2>>"$LOG_FILE"; then
log " SYNTAX ERROR in $jsf — attempting fix or committing anyway"
fi
done
# Diff size check
DIFF_LINES=$(git diff --numstat | awk '{ add += $1 } END { print add + 0 }')
if [ "$DIFF_LINES" -gt 500 ]; then
log " WARNING: Large diff ($DIFF_LINES insertions). Committing but flagging for review."
fi
# Commit
git add -A
COMMIT_MSG="fix: $(echo "$PROMPT" | grep '^Title:' | sed 's/^Title: //') (closes #${ISSUE_NUM})"
git commit -m "$COMMIT_MSG" 2>>"$LOG_FILE" || log " Nothing to commit (already clean)"
fi
# Push
log " Pushing branch..."
PUSH_OUTPUT=$(git push origin "$BRANCH" 2>&1) || {
log " Push failed, trying force push..."
git push -f origin "$BRANCH" 2>>"$LOG_FILE" || log " Push failed completely"
}
log " Pushed: $PUSH_OUTPUT"
# Create PR
log " Creating PR..."
PR_TITLE="fix: $(echo "$PROMPT" | grep '^Title:' | sed 's/^Title: //')"
PR_BODY="Closes #${ISSUE_NUM}
Automated by mimo-v2-pro swarm worker.
Worker: ${WORKER_ID}"
PR_RESPONSE=$(curl -s -X POST "${GITEA_URL}/api/v1/repos/${REPO}/pulls" \
-H "Authorization: token ${TOKEN}" \
-H "Content-Type: application/json" \
-d "{\"title\":\"${PR_TITLE}\",\"head\":\"${BRANCH}\",\"base\":\"main\",\"body\":\"${PR_BODY}\"}" 2>>"$LOG_FILE")
PR_NUM=$(echo "$PR_RESPONSE" | python3 -c "import sys,json; print(json.load(sys.stdin).get('number','?'))" 2>/dev/null || echo "?")
log " PR created: #${PR_NUM}"
# Clean up
cd /tmp
# Keep work dir for debugging, clean later
# Write result
cat > "$STATE_DIR/result-${WORKER_ID}.json" <<EOF
{
"status": "completed",
"worker": "$WORKER_ID",
"repo": "$REPO",
"issue": $ISSUE_NUM,
"branch": "$BRANCH",
"pr": $PR_NUM,
"mimo_exit": $MIMO_EXIT,
"timestamp": "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
}
EOF
log "WORKER COMPLETE: $WORKER_ID → PR #${PR_NUM}"


@@ -0,0 +1,224 @@
#!/usr/bin/env python3
"""
Worker Runner — actual worker that picks up prompts and runs mimo via hermes CLI.
This is what the cron jobs SHOULD call instead of asking the LLM to check files.
"""
import os
import sys
import glob
import subprocess
import json
from datetime import datetime, timezone
STATE_DIR = os.path.expanduser("~/.hermes/mimo-swarm/state")
LOG_DIR = os.path.expanduser("~/.hermes/mimo-swarm/logs")
def log(msg):
ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(f"[{ts}] {msg}")
log_file = os.path.join(LOG_DIR, f"runner-{datetime.now(timezone.utc).strftime('%Y%m%d')}.log")
with open(log_file, "a") as f:
f.write(f"[{ts}] {msg}\n")
def get_oldest_prompt():
"""Claim the oldest prompt file atomically by renaming it to .processing."""
prompts = sorted(glob.glob(os.path.join(STATE_DIR, "prompt-*.txt")))
if not prompts:
return None
# Prefer non-review prompts
impl = [p for p in prompts if "review" not in os.path.basename(p)]
target = impl[0] if impl else prompts[0]
# Atomic claim: rename to .processing
claimed = target + ".processing"
try:
os.rename(target, claimed)
return claimed
except OSError:
# Another worker got it first
return None
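The rename is the whole locking story: `os.rename` is atomic on POSIX, so exactly one runner wins a given prompt file and any racing runner gets `OSError`. A sketch of the claim cycle against a temp queue directory (file names are made up):

```python
import glob
import os
import tempfile

queue = tempfile.mkdtemp(prefix="mimo-queue-")
for name in ("prompt-a.txt", "prompt-b.txt"):
    open(os.path.join(queue, name), "w").close()

def claim_oldest(queue_dir):
    """Atomically claim the oldest prompt by renaming it to .processing."""
    prompts = sorted(glob.glob(os.path.join(queue_dir, "prompt-*.txt")))
    if not prompts:
        return None
    claimed = prompts[0] + ".processing"
    try:
        os.rename(prompts[0], claimed)
        return claimed
    except OSError:
        return None  # another runner won the race

first = claim_oldest(queue)
second = claim_oldest(queue)
third = claim_oldest(queue)   # queue empty: nothing left matching prompt-*.txt
```

Because the glob only matches `prompt-*.txt`, a file renamed to `.processing` is invisible to every subsequent claim attempt, which is what makes the queue drain exactly once per prompt.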
def run_worker(prompt_file):
"""Run the worker: read prompt, execute via hermes, create PR."""
worker_id = os.path.basename(prompt_file).replace("prompt-", "").replace(".txt", "").replace(".processing", "")
with open(prompt_file) as f:
prompt = f.read()
# Extract repo and issue from prompt
repo = None
issue = None
for line in prompt.split("\n"):
if line.startswith("Repository:"):
repo = line.split(":", 1)[1].strip()
if line.startswith("Issue:"):
issue = line.split("#", 1)[1].strip() if "#" in line else line.split(":", 1)[1].strip()
log(f"Worker {worker_id}: repo={repo}, issue={issue}")
if not repo or not issue:
log(f" SKIPPING: couldn't parse repo/issue from prompt")
os.remove(prompt_file)
return False
# Clone/pull the repo — unique workspace per worker
import tempfile
work_dir = tempfile.mkdtemp(prefix=f"mimo-{worker_id}-")
clone_url = f"https://forge.alexanderwhitestone.com/{repo}.git"
branch = f"mimo/{worker_id.split('-')[1] if '-' in worker_id else 'code'}/issue-{issue}"
log(f" Workspace: {work_dir}")
result = subprocess.run(
["git", "clone", clone_url, work_dir],
capture_output=True, text=True, timeout=120
)
if result.returncode != 0:
log(f" CLONE FAILED: {result.stderr[:200]}")
os.remove(prompt_file)
return False
# Checkout branch
subprocess.run(["git", "fetch", "origin", "main"], cwd=work_dir, capture_output=True, timeout=60)
subprocess.run(["git", "checkout", "main"], cwd=work_dir, capture_output=True, timeout=30)
subprocess.run(["git", "pull"], cwd=work_dir, capture_output=True, timeout=30)
subprocess.run(["git", "checkout", "-b", branch], cwd=work_dir, capture_output=True, timeout=30)
# Run mimo via hermes CLI
log(f" Dispatching to hermes (nous/mimo-v2-pro)...")
result = subprocess.run(
["hermes", "chat", "-q", prompt, "--provider", "nous", "-m", "xiaomi/mimo-v2-pro",
"--yolo", "-t", "terminal,code_execution", "-Q"],
capture_output=True, text=True, timeout=900, # 15 min timeout
cwd=work_dir
)
log(f" Hermes exit: {result.returncode}")
log(f" Output: {result.stdout[-500:]}")
# Check for changes
status = subprocess.run(
["git", "status", "--porcelain"],
capture_output=True, text=True, cwd=work_dir
)
if not status.stdout.strip():
# Check for commits
log_count = subprocess.run(
["git", "log", "main..HEAD", "--oneline"],
capture_output=True, text=True, cwd=work_dir
)
if not log_count.stdout.strip():
log(f" NO CHANGES — abandoning")
# Release the claim
token = open(os.path.expanduser("~/.config/gitea/token")).read().strip()
import urllib.request
try:
req = urllib.request.Request(
f"https://forge.alexanderwhitestone.com/api/v1/repos/{repo}/issues/{issue}/labels/mimo-claimed",
headers={"Authorization": f"token {token}"},
method="DELETE"
)
urllib.request.urlopen(req, timeout=10)
except Exception:
pass
if os.path.exists(prompt_file):
os.remove(prompt_file)
return False
# Commit dirty files (salvage)
if status.stdout.strip():
subprocess.run(["git", "add", "-A"], cwd=work_dir, capture_output=True, timeout=30)
subprocess.run(
["git", "commit", "-m", f"WIP: issue #{issue} (mimo swarm)"],
cwd=work_dir, capture_output=True, timeout=30
)
# Push
log(f" Pushing {branch}...")
push = subprocess.run(
["git", "push", "origin", branch],
capture_output=True, text=True, cwd=work_dir, timeout=60
)
if push.returncode != 0:
log(f" Push failed, trying force...")
subprocess.run(
["git", "push", "-f", "origin", branch],
capture_output=True, text=True, cwd=work_dir, timeout=60
)
# Create PR via API
token = open(os.path.expanduser("~/.config/gitea/token")).read().strip()
import urllib.request
# Get issue title
try:
req = urllib.request.Request(
f"https://forge.alexanderwhitestone.com/api/v1/repos/{repo}/issues/{issue}",
headers={"Authorization": f"token {token}", "Accept": "application/json"}
)
with urllib.request.urlopen(req, timeout=15) as resp:
issue_data = json.loads(resp.read())
title = issue_data.get("title", f"Issue #{issue}")
except Exception:
title = f"Issue #{issue}"
pr_body = json.dumps({
"title": f"fix: {title}",
"head": branch,
"base": "main",
"body": f"Closes #{issue}\n\nAutomated by mimo-v2-pro swarm.\nWorker: {worker_id}"
}).encode()
try:
req = urllib.request.Request(
f"https://forge.alexanderwhitestone.com/api/v1/repos/{repo}/pulls",
data=pr_body,
headers={
"Authorization": f"token {token}",
"Content-Type": "application/json"
},
method="POST"
)
with urllib.request.urlopen(req, timeout=30) as resp:
pr_data = json.loads(resp.read())
pr_num = pr_data.get("number", "?")
log(f" PR CREATED: #{pr_num}")
except Exception as e:
log(f" PR FAILED: {e}")
pr_num = "?"
# Write result
result_file = os.path.join(STATE_DIR, f"result-{worker_id}.json")
with open(result_file, "w") as f:
json.dump({
"status": "completed",
"worker": worker_id,
"repo": repo,
"issue": int(issue) if issue.isdigit() else issue,
"branch": branch,
"pr": pr_num,
"timestamp": datetime.now(timezone.utc).isoformat()
}, f)
# Remove prompt file (handles .processing extension)
if os.path.exists(prompt_file):
os.remove(prompt_file)
log(f" DONE — prompt removed")
return True
if __name__ == "__main__":
prompt = get_oldest_prompt()
if not prompt:
print("No prompts in queue")
sys.exit(0)
print(f"Processing: {os.path.basename(prompt)}")
success = run_worker(prompt)
sys.exit(0 if success else 1)


@@ -0,0 +1,404 @@
// ═══════════════════════════════════════════
// PROJECT MNEMOSYNE — AMBIENT PARTICLE SYSTEM
// ═══════════════════════════════════════════
//
// Memory activity visualization via Three.js Points.
// Three particle modes:
// 1. Spawn burst — 20 particles on new fact, 2s fade
// 2. Access trail — 10 particles streaming to crystal
// 3. Ambient dust — 200 particles, slow cosmic drift
//
// Category colors for all particles.
// Total budget: < 500 particles at any time.
//
// Usage from app.js:
// import { MemoryParticles } from './nexus/components/memory-particles.js';
// MemoryParticles.init(scene);
// MemoryParticles.onMemoryPlaced(position, category);
// MemoryParticles.onMemoryAccessed(fromPos, toPos, category);
// MemoryParticles.update(delta);
// ═══════════════════════════════════════════
const MemoryParticles = (() => {
let _scene = null;
let _initialized = false;
// ─── CATEGORY COLORS ──────────────────────
const CATEGORY_COLORS = {
engineering: new THREE.Color(0x4af0c0),
social: new THREE.Color(0x7b5cff),
knowledge: new THREE.Color(0xffd700),
projects: new THREE.Color(0xff4466),
working: new THREE.Color(0x00ff88),
archive: new THREE.Color(0x334455),
user_pref: new THREE.Color(0xffd700),
project: new THREE.Color(0x4488ff),
tool_knowledge: new THREE.Color(0x44ff88),
general: new THREE.Color(0x8899aa),
};
const DEFAULT_COLOR = new THREE.Color(0x8899bb);
// ─── PARTICLE BUDGETS ─────────────────────
const MAX_BURST_PARTICLES = 20; // per spawn event
const MAX_TRAIL_PARTICLES = 10; // per access event
const AMBIENT_COUNT = 200; // always-on dust
const MAX_ACTIVE_BURSTS = 8; // max concurrent burst groups
const MAX_ACTIVE_TRAILS = 5; // max concurrent trail groups
// ─── ACTIVE PARTICLE GROUPS ───────────────
let _bursts = []; // { points, velocities, life, maxLife }
let _trails = []; // { points, velocities, life, maxLife, target }
let _ambientPoints = null;
// ─── HELPERS ──────────────────────────────
function _getCategoryColor(category) {
return CATEGORY_COLORS[category] || DEFAULT_COLOR;
}
// ═══ AMBIENT DUST ═════════════════════════
function _createAmbient() {
const geo = new THREE.BufferGeometry();
const positions = new Float32Array(AMBIENT_COUNT * 3);
const colors = new Float32Array(AMBIENT_COUNT * 3);
const sizes = new Float32Array(AMBIENT_COUNT);
// Distribute across the world
const categories = Object.keys(CATEGORY_COLORS); // hoisted out of the loop
for (let i = 0; i < AMBIENT_COUNT; i++) {
positions[i * 3] = (Math.random() - 0.5) * 50;
positions[i * 3 + 1] = Math.random() * 18 + 1;
positions[i * 3 + 2] = (Math.random() - 0.5) * 50;
// Subtle category-tinted colors
const cat = categories[Math.floor(Math.random() * categories.length)];
const col = _getCategoryColor(cat).clone().multiplyScalar(0.4 + Math.random() * 0.3);
colors[i * 3] = col.r;
colors[i * 3 + 1] = col.g;
colors[i * 3 + 2] = col.b;
sizes[i] = 0.02 + Math.random() * 0.04;
}
geo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geo.setAttribute('color', new THREE.BufferAttribute(colors, 3));
geo.setAttribute('size', new THREE.BufferAttribute(sizes, 1));
const mat = new THREE.ShaderMaterial({
uniforms: { uTime: { value: 0 } },
vertexShader: `
attribute float size;
attribute vec3 color;
varying vec3 vColor;
varying float vAlpha;
uniform float uTime;
void main() {
vColor = color;
vec3 pos = position;
// Slow cosmic drift
pos.x += sin(uTime * 0.08 + position.y * 0.3) * 0.5;
pos.y += sin(uTime * 0.05 + position.z * 0.2) * 0.3;
pos.z += cos(uTime * 0.06 + position.x * 0.25) * 0.4;
vec4 mv = modelViewMatrix * vec4(pos, 1.0);
gl_PointSize = size * 250.0 / -mv.z;
gl_Position = projectionMatrix * mv;
// Fade with distance
vAlpha = smoothstep(40.0, 10.0, -mv.z) * 0.5;
}
`,
fragmentShader: `
varying vec3 vColor;
varying float vAlpha;
void main() {
float d = length(gl_PointCoord - 0.5);
if (d > 0.5) discard;
float alpha = smoothstep(0.5, 0.05, d);
gl_FragColor = vec4(vColor, alpha * vAlpha);
}
`,
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending,
});
_ambientPoints = new THREE.Points(geo, mat);
_scene.add(_ambientPoints);
}
// ═══ BURST EFFECT ═════════════════════════
function _createBurst(position, category) {
const count = MAX_BURST_PARTICLES;
const geo = new THREE.BufferGeometry();
const positions = new Float32Array(count * 3);
const colors = new Float32Array(count * 3);
const sizes = new Float32Array(count);
const velocities = [];
const col = _getCategoryColor(category);
for (let i = 0; i < count; i++) {
positions[i * 3] = position.x;
positions[i * 3 + 1] = position.y;
positions[i * 3 + 2] = position.z;
colors[i * 3] = col.r;
colors[i * 3 + 1] = col.g;
colors[i * 3 + 2] = col.b;
sizes[i] = 0.06 + Math.random() * 0.06;
// Random outward velocity
const theta = Math.random() * Math.PI * 2;
const phi = Math.random() * Math.PI;
const speed = 1.5 + Math.random() * 2.5;
velocities.push(
Math.sin(phi) * Math.cos(theta) * speed,
Math.cos(phi) * speed * 0.8 + 1.0, // bias upward
Math.sin(phi) * Math.sin(theta) * speed
);
}
geo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geo.setAttribute('color', new THREE.BufferAttribute(colors, 3));
geo.setAttribute('size', new THREE.BufferAttribute(sizes, 1));
const mat = new THREE.ShaderMaterial({
uniforms: { uOpacity: { value: 1.0 } },
vertexShader: `
attribute float size;
attribute vec3 color;
varying vec3 vColor;
uniform float uOpacity;
void main() {
vColor = color;
vec4 mv = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = size * 300.0 / -mv.z;
gl_Position = projectionMatrix * mv;
}
`,
fragmentShader: `
varying vec3 vColor;
uniform float uOpacity;
void main() {
float d = length(gl_PointCoord - 0.5);
if (d > 0.5) discard;
float alpha = smoothstep(0.5, 0.05, d);
gl_FragColor = vec4(vColor, alpha * uOpacity);
}
`,
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending,
});
const points = new THREE.Points(geo, mat);
_scene.add(points);
_bursts.push({
points,
velocities,
life: 0,
maxLife: 2.0, // 2s fade
});
// Cap active bursts
while (_bursts.length > MAX_ACTIVE_BURSTS) {
_removeBurst(0);
}
}
function _removeBurst(idx) {
const burst = _bursts[idx];
if (burst.points.parent) burst.points.parent.remove(burst.points);
burst.points.geometry.dispose();
burst.points.material.dispose();
_bursts.splice(idx, 1);
}
// ═══ TRAIL EFFECT ═════════════════════════
function _createTrail(fromPos, toPos, category) {
const count = MAX_TRAIL_PARTICLES;
const geo = new THREE.BufferGeometry();
const positions = new Float32Array(count * 3);
const colors = new Float32Array(count * 3);
const sizes = new Float32Array(count);
const velocities = [];
const col = _getCategoryColor(category);
for (let i = 0; i < count; i++) {
// Stagger start positions along the path
const t = Math.random();
positions[i * 3] = fromPos.x + (toPos.x - fromPos.x) * t + (Math.random() - 0.5) * 0.5;
positions[i * 3 + 1] = fromPos.y + (toPos.y - fromPos.y) * t + (Math.random() - 0.5) * 0.5;
positions[i * 3 + 2] = fromPos.z + (toPos.z - fromPos.z) * t + (Math.random() - 0.5) * 0.5;
colors[i * 3] = col.r;
colors[i * 3 + 1] = col.g;
colors[i * 3 + 2] = col.b;
sizes[i] = 0.04 + Math.random() * 0.04;
// Velocity toward target with slight randomness
const dx = toPos.x - fromPos.x;
const dy = toPos.y - fromPos.y;
const dz = toPos.z - fromPos.z;
const len = Math.sqrt(dx * dx + dy * dy + dz * dz) || 1;
const speed = 2.0 + Math.random() * 1.5;
velocities.push(
(dx / len) * speed + (Math.random() - 0.5) * 0.5,
(dy / len) * speed + (Math.random() - 0.5) * 0.5,
(dz / len) * speed + (Math.random() - 0.5) * 0.5
);
}
geo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geo.setAttribute('color', new THREE.BufferAttribute(colors, 3));
geo.setAttribute('size', new THREE.BufferAttribute(sizes, 1));
const mat = new THREE.ShaderMaterial({
uniforms: { uOpacity: { value: 1.0 } },
vertexShader: `
attribute float size;
attribute vec3 color;
varying vec3 vColor;
uniform float uOpacity;
void main() {
vColor = color;
vec4 mv = modelViewMatrix * vec4(position, 1.0);
gl_PointSize = size * 280.0 / -mv.z;
gl_Position = projectionMatrix * mv;
}
`,
fragmentShader: `
varying vec3 vColor;
uniform float uOpacity;
void main() {
float d = length(gl_PointCoord - 0.5);
if (d > 0.5) discard;
float alpha = smoothstep(0.5, 0.05, d);
gl_FragColor = vec4(vColor, alpha * uOpacity);
}
`,
transparent: true,
depthWrite: false,
blending: THREE.AdditiveBlending,
});
const points = new THREE.Points(geo, mat);
_scene.add(points);
_trails.push({
points,
velocities,
life: 0,
maxLife: 1.5, // 1.5s trail
target: toPos.clone(),
});
// Cap active trails
while (_trails.length > MAX_ACTIVE_TRAILS) {
_removeTrail(0);
}
}
function _removeTrail(idx) {
const trail = _trails[idx];
if (trail.points.parent) trail.points.parent.remove(trail.points);
trail.points.geometry.dispose();
trail.points.material.dispose();
_trails.splice(idx, 1);
}
// ═══ PUBLIC API ═══════════════════════════
function init(scene) {
_scene = scene;
_initialized = true;
_createAmbient();
console.info('[Mnemosyne] Ambient particle system initialized —', AMBIENT_COUNT, 'dust particles');
}
function onMemoryPlaced(position, category) {
if (!_initialized) return;
const pos = position instanceof THREE.Vector3 ? position : new THREE.Vector3(position.x, position.y, position.z);
_createBurst(pos, category);
}
function onMemoryAccessed(fromPosition, toPosition, category) {
if (!_initialized) return;
const from = fromPosition instanceof THREE.Vector3 ? fromPosition : new THREE.Vector3(fromPosition.x, fromPosition.y, fromPosition.z);
const to = toPosition instanceof THREE.Vector3 ? toPosition : new THREE.Vector3(toPosition.x, toPosition.y, toPosition.z);
_createTrail(from, to, category);
}
function update(delta) {
if (!_initialized) return;
// Update ambient dust
if (_ambientPoints && _ambientPoints.material.uniforms) {
_ambientPoints.material.uniforms.uTime.value += delta;
}
// Update bursts
for (let i = _bursts.length - 1; i >= 0; i--) {
const burst = _bursts[i];
burst.life += delta;
const t = burst.life / burst.maxLife;
if (t >= 1.0) {
_removeBurst(i);
continue;
}
const pos = burst.points.geometry.attributes.position.array;
for (let j = 0; j < MAX_BURST_PARTICLES; j++) {
pos[j * 3] += burst.velocities[j * 3] * delta;
pos[j * 3 + 1] += burst.velocities[j * 3 + 1] * delta;
pos[j * 3 + 2] += burst.velocities[j * 3 + 2] * delta;
// Gravity + drag
burst.velocities[j * 3 + 1] -= delta * 0.5;
burst.velocities[j * 3] *= 0.98;
burst.velocities[j * 3 + 1] *= 0.98;
burst.velocities[j * 3 + 2] *= 0.98;
}
burst.points.geometry.attributes.position.needsUpdate = true;
burst.points.material.uniforms.uOpacity.value = 1.0 - t;
}
// Update trails
for (let i = _trails.length - 1; i >= 0; i--) {
const trail = _trails[i];
trail.life += delta;
const t = trail.life / trail.maxLife;
if (t >= 1.0) {
_removeTrail(i);
continue;
}
const pos = trail.points.geometry.attributes.position.array;
for (let j = 0; j < MAX_TRAIL_PARTICLES; j++) {
pos[j * 3] += trail.velocities[j * 3] * delta;
pos[j * 3 + 1] += trail.velocities[j * 3 + 1] * delta;
pos[j * 3 + 2] += trail.velocities[j * 3 + 2] * delta;
}
trail.points.geometry.attributes.position.needsUpdate = true;
trail.points.material.uniforms.uOpacity.value = 1.0 - t * t;
}
}
function getActiveParticleCount() {
return AMBIENT_COUNT
+ _bursts.length * MAX_BURST_PARTICLES
+ _trails.length * MAX_TRAIL_PARTICLES;
}
return {
init,
onMemoryPlaced,
onMemoryAccessed,
update,
getActiveParticleCount,
};
})();
export { MemoryParticles };
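The commit for this series budgets fewer than 500 particles at any time, and `getActiveParticleCount` reports the running total. A quick sketch of the worst case under that budget — the per-effect counts (200 dust, 20 per burst, 10 per trail) come from the commit message, while the two `MAX_ACTIVE_*` cap values are assumptions, since those constants are defined outside this hunk:

```javascript
// Worst-case particle count for the issue #1173 budget (~500 total).
// Per-effect counts are from the commit message; the caps are assumed.
const AMBIENT_COUNT = 200;
const MAX_BURST_PARTICLES = 20;
const MAX_TRAIL_PARTICLES = 10;
const MAX_ACTIVE_BURSTS = 10;  // assumed cap
const MAX_ACTIVE_TRAILS = 10;  // assumed cap

function worstCaseParticles() {
  return AMBIENT_COUNT
    + MAX_ACTIVE_BURSTS * MAX_BURST_PARTICLES
    + MAX_ACTIVE_TRAILS * MAX_TRAIL_PARTICLES;
}

console.log(worstCaseParticles()); // 500 with the assumed caps
```

With these caps the budget is met exactly at saturation; lowering either cap gives headroom.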


@@ -32,6 +32,9 @@
const SpatialMemory = (() => {
// ─── CALLBACKS ────────────────────────────────────────
let _onMemoryPlacedCallback = null;
// ─── REGION DEFINITIONS ───────────────────────────────
const REGIONS = {
engineering: {
@@ -140,6 +143,47 @@ const SpatialMemory = (() => {
return new THREE.OctahedronGeometry(size, 0);
}
// ─── TRUST-BASED VISUALS ─────────────────────────────
// Wire crystal visual properties to fact trust score (0.0-1.0).
// Issue #1166: Trust > 0.8 = bright glow/full opacity,
// 0.5-0.8 = medium/80%, < 0.5 = dim/40%, < 0.3 = near-invisible pulsing red.
function _getTrustVisuals(trust, regionColor) {
const t = Math.max(0, Math.min(1, trust));
if (t >= 0.8) {
return {
opacity: 1.0,
emissiveIntensity: 2.0 * t,
emissiveColor: regionColor,
lightIntensity: 1.2,
glowDesc: 'high'
};
} else if (t >= 0.5) {
return {
opacity: 0.8,
emissiveIntensity: 1.2 * t,
emissiveColor: regionColor,
lightIntensity: 0.6,
glowDesc: 'medium'
};
} else if (t >= 0.3) {
return {
opacity: 0.4,
emissiveIntensity: 0.5 * t,
emissiveColor: regionColor,
lightIntensity: 0.2,
glowDesc: 'dim'
};
} else {
return {
opacity: 0.15,
emissiveIntensity: 0.3,
emissiveColor: 0xff2200,
lightIntensity: 0.1,
glowDesc: 'untrusted'
};
}
}
// ─── REGION MARKER ───────────────────────────────────
function createRegionMarker(regionKey, region) {
const cx = region.center[0];
@@ -216,17 +260,20 @@ const SpatialMemory = (() => {
const region = REGIONS[mem.category] || REGIONS.working;
const pos = mem.position || _assignPosition(mem.category, mem.id);
const strength = Math.max(0.05, Math.min(1, mem.strength != null ? mem.strength : 0.7));
const trust = mem.trust != null ? Math.max(0, Math.min(1, mem.trust)) : 0.7;
const size = 0.2 + strength * 0.3;
const tv = _getTrustVisuals(trust, region.color);
const geo = createCrystalGeometry(size);
const mat = new THREE.MeshStandardMaterial({
color: region.color,
emissive: region.color,
emissiveIntensity: 1.5 * strength,
emissive: tv.emissiveColor,
emissiveIntensity: tv.emissiveIntensity,
metalness: 0.6,
roughness: 0.15,
transparent: true,
opacity: 0.5 + strength * 0.4
opacity: tv.opacity
});
const crystal = new THREE.Mesh(geo, mat);
@@ -239,10 +286,12 @@ const SpatialMemory = (() => {
region: mem.category,
pulse: Math.random() * Math.PI * 2,
strength: strength,
trust: trust,
glowDesc: tv.glowDesc,
createdAt: mem.timestamp || new Date().toISOString()
};
const light = new THREE.PointLight(region.color, 0.8 * strength, 5);
const light = new THREE.PointLight(tv.emissiveColor, tv.lightIntensity, 5);
crystal.add(light);
_scene.add(crystal);
@@ -255,6 +304,12 @@ const SpatialMemory = (() => {
_dirty = true;
saveToStorage();
console.info('[Mnemosyne] Spatial memory placed:', mem.id, 'in', region.label);
// Fire particle burst callback
if (_onMemoryPlacedCallback) {
_onMemoryPlacedCallback(crystal.position.clone(), mem.category || 'working');
}
return crystal;
}
@@ -337,8 +392,16 @@ const SpatialMemory = (() => {
mesh.scale.setScalar(pulse);
if (mesh.material) {
const trust = mesh.userData.trust != null ? mesh.userData.trust : 0.7;
const base = mesh.userData.strength || 0.7;
mesh.material.emissiveIntensity = 1.0 + Math.sin(mesh.userData.pulse * 0.7) * 0.5 * base;
if (trust < 0.3) {
// Low trust: pulsing red — visible warning
const pulseAlpha = 0.15 + Math.sin(mesh.userData.pulse * 2.0) * 0.15;
mesh.material.emissiveIntensity = 0.3 + Math.sin(mesh.userData.pulse * 2.0) * 0.3;
mesh.material.opacity = pulseAlpha;
} else {
mesh.material.emissiveIntensity = 1.0 + Math.sin(mesh.userData.pulse * 0.7) * 0.5 * base;
}
}
});
@@ -368,6 +431,42 @@ const SpatialMemory = (() => {
return REGIONS;
}
// ─── UPDATE VISUAL PROPERTIES ────────────────────────
// Re-render crystal when trust/strength change (no position move).
function updateMemoryVisual(memId, updates) {
const obj = _memoryObjects[memId];
if (!obj) return false;
const mesh = obj.mesh;
const region = REGIONS[obj.region] || REGIONS.working;
if (updates.trust != null) {
const trust = Math.max(0, Math.min(1, updates.trust));
mesh.userData.trust = trust;
obj.data.trust = trust;
const tv = _getTrustVisuals(trust, region.color);
mesh.material.emissive = new THREE.Color(tv.emissiveColor);
mesh.material.emissiveIntensity = tv.emissiveIntensity;
mesh.material.opacity = tv.opacity;
mesh.userData.glowDesc = tv.glowDesc;
if (mesh.children.length > 0 && mesh.children[0].isPointLight) {
mesh.children[0].intensity = tv.lightIntensity;
mesh.children[0].color = new THREE.Color(tv.emissiveColor);
}
}
if (updates.strength != null) {
const strength = Math.max(0.05, Math.min(1, updates.strength));
mesh.userData.strength = strength;
obj.data.strength = strength;
}
_dirty = true;
saveToStorage();
console.info('[Mnemosyne] Visual updated:', memId, 'trust:', mesh.userData.trust, 'glow:', mesh.userData.glowDesc);
return true;
}
// ─── QUERY ───────────────────────────────────────────
function getMemoryAtPosition(position, maxDist) {
maxDist = maxDist || 2;
@@ -507,6 +606,7 @@ const SpatialMemory = (() => {
source: o.data.source || 'unknown',
timestamp: o.data.timestamp || o.mesh.userData.createdAt,
strength: o.mesh.userData.strength || 0.7,
trust: o.mesh.userData.trust != null ? o.mesh.userData.trust : 0.7,
connections: o.data.connections || []
}))
};
@@ -734,13 +834,91 @@ const SpatialMemory = (() => {
});
}
// ─── SPATIAL SEARCH (issue #1170) ────────────────────
let _searchOriginalState = {}; // memId -> { emissiveIntensity, opacity } for restore
function searchContent(query) {
if (!query || !query.trim()) return [];
const q = query.toLowerCase().trim();
const matches = [];
Object.values(_memoryObjects).forEach(obj => {
const d = obj.data;
const searchable = [
d.content || '',
d.id || '',
d.category || '',
d.source || '',
...(d.connections || [])
].join(' ').toLowerCase();
if (searchable.includes(q)) {
matches.push(d.id);
}
});
return matches;
}
function highlightSearchResults(matchIds) {
// Save original state and apply search highlighting
_searchOriginalState = {};
const matchSet = new Set(matchIds);
Object.entries(_memoryObjects).forEach(([id, obj]) => {
const mat = obj.mesh.material;
_searchOriginalState[id] = {
emissiveHex: mat.emissive.getHex(),
emissiveIntensity: mat.emissiveIntensity,
opacity: mat.opacity
};
if (matchSet.has(id)) {
// Match: bright white glow
mat.emissive.setHex(0xffffff);
mat.emissiveIntensity = 5.0;
mat.opacity = 1.0;
} else {
// Non-match: dim to 10% opacity
mat.opacity = 0.1;
mat.emissiveIntensity = 0.2;
}
});
}
function clearSearch() {
Object.entries(_memoryObjects).forEach(([id, obj]) => {
const mat = obj.mesh.material;
const saved = _searchOriginalState[id];
if (saved) {
// Restore emissive (saved hex if recorded, else the region color);
// Color.set accepts a hex number, Color.copy does not
const region = REGIONS[obj.region] || REGIONS.working;
mat.emissive.set(saved.emissiveHex != null ? saved.emissiveHex : region.color);
mat.emissiveIntensity = saved.emissiveIntensity;
mat.opacity = saved.opacity;
}
});
_searchOriginalState = {};
}
function getSearchMatchPosition(matchId) {
const obj = _memoryObjects[matchId];
return obj ? obj.mesh.position.clone() : null;
}
function setOnMemoryPlaced(callback) {
_onMemoryPlacedCallback = callback;
}
return {
init, placeMemory, removeMemory, update,
init, placeMemory, removeMemory, update, updateMemoryVisual,
getMemoryAtPosition, getRegionAtPosition, getMemoriesInRegion, getAllMemories,
getCrystalMeshes, getMemoryFromMesh, highlightMemory, clearHighlight, getSelectedId,
exportIndex, importIndex, exportToFile, importFromFile, searchNearby, REGIONS,
saveToStorage, loadFromStorage, clearStorage,
runGravityLayout
runGravityLayout,
searchContent, highlightSearchResults, clearSearch, getSearchMatchPosition,
setOnMemoryPlaced
};
})();
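The integration commit wires `SpatialMemory.setOnMemoryPlaced` to `MemoryParticles.onMemoryPlaced`, so a burst spawns on every placement. A minimal sketch of that callback wiring with stubbed modules — the stubs are illustrative only; the real code passes the initialized modules and a `THREE.Vector3` position:

```javascript
// Stub of MemoryParticles: records what the callback receives.
const placed = [];
const MemoryParticlesStub = {
  onMemoryPlaced(position, category) { placed.push({ position, category }); }
};

// Stub of SpatialMemory: fires the callback after a placement,
// mirroring the callback at the end of placeMemory above.
const SpatialMemoryStub = (() => {
  let cb = null;
  return {
    setOnMemoryPlaced(fn) { cb = fn; },
    placeMemory(mem) {
      if (cb) cb({ ...mem.position }, mem.category || 'working');
    }
  };
})();

// Wiring, as done in the Nexus init sequence.
SpatialMemoryStub.setOnMemoryPlaced(MemoryParticlesStub.onMemoryPlaced);
SpatialMemoryStub.placeMemory({ position: { x: 1, y: 2, z: 3 }, category: 'engineering' });
console.log(placed.length, placed[0].category); // 1 'engineering'
```

Passing a cloned/copied position keeps the particle system from mutating the crystal's live position, which is why `placeMemory` hands over `crystal.position.clone()`.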


@@ -0,0 +1,205 @@
// ═══════════════════════════════════════════
// PROJECT MNEMOSYNE — TIMELINE SCRUBBER
// ═══════════════════════════════════════════
//
// Horizontal timeline bar overlay for scrolling through fact history.
// Crystals outside the visible time window fade out.
//
// Issue: #1169
// ═══════════════════════════════════════════
const TimelineScrubber = (() => {
let _container = null;
let _bar = null;
let _handle = null;
let _labels = null;
let _spatialMemory = null;
let _rangeStart = 0; // 0-1 normalized
let _rangeEnd = 1; // 0-1 normalized
let _minTimestamp = null;
let _maxTimestamp = null;
let _active = false;
const PRESETS = {
'hour': { label: 'Last Hour', ms: 3600000 },
'day': { label: 'Last Day', ms: 86400000 },
'week': { label: 'Last Week', ms: 604800000 },
'all': { label: 'All Time', ms: Infinity }
};
// ─── INIT ──────────────────────────────────────────
function init(spatialMemory) {
_spatialMemory = spatialMemory;
_buildDOM();
_computeTimeRange();
console.info('[Mnemosyne] Timeline scrubber initialized');
}
function _buildDOM() {
_container = document.createElement('div');
_container.id = 'mnemosyne-timeline';
_container.style.cssText = `
position: fixed; bottom: 0; left: 0; right: 0; height: 48px;
background: rgba(5, 5, 16, 0.85); border-top: 1px solid #1a2a4a;
z-index: 1000; display: flex; align-items: center; padding: 0 16px;
font-family: monospace; font-size: 12px; color: #8899aa;
backdrop-filter: blur(8px); transition: opacity 0.3s;
`;
// Preset buttons
const presetDiv = document.createElement('div');
presetDiv.style.cssText = 'display: flex; gap: 8px; margin-right: 16px;';
Object.entries(PRESETS).forEach(([key, preset]) => {
const btn = document.createElement('button');
btn.textContent = preset.label;
btn.style.cssText = `
background: #0a0f28; border: 1px solid #1a2a4a; color: #4af0c0;
padding: 4px 8px; cursor: pointer; font-family: monospace; font-size: 11px;
border-radius: 3px; transition: background 0.2s;
`;
btn.onmouseenter = () => btn.style.background = '#1a2a4a';
btn.onmouseleave = () => btn.style.background = '#0a0f28';
btn.onclick = () => _applyPreset(key);
presetDiv.appendChild(btn);
});
_container.appendChild(presetDiv);
// Timeline bar
_bar = document.createElement('div');
_bar.style.cssText = `
flex: 1; height: 20px; background: #0a0f28; border: 1px solid #1a2a4a;
border-radius: 3px; position: relative; cursor: pointer; margin: 0 8px;
`;
// Handle (draggable range selector)
_handle = document.createElement('div');
_handle.style.cssText = `
position: absolute; top: 0; left: 0%; width: 100%; height: 100%;
background: rgba(74, 240, 192, 0.15); border-left: 2px solid #4af0c0;
border-right: 2px solid #4af0c0; cursor: ew-resize;
`;
_bar.appendChild(_handle);
_container.appendChild(_bar);
// Labels
_labels = document.createElement('div');
_labels.style.cssText = 'min-width: 200px; text-align: right; font-size: 11px;';
_labels.textContent = 'All Time';
_container.appendChild(_labels);
// Drag handling
let dragging = null;
_handle.addEventListener('mousedown', (e) => {
dragging = { startX: e.clientX, startLeft: parseFloat(_handle.style.left) || 0, startWidth: parseFloat(_handle.style.width) || 100 };
e.preventDefault();
});
document.addEventListener('mousemove', (e) => {
if (!dragging) return;
const barRect = _bar.getBoundingClientRect();
const dx = (e.clientX - dragging.startX) / barRect.width * 100;
let newLeft = Math.max(0, Math.min(100 - dragging.startWidth, dragging.startLeft + dx));
_handle.style.left = newLeft + '%';
_rangeStart = newLeft / 100;
_rangeEnd = (newLeft + dragging.startWidth) / 100;
_applyFilter();
});
document.addEventListener('mouseup', () => { dragging = null; });
document.body.appendChild(_container);
}
function _computeTimeRange() {
if (!_spatialMemory) return;
const memories = _spatialMemory.getAllMemories();
if (memories.length === 0) return;
let min = Infinity, max = -Infinity;
memories.forEach(m => {
const t = new Date(m.timestamp || 0).getTime();
if (t < min) min = t;
if (t > max) max = t;
});
_minTimestamp = min;
_maxTimestamp = max;
}
function _applyPreset(key) {
const preset = PRESETS[key];
if (!preset) return;
if (preset.ms === Infinity) {
_rangeStart = 0;
_rangeEnd = 1;
} else {
const now = Date.now();
const range = _maxTimestamp - _minTimestamp;
if (range <= 0) return;
const cutoff = now - preset.ms;
_rangeStart = Math.min(1, Math.max(0, (cutoff - _minTimestamp) / range));
_rangeEnd = 1;
}
_handle.style.left = (_rangeStart * 100) + '%';
_handle.style.width = ((_rangeEnd - _rangeStart) * 100) + '%';
_labels.textContent = preset.label;
_applyFilter();
}
function _applyFilter() {
if (!_spatialMemory) return;
const range = _maxTimestamp - _minTimestamp;
if (range <= 0) return;
const startMs = _minTimestamp + range * _rangeStart;
const endMs = _minTimestamp + range * _rangeEnd;
_spatialMemory.getCrystalMeshes().forEach(mesh => {
const ts = new Date(mesh.userData.createdAt || 0).getTime();
if (ts >= startMs && ts <= endMs) {
mesh.visible = true;
// Restore saved opacity and clear it so later trust/strength updates stick
if (mesh.material && mesh.userData._savedOpacity != null) {
mesh.material.opacity = mesh.userData._savedOpacity;
delete mesh.userData._savedOpacity;
}
} else {
// Fade out
if (mesh.material) {
mesh.userData._savedOpacity = mesh.userData._savedOpacity || mesh.material.opacity;
mesh.material.opacity = 0.02;
}
}
});
// Update label with date range
const startStr = new Date(startMs).toLocaleDateString();
const endStr = new Date(endMs).toLocaleDateString();
_labels.textContent = startStr + ' — ' + endStr;
}
function update() {
_computeTimeRange();
}
function show() {
if (_container) _container.style.display = 'flex';
_active = true;
}
function hide() {
if (_container) _container.style.display = 'none';
_active = false;
// Restore all crystals
if (_spatialMemory) {
_spatialMemory.getCrystalMeshes().forEach(mesh => {
mesh.visible = true;
if (mesh.material && mesh.userData._savedOpacity != null) {
mesh.material.opacity = mesh.userData._savedOpacity;
delete mesh.userData._savedOpacity;
}
});
}
}
function isActive() { return _active; }
return { init, update, show, hide, isActive };
})();
export { TimelineScrubber };
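`_applyFilter` maps the normalized `[0, 1]` handle range onto the absolute timestamp span before testing each crystal's `createdAt`. The mapping in isolation (a sketch; the function name is ours):

```javascript
// Map a normalized [0,1] scrubber range onto absolute timestamps,
// as _applyFilter does before comparing against mesh.userData.createdAt.
function rangeToWindow(minTs, maxTs, rangeStart, rangeEnd) {
  const span = maxTs - minTs;
  return {
    startMs: minTs + span * rangeStart,
    endMs: minTs + span * rangeEnd,
  };
}

const w = rangeToWindow(1000, 2000, 0.25, 0.75);
console.log(w.startMs, w.endMs); // 1250 1750
```

A crystal is visible when `startMs <= createdAt <= endMs`; everything else fades to near-zero opacity until the range widens again.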

provenance.json Normal file

@@ -0,0 +1,62 @@
{
"generated_at": "2026-04-11T01:14:54.632326+00:00",
"repo": "Timmy_Foundation/the-nexus",
"git": {
"commit": "d408d2c365a9efc0c1e3a9b38b9cc4eed75695c5",
"branch": "mimo/build/issue-686",
"remote": "https://forge.alexanderwhitestone.com/Timmy_Foundation/the-nexus.git",
"dirty": true
},
"files": {
"index.html": {
"sha256": "71ba27afe8b6b42a09efe09d2b3017599392ddc3bc02543b31c2277dfb0b82cc",
"size": 25933
},
"app.js": {
"sha256": "2b765a724a0fcda29abd40ba921bc621d2699f11d0ba14cf1579cbbdafdc5cd5",
"size": 132902
},
"style.css": {
"sha256": "cd3068d03eed6f52a00bbc32cfae8fba4739b8b3cb194b3ec09fd747a075056d",
"size": 44198
},
"gofai_worker.js": {
"sha256": "d292f110aa12a8aa2b16b0c2d48e5b4ce24ee15b1cffb409ab846b1a05a91de2",
"size": 969
},
"server.py": {
"sha256": "e963cc9715accfc8814e3fe5c44af836185d66740d5a65fd0365e9c629d38e05",
"size": 4185
},
"portals.json": {
"sha256": "889a5e0f724eb73a95f960bca44bca232150bddff7c1b11f253bd056f3683a08",
"size": 3442
},
"vision.json": {
"sha256": "0e3b5c06af98486bbcb2fc2dc627dc8b7b08aed4c3a4f9e10b57f91e1e8ca6ad",
"size": 1658
},
"manifest.json": {
"sha256": "352304c4f7746f5d31cbc223636769969dd263c52800645c01024a3a8489d8c9",
"size": 495
},
"nexus/components/spatial-memory.js": {
"sha256": "60170f6490ddd743acd6d285d3a1af6cad61fbf8aaef3f679ff4049108eac160",
"size": 32782
},
"nexus/components/session-rooms.js": {
"sha256": "9997a60dda256e38cb4645508bf9e98c15c3d963b696e0080e3170a9a7fa7cf1",
"size": 15113
},
"nexus/components/timeline-scrubber.js": {
"sha256": "f8a17762c2735be283dc5074b13eb00e1e3b2b04feb15996c2cf0323b46b6014",
"size": 7177
},
"nexus/components/memory-particles.js": {
"sha256": "1be5567a3ebb229f9e1a072c08a25387ade87cb4a1df6a624e5c5254d3bef8fa",
"size": 14216
}
},
"missing": [],
"file_count": 12
}


@@ -1880,3 +1880,84 @@ canvas#nexus-canvas {
text-transform: uppercase;
}
/* ═══ SPATIAL SEARCH OVERLAY (Mnemosyne #1170) ═══ */
.spatial-search-overlay {
position: fixed;
top: 12px;
right: 12px;
z-index: 100;
display: flex;
flex-direction: column;
align-items: flex-end;
font-family: 'JetBrains Mono', monospace;
}
.spatial-search-input {
width: 260px;
padding: 8px 14px;
background: rgba(0, 0, 0, 0.65);
border: 1px solid rgba(74, 240, 192, 0.3);
border-radius: 6px;
color: #e0f0ff;
font-family: 'JetBrains Mono', monospace;
font-size: 13px;
outline: none;
backdrop-filter: blur(8px);
transition: border-color 0.2s, box-shadow 0.2s;
}
.spatial-search-input:focus {
border-color: rgba(74, 240, 192, 0.7);
box-shadow: 0 0 12px rgba(74, 240, 192, 0.15);
}
.spatial-search-input::placeholder {
color: rgba(224, 240, 255, 0.35);
}
.spatial-search-results {
margin-top: 4px;
max-height: 200px;
overflow-y: auto;
background: rgba(0, 0, 0, 0.55);
border: 1px solid rgba(74, 240, 192, 0.15);
border-radius: 4px;
font-size: 11px;
color: #a0c0d0;
width: 260px;
backdrop-filter: blur(8px);
display: none;
}
.spatial-search-results.visible {
display: block;
}
.spatial-search-result-item {
padding: 5px 10px;
cursor: pointer;
border-bottom: 1px solid rgba(74, 240, 192, 0.08);
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.spatial-search-result-item:hover {
background: rgba(74, 240, 192, 0.1);
color: #e0f0ff;
}
.spatial-search-result-item .result-region {
color: #4af0c0;
font-size: 9px;
margin-right: 6px;
}
.spatial-search-count {
padding: 4px 10px;
color: rgba(74, 240, 192, 0.6);
font-size: 10px;
border-bottom: 1px solid rgba(74, 240, 192, 0.1);
}

tests/test_browser_smoke.py Normal file

@@ -0,0 +1,293 @@
"""
Browser smoke tests for the Nexus 3D world.
Uses Playwright to verify the DOM contract, Three.js initialization,
portal loading, and loading screen flow.
Refs: #686
"""
import json
import os
import subprocess
import time
from pathlib import Path
import pytest
from playwright.sync_api import sync_playwright, expect
REPO_ROOT = Path(__file__).resolve().parent.parent
SCREENSHOT_DIR = REPO_ROOT / "test-screenshots"
# ---------------------------------------------------------------------------
# Fixtures
# ---------------------------------------------------------------------------
@pytest.fixture(scope="module")
def http_server():
"""Start a simple HTTP server for the Nexus static files."""
import functools
import http.server
import threading
port = int(os.environ.get("NEXUS_TEST_PORT", "9876"))
# Serve the repo root regardless of the pytest working directory
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=str(REPO_ROOT))
server = http.server.HTTPServer(("127.0.0.1", port), handler)
thread = threading.Thread(target=server.serve_forever, daemon=True)
thread.start()
time.sleep(0.3)
yield f"http://127.0.0.1:{port}"
server.shutdown()
@pytest.fixture(scope="module")
def browser_page(http_server):
"""Launch a headless browser and navigate to the Nexus."""
SCREENSHOT_DIR.mkdir(exist_ok=True)
with sync_playwright() as pw:
browser = pw.chromium.launch(
headless=True,
args=["--no-sandbox", "--disable-gpu"],
)
context = browser.new_context(
viewport={"width": 1280, "height": 720},
ignore_https_errors=True,
)
page = context.new_page()
# Collect console errors
console_errors = []
page.on("console", lambda msg: console_errors.append(msg.text) if msg.type == "error" else None)
page.goto(http_server, wait_until="domcontentloaded", timeout=30000)
page._console_errors = console_errors
yield page
browser.close()
# ---------------------------------------------------------------------------
# Static asset tests
# ---------------------------------------------------------------------------
class TestStaticAssets:
"""Verify all contract files are serveable."""
REQUIRED_FILES = [
"index.html",
"app.js",
"style.css",
"portals.json",
"vision.json",
"manifest.json",
"gofai_worker.js",
]
def test_index_html_served(self, http_server):
"""index.html must return 200."""
import urllib.request
resp = urllib.request.urlopen(f"{http_server}/index.html")
assert resp.status == 200
@pytest.mark.parametrize("filename", REQUIRED_FILES)
def test_contract_file_served(self, http_server, filename):
"""Each contract file must return 200."""
import urllib.request
try:
resp = urllib.request.urlopen(f"{http_server}/{filename}")
assert resp.status == 200
except Exception as e:
pytest.fail(f"{filename} not serveable: {e}")
# ---------------------------------------------------------------------------
# DOM contract tests
# ---------------------------------------------------------------------------
class TestDOMContract:
"""Verify required DOM elements exist after page load."""
REQUIRED_ELEMENTS = {
"nexus-canvas": "canvas",
"hud": "div",
"chat-panel": "div",
"chat-input": "input",
"chat-messages": "div",
"chat-send": "button",
"chat-toggle": "button",
"debug-overlay": "div",
"nav-mode-label": "span",
"ws-status-dot": "span",
"hud-location-text": "span",
"portal-hint": "div",
"spatial-search": "div",
}
@pytest.mark.parametrize("element_id,tag", list(REQUIRED_ELEMENTS.items()))
def test_element_exists(self, browser_page, element_id, tag):
"""Element with given ID must exist in the DOM."""
el = browser_page.query_selector(f"#{element_id}")
assert el is not None, f"#{element_id} ({tag}) missing from DOM"
def test_canvas_has_webgl(self, browser_page):
"""The nexus-canvas must have a WebGL rendering context."""
has_webgl = browser_page.evaluate("""
() => {
const c = document.getElementById('nexus-canvas');
if (!c) return false;
const ctx = c.getContext('webgl2') || c.getContext('webgl');
return ctx !== null;
}
""")
assert has_webgl, "nexus-canvas has no WebGL context"
def test_title_contains_nexus(self, browser_page):
"""Page title should reference The Nexus."""
title = browser_page.title()
assert "nexus" in title.lower() or "timmy" in title.lower(), f"Unexpected title: {title}"
# ---------------------------------------------------------------------------
# Loading flow tests
# ---------------------------------------------------------------------------
class TestLoadingFlow:
"""Verify the loading screen → enter prompt → HUD flow."""
def test_loading_screen_transitions(self, browser_page):
"""Loading screen should fade out and HUD should become visible."""
# Wait for loading to complete and enter prompt to appear
try:
browser_page.wait_for_selector("#enter-prompt", state="visible", timeout=15000)
except Exception:
# Enter prompt may have already appeared and been clicked
pass
# Try clicking the enter prompt if it exists
enter = browser_page.query_selector("#enter-prompt")
if enter and enter.is_visible():
enter.click()
time.sleep(1)
# HUD should now be visible
hud = browser_page.query_selector("#hud")
assert hud is not None, "HUD element missing"
# After enter, HUD display should not be 'none'
display = browser_page.evaluate("() => document.getElementById('hud').style.display")
assert display != "none", "HUD should be visible after entering"
# ---------------------------------------------------------------------------
# Three.js initialization tests
# ---------------------------------------------------------------------------
class TestThreeJSInit:
"""Verify Three.js initialized properly."""
def test_three_loaded(self, browser_page):
"""THREE namespace should be available (via import map)."""
# Three.js is loaded as ES module, check for canvas context instead
has_canvas = browser_page.evaluate("""
() => {
const c = document.getElementById('nexus-canvas');
return c && c.width > 0 && c.height > 0;
}
""")
assert has_canvas, "Canvas not properly initialized"
def test_canvas_dimensions(self, browser_page):
"""Canvas should fill the viewport."""
dims = browser_page.evaluate("""
() => {
const c = document.getElementById('nexus-canvas');
return { width: c.width, height: c.height, ww: window.innerWidth, wh: window.innerHeight };
}
""")
assert dims["width"] > 0, "Canvas width is 0"
assert dims["height"] > 0, "Canvas height is 0"
# ---------------------------------------------------------------------------
# Data contract tests
# ---------------------------------------------------------------------------
class TestDataContract:
"""Verify JSON data files are valid and well-formed."""
def test_portals_json_valid(self):
"""portals.json must parse as a non-empty JSON array."""
data = json.loads((REPO_ROOT / "portals.json").read_text())
assert isinstance(data, list), "portals.json must be an array"
assert len(data) > 0, "portals.json must have at least one portal"
def test_portals_have_required_fields(self):
"""Each portal must have id, name, status, destination."""
data = json.loads((REPO_ROOT / "portals.json").read_text())
required = {"id", "name", "status", "destination"}
for i, portal in enumerate(data):
missing = required - set(portal.keys())
assert not missing, f"Portal {i} missing fields: {missing}"
def test_vision_json_valid(self):
"""vision.json must parse as valid JSON."""
data = json.loads((REPO_ROOT / "vision.json").read_text())
assert data is not None
def test_manifest_json_valid(self):
"""manifest.json must have required PWA fields."""
data = json.loads((REPO_ROOT / "manifest.json").read_text())
for key in ["name", "start_url", "theme_color"]:
assert key in data, f"manifest.json missing '{key}'"
# ---------------------------------------------------------------------------
# Screenshot / visual proof
# ---------------------------------------------------------------------------
class TestVisualProof:
"""Capture screenshots as visual validation evidence."""
def test_screenshot_initial_state(self, browser_page):
"""Take a screenshot of the initial page state."""
path = SCREENSHOT_DIR / "smoke-initial.png"
browser_page.screenshot(path=str(path))
assert path.exists(), "Screenshot was not saved"
assert path.stat().st_size > 1000, "Screenshot seems empty"
def test_screenshot_after_enter(self, browser_page):
"""Take a screenshot after clicking through the enter prompt."""
enter = browser_page.query_selector("#enter-prompt")
if enter and enter.is_visible():
enter.click()
time.sleep(2)
else:
time.sleep(1)
path = SCREENSHOT_DIR / "smoke-post-enter.png"
browser_page.screenshot(path=str(path))
assert path.exists()
def test_screenshot_fullscreen(self, browser_page):
"""Full-page screenshot for visual regression baseline."""
path = SCREENSHOT_DIR / "smoke-fullscreen.png"
browser_page.screenshot(path=str(path), full_page=True)
assert path.exists()
# ---------------------------------------------------------------------------
# Provenance in browser context
# ---------------------------------------------------------------------------
class TestBrowserProvenance:
"""Verify provenance from within the browser context."""
def test_page_served_from_correct_origin(self, http_server):
"""The page must be served from localhost, not a stale remote."""
import urllib.request
resp = urllib.request.urlopen(f"{http_server}/index.html")
content = resp.read().decode("utf-8", errors="replace")
# Must not contain references to legacy matrix path
assert "/Users/apayne/the-matrix" not in content, \
"index.html references legacy matrix path — provenance violation"
def test_index_html_has_nexus_title(self, http_server):
"""index.html title must reference The Nexus."""
import urllib.request
resp = urllib.request.urlopen(f"{http_server}/index.html")
content = resp.read().decode("utf-8", errors="replace")
assert "<title>The Nexus" in content or "Timmy" in content, \
"index.html title does not reference The Nexus"
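The DOM contract `TestDOMContract` parametrizes over can also be checked from inside the page. A sketch in plain JS — `REQUIRED_IDS` mirrors a subset of the IDs the test covers, and `missingIds` is a hypothetical helper, not part of app.js:

```javascript
// Subset of the element IDs required by TestDOMContract.
const REQUIRED_IDS = [
  'nexus-canvas', 'hud', 'chat-panel', 'chat-input',
  'chat-messages', 'debug-overlay', 'spatial-search',
];

// Return the IDs that are absent from the given document.
function missingIds(doc) {
  return REQUIRED_IDS.filter(id => !doc.getElementById(id));
}

// Stub document for illustration; in the browser, pass `document`.
const stubDoc = { getElementById: id => (id === 'hud' ? {} : null) };
console.log(missingIds(stubDoc)); // every required ID except 'hud'
```

Running this via `page.evaluate` would give the Playwright suite a single round-trip instead of one `query_selector` call per element.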

tests/test_provenance.py Normal file

@@ -0,0 +1,73 @@
"""
Provenance tests — verify the Nexus browser surface comes from
a clean Timmy_Foundation/the-nexus checkout, not stale sources.
Refs: #686
"""
import json
import hashlib
from pathlib import Path
REPO_ROOT = Path(__file__).resolve().parent.parent
def test_provenance_manifest_exists() -> None:
"""provenance.json must exist and be valid JSON."""
p = REPO_ROOT / "provenance.json"
assert p.exists(), "provenance.json missing — run bin/generate_provenance.py"
data = json.loads(p.read_text())
assert "files" in data
assert "repo" in data
def test_provenance_repo_identity() -> None:
"""Manifest must claim Timmy_Foundation/the-nexus."""
data = json.loads((REPO_ROOT / "provenance.json").read_text())
assert data["repo"] == "Timmy_Foundation/the-nexus"
def test_provenance_all_contract_files_present() -> None:
"""Every file listed in the provenance manifest must exist on disk."""
data = json.loads((REPO_ROOT / "provenance.json").read_text())
missing = []
for rel in data["files"]:
if not (REPO_ROOT / rel).exists():
missing.append(rel)
assert not missing, f"Contract files missing: {missing}"
def test_provenance_hashes_match() -> None:
"""File hashes must match the stored manifest (no stale/modified files)."""
data = json.loads((REPO_ROOT / "provenance.json").read_text())
mismatches = []
for rel, meta in data["files"].items():
p = REPO_ROOT / rel
if not p.exists():
mismatches.append(f"MISSING: {rel}")
continue
actual = hashlib.sha256(p.read_bytes()).hexdigest()
if actual != meta["sha256"]:
mismatches.append(f"CHANGED: {rel}")
assert not mismatches, "Provenance mismatch:\n" + "\n".join(mismatches)
def test_no_legacy_matrix_references_in_frontend() -> None:
"""Frontend files must not reference /Users/apayne/the-matrix as a source."""
forbidden_paths = ["/Users/apayne/the-matrix"]
offenders = []
for rel in ["index.html", "app.js", "style.css"]:
p = REPO_ROOT / rel
if p.exists():
content = p.read_text()
for bad in forbidden_paths:
if bad in content:
offenders.append(f"{rel} references {bad}")
assert not offenders, f"Legacy matrix references found: {offenders}"
def test_no_stale_perplexity_computer_references_in_critical_files() -> None:
"""Verify the provenance generator script itself is canonical."""
script = REPO_ROOT / "bin" / "generate_provenance.py"
assert script.exists(), "bin/generate_provenance.py must exist"
content = script.read_text()
assert "Timmy_Foundation/the-nexus" in content