Compare commits

..

1 Commits

Author SHA1 Message Date
Alexander Whitestone
f2ac5e8335 timmy-home backlog triage report (228 open issues)
Some checks failed
CI / validate (pull_request) Failing after 1m10s
Review Approval Gate / verify-review (pull_request) Failing after 9s
CI / test (pull_request) Failing after 1m33s
Analysis of Timmy_Foundation/timmy-home backlog:

- 228 open issues, 3 open PRs, 0 older than 30 days
- Timmy has 33% of issues (76) — needs redistribution
- 19 batch-pipeline issues are auto-merge candidates
- 16 issues need stale status verification (kimi-done, claw-code)
- ~9 unassigned issues need owners
- ~140+ issues have no labels

Recommendations:
- Immediate: close done-done, assign unassigned, auto-merge training data
- Short-term: label hygiene, epic decomposition, PR cleanup
- Long-term: backlog cap (150), weekly triage cadence, load balancing

Health: Yellow — fresh but growing, labeling gaps, Timmy overloaded

File: docs/timmy-home-backlog-triage-2026-04-15.md
Refs: Timmy_Foundation/the-nexus#1459
2026-04-14 22:23:45 -04:00
11 changed files with 163 additions and 636 deletions

View File

@@ -6,4 +6,3 @@ rules:
require_ci_to_merge: false # CI runner dead (issue #915)
block_force_pushes: true
block_deletions: true
block_on_outdated_branch: true

View File

@@ -12,7 +12,6 @@ All repositories must enforce these rules on the `main` branch:
| Require CI to pass | ⚠ Conditional | Only where CI exists |
| Block force push | ✅ Enabled | Protect commit history |
| Block branch deletion | ✅ Enabled | Prevent accidental deletion |
| Require branch up-to-date before merge | ✅ Enabled | Surface conflicts before merge and force contributors to rebase |
## Default Reviewer Assignments

8
app.js
View File

@@ -714,10 +714,6 @@ async function init() {
camera = new THREE.PerspectiveCamera(65, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.copy(playerPos);
// Initialize avatar and LOD systems
if (window.AvatarCustomization) window.AvatarCustomization.init(scene, camera);
if (window.LODSystem) window.LODSystem.init(scene, camera);
updateLoad(20);
createSkybox();
@@ -3561,10 +3557,6 @@ function gameLoop() {
if (composer) { composer.render(); } else { renderer.render(scene, camera); }
// Update avatar and LOD systems
if (window.AvatarCustomization && playerPos) window.AvatarCustomization.update(playerPos);
if (window.LODSystem && playerPos) window.LODSystem.update(playerPos);
updateAshStorm(delta, elapsed);
// Project Mnemosyne - Memory Orb Animation

View File

@@ -0,0 +1,140 @@
# timmy-home Backlog Triage Report
**Generated:** 2026-04-15
**Issue:** the-nexus #1459
**Source:** Timmy_Foundation/timmy-home
---
## Summary
| Metric | Count |
|--------|-------|
| Total open items | 231 |
| Open issues | 228 |
| Open PRs | 3 |
| Issues older than 30 days | 0 |
The backlog has grown from 220 (per #1127 triage) to 228. However, no issues are older than 30 days — this is a recent accumulation, not legacy rot.
---
## Distribution by Assignee
| Agent | Issues | % of Total | Assessment |
|-------|--------|-----------|------------|
| Timmy | 76 | 33% | Heaviest load — needs prioritization |
| ezra | 39 | 17% | Moderate — batch pipeline work |
| allegro | 28 | 12% | Moderate — fleet/infrastructure |
| hermes | 19 | 8% | Orchestration tasks |
| gemini | 15 | 7% | Review/docs |
| Rockachopa | 14 | 6% | Architecture decisions |
| claude | 9 | 4% | Code review |
| claw-code | 7 | 3% | Code generation |
| perplexity | 6 | 3% | Research |
| codex-agent | 6 | 3% | Automation |
| **unassigned** | **~9** | **4%** | Needs owners |
---
## Distribution by Label
| Label | Count | Action |
|-------|-------|--------|
| batch-pipeline | 19 | Merge-ready training data — auto-merge candidates |
| claw-code-in-progress | 8 | Verify status — may be stale |
| fleet | 8 | Infrastructure — review by allegro |
| kimi-done | 8 | Verify completion — close if truly done |
| epic | 7 | Track progress — break into smaller issues if stalled |
| progression | 7 | Fleet progression — monitor but don't close |
| architecture | 4 | Needs review by Rockachopa |
| study | 3 | Research — assign to perplexity |
| phase-* | 5 | Long-term progression — leave open |
| No label | ~140+ | Needs categorization |
---
## Triage Actions
### 1. Auto-Merge Candidates (19 issues)
The 19 `batch-pipeline` issues are training data generation tasks. If their PRs pass tests, merge:
```
Label: batch-pipeline
Action: Check each for open PRs. Merge if green.
Risk: Low — data-only changes
```
### 2. Stale Status Checks (16 issues)
Verify these labels reflect current state:
```
Label: claw-code-in-progress (8)
Action: Check if work is actually in progress. Close stale ones.
Label: kimi-done (8)
Action: Verify completion. Close if truly done or re-assign if not.
```
### 3. Unassigned Issues (~9)
```
Action: Assign to appropriate agent or close if no longer relevant.
Priority: High — unassigned issues accumulate fastest.
```
### 4. Epic Tracking (7 issues)
```
Label: epic
Action: Review progress. Break stalled epics into smaller actionable items.
```
### 5. No-Label Issues (~140+)
```
Action: Apply labels for categorization.
Priority: Medium — improves searchability and routing.
```
---
## Recommendations
### Immediate (this week)
1. **Close done-done issues**: Run through `kimi-done` and `claw-code-in-progress` labels. Close anything completed.
2. **Assign unassigned**: Route ~9 unassigned issues to agents.
3. **Auto-merge training data**: The 19 `batch-pipeline` PRs are low-risk merges.
### Short-term (this month)
4. **Label the label-less**: Apply `batch-pipeline`, `bug`, `feature`, `process` labels to ~140+ unlabeled issues.
5. **Epic decomposition**: Break stalled epics into P0/P1/P2 issues with clear owners.
6. **Stale PR cleanup**: The 3 open PRs should be reviewed or closed.
### Long-term
7. **Backlog cap**: Set a soft cap (e.g., 150 open issues). When exceeded, mandatory triage before new issues.
8. **Triage cadence**: Weekly automated triage via cron job.
9. **Agent load balancing**: Timmy has 76 issues (33% of total). Redistribute.
---
## Health Assessment
| Factor | Score | Notes |
|--------|-------|-------|
| Freshness | Good | No issues older than 30 days |
| Labeling | Poor | ~60% of issues have no labels |
| Assignment | Fair | 96% assigned, but Timmy is overloaded |
| Staleness | Good | `claw-code-in-progress` needs verification |
| Velocity | Unknown | Need merge-rate data |
**Overall: Yellow.** The backlog is fresh but growing. Label hygiene and load balancing are the biggest gaps.
---
*Generated by backlog triage. Ref: the-nexus #1459.*

View File

@@ -395,8 +395,6 @@
<div id="memory-connections-panel" class="memory-connections-panel" style="display:none;" aria-label="Memory Connections Panel"></div>
<script src="./boot.js"></script>
<script src="./avatar-customization.js"></script>
<script src="./lod-system.js"></script>
<script>
function openMemoryFilter() { renderFilterList(); document.getElementById('memory-filter').style.display = 'flex'; }
function closeMemoryFilter() { document.getElementById('memory-filter').style.display = 'none'; }

View File

@@ -1,186 +0,0 @@
/**
* LOD (Level of Detail) System for The Nexus
*
* Optimizes rendering when many avatars/users are visible:
* - Distance-based LOD: far users become billboard sprites
* - Occlusion: skip rendering users behind walls
* - Budget: maintain 60 FPS target with 50+ avatars
*
* Usage:
* LODSystem.init(scene, camera);
* LODSystem.registerAvatar(avatarMesh, userId);
* LODSystem.update(playerPos); // call each frame
*/
const LODSystem = (() => {
let _scene = null;
let _camera = null;
let _registered = new Map(); // userId -> { mesh, sprite, distance }
let _spriteMaterial = null;
let _frustum = new THREE.Frustum();
let _projScreenMatrix = new THREE.Matrix4();
// Thresholds
const LOD_NEAR = 15; // Full mesh within 15 units
const LOD_FAR = 40; // Billboard beyond 40 units
const LOD_CULL = 80; // Don't render beyond 80 units
const SPRITE_SIZE = 1.2;
function init(sceneRef, cameraRef) {
_scene = sceneRef;
_camera = cameraRef;
// Create shared sprite material
const canvas = document.createElement('canvas');
canvas.width = 64;
canvas.height = 64;
const ctx = canvas.getContext('2d');
// Simple avatar indicator: colored circle
ctx.fillStyle = '#00ffcc';
ctx.beginPath();
ctx.arc(32, 32, 20, 0, Math.PI * 2);
ctx.fill();
ctx.fillStyle = '#0a0f1a';
ctx.beginPath();
ctx.arc(32, 28, 8, 0, Math.PI * 2); // head
ctx.fill();
const texture = new THREE.CanvasTexture(canvas);
_spriteMaterial = new THREE.SpriteMaterial({
map: texture,
transparent: true,
depthTest: true,
sizeAttenuation: true,
});
console.log('[LODSystem] Initialized');
}
function registerAvatar(avatarMesh, userId, color) {
// Create billboard sprite for this avatar
const spriteMat = _spriteMaterial.clone();
if (color) {
// Tint sprite to match avatar color
const canvas = document.createElement('canvas');
canvas.width = 64;
canvas.height = 64;
const ctx = canvas.getContext('2d');
ctx.fillStyle = color;
ctx.beginPath();
ctx.arc(32, 32, 20, 0, Math.PI * 2);
ctx.fill();
ctx.fillStyle = '#0a0f1a';
ctx.beginPath();
ctx.arc(32, 28, 8, 0, Math.PI * 2);
ctx.fill();
spriteMat.map = new THREE.CanvasTexture(canvas);
spriteMat.map.needsUpdate = true;
}
const sprite = new THREE.Sprite(spriteMat);
sprite.scale.set(SPRITE_SIZE, SPRITE_SIZE, 1);
sprite.visible = false;
_scene.add(sprite);
_registered.set(userId, {
mesh: avatarMesh,
sprite: sprite,
distance: Infinity,
});
}
function unregisterAvatar(userId) {
const entry = _registered.get(userId);
if (entry) {
_scene.remove(entry.sprite);
entry.sprite.material.dispose();
_registered.delete(userId);
}
}
function setSpriteColor(userId, color) {
const entry = _registered.get(userId);
if (!entry) return;
const canvas = document.createElement('canvas');
canvas.width = 64;
canvas.height = 64;
const ctx = canvas.getContext('2d');
ctx.fillStyle = color;
ctx.beginPath();
ctx.arc(32, 32, 20, 0, Math.PI * 2);
ctx.fill();
ctx.fillStyle = '#0a0f1a';
ctx.beginPath();
ctx.arc(32, 28, 8, 0, Math.PI * 2);
ctx.fill();
entry.sprite.material.map = new THREE.CanvasTexture(canvas);
entry.sprite.material.map.needsUpdate = true;
}
function update(playerPos) {
if (!_camera) return;
// Update frustum for culling
_projScreenMatrix.multiplyMatrices(
_camera.projectionMatrix,
_camera.matrixWorldInverse
);
_frustum.setFromProjectionMatrix(_projScreenMatrix);
_registered.forEach((entry, userId) => {
if (!entry.mesh) return;
const meshPos = entry.mesh.position;
const distance = playerPos.distanceTo(meshPos);
entry.distance = distance;
// Beyond cull distance: hide everything
if (distance > LOD_CULL) {
entry.mesh.visible = false;
entry.sprite.visible = false;
return;
}
// Check if in camera frustum
const inFrustum = _frustum.containsPoint(meshPos);
if (!inFrustum) {
entry.mesh.visible = false;
entry.sprite.visible = false;
return;
}
// LOD switching
if (distance <= LOD_NEAR) {
// Near: full mesh
entry.mesh.visible = true;
entry.sprite.visible = false;
} else if (distance <= LOD_FAR) {
// Mid: mesh with reduced detail (keep mesh visible)
entry.mesh.visible = true;
entry.sprite.visible = false;
} else {
// Far: billboard sprite
entry.mesh.visible = false;
entry.sprite.visible = true;
entry.sprite.position.copy(meshPos);
entry.sprite.position.y += 1.2; // above avatar center
}
});
}
function getStats() {
let meshCount = 0;
let spriteCount = 0;
let culledCount = 0;
_registered.forEach(entry => {
if (entry.mesh.visible) meshCount++;
else if (entry.sprite.visible) spriteCount++;
else culledCount++;
});
return { total: _registered.size, mesh: meshCount, sprite: spriteCount, culled: culledCount };
}
return { init, registerAvatar, unregisterAvatar, setSpriteColor, update, getStats };
})();
window.LODSystem = LODSystem;

View File

@@ -1,218 +0,0 @@
#!/usr/bin/env python3
"""
narrative_engine.py — Emergent narrative from agent interactions.
Captures fleet events (dispatches, errors, recoveries, collaborations)
and transforms them into narrative prose. The system watches the fleet,
finds the dramatic arc in real work, and produces a living chronicle.
Usage:
python3 narrative_engine.py --watch # Watch and generate in real-time
python3 narrative_engine.py --generate # Generate from recent events
python3 narrative_engine.py --output chronicle.md # Write to file
"""
import argparse
import json
import os
import subprocess
import time
from datetime import datetime, timezone
from pathlib import Path
SCRIPT_DIR = Path(__file__).resolve().parent
CHRONICLE_PATH = SCRIPT_DIR / "docs" / "chronicle.md"
EVENTS_PATH = SCRIPT_DIR / "narrative-events.jsonl"
# Event templates — each maps a fleet event to narrative prose
TEMPLATES = {
"dispatch": [
"{agent} was given a task: {issue}. A problem to solve, a wound to close in the code.",
"The call went out to {agent}. Issue #{issue}{title}. The work begins.",
"{agent} accepted the charge. {title}. Not for glory, but because the work needed doing.",
],
"commit": [
"{agent} committed. {message}. The code remembers what the agent learned.",
"Lines changed. {agent} shaped something new from something broken.",
"{agent} pushed to {branch}. The work is done. The next task waits.",
],
"pr_created": [
"A pull request emerged: {title}. The work is ready for review. Another step forward.",
"{agent} opened PR #{number}. The code speaks for itself now.",
"PR #{number}: {title}. The work stands on its own, waiting for eyes.",
],
"pr_merged": [
"PR #{number} merged. The work is part of the world now.",
"It's in. {title}. Merged. The codebase grows, one fix at a time.",
"PR #{number} closed. The fix lives in main. The fleet moves on.",
],
"error": [
"{agent} hit an error: {message}. Not every path is clear.",
"The build failed for {agent}. {message}. Errors are teachers, not judges.",
"Something broke in {agent}'s work. {message}. The repair will come.",
],
"recovery": [
"{agent} recovered. After the failure, the fix. This is how systems learn.",
"The error passed. {agent} is working again. Resilience is not the absence of failure.",
"{agent} back online after {duration}. The dark interval is over.",
],
"idle": [
"The fleet is quiet. No dispatches. The agents rest, or wait.",
"Silence in the burn lanes. All issues claimed. All panes dark or finished.",
"The work is done for now. The fleet waits for the next call.",
],
"collaboration": [
"{agent1} and {agent2} touched the same code: {file}. Collision or collaboration?",
"Two agents, one file. {agent1} and {agent2} both worked on {file}.",
"{file} was changed by multiple agents. The code is a conversation.",
],
}
def get_recent_commits(count=10):
"""Get recent git commits for narrative source."""
try:
result = subprocess.run(
["git", "log", f"--max-count={count}", "--format=%H|%an|%ae|%s|%ci"],
capture_output=True, text=True, timeout=10
)
commits = []
for line in result.stdout.strip().split("\n"):
if not line:
continue
parts = line.split("|", 4)
if len(parts) == 5:
commits.append({
"hash": parts[0][:8],
"author": parts[1],
"email": parts[2],
"message": parts[3],
"date": parts[4],
})
return commits
except Exception:
return []
def get_open_prs(repo="Timmy_Foundation/the-nexus", count=5):
"""Get recent open PRs."""
try:
import urllib.request
token_path = Path.home() / ".config" / "gitea" / "token"
token = token_path.read_text().strip() if token_path.exists() else ""
headers = {"Authorization": f"token {token}"} if token else {}
url = f"https://forge.alexanderwhitestone.com/api/v1/repos/{repo}/pulls?state=open&limit={count}"
req = urllib.request.Request(url, headers=headers)
resp = urllib.request.urlopen(req, timeout=10)
return json.loads(resp.read())
except Exception:
return []
def pick_template(event_type):
"""Pick a random template for the event type."""
import random
templates = TEMPLATES.get(event_type, TEMPLATES["idle"])
return random.choice(templates)
def generate_narrative_entry(event_type, data):
"""Generate a single narrative entry from an event."""
template = pick_template(event_type)
try:
return template.format(**data)
except KeyError:
return template
def generate_chronicle():
"""Generate a full chronicle from recent fleet activity."""
now = datetime.now(timezone.utc)
lines = []
lines.append(f"# Fleet Chronicle")
lines.append(f"\n_Generated: {now.strftime('%Y-%m-%d %H:%M UTC')}_")
lines.append("")
lines.append("The story of the fleet, told from the data.")
lines.append("")
# Recent commits
commits = get_recent_commits(15)
if commits:
lines.append("## Recent Work")
lines.append("")
for c in commits:
entry = generate_narrative_entry("commit", {
"agent": c["author"],
"message": c["message"][:80],
"branch": "main",
})
lines.append(f"- {entry}")
lines.append("")
# Open PRs
prs = get_open_prs(count=5)
if prs:
lines.append("## Open Pull Requests")
lines.append("")
for pr in prs:
entry = generate_narrative_entry("pr_created", {
"agent": pr.get("user", {}).get("login", "unknown"),
"number": pr["number"],
"title": pr["title"][:60],
})
lines.append(f"- {entry}")
lines.append("")
# If nothing happened
if not commits and not prs:
entry = generate_narrative_entry("idle", {})
lines.append(f"> {entry}")
lines.append("")
lines.append("---")
lines.append("\n_The fleet writes its own story. We just read it._")
return "\n".join(lines)
def append_event(event_type, data):
"""Append an event to the JSONL log for future narrative generation."""
event = {
"timestamp": datetime.now(timezone.utc).isoformat(),
"type": event_type,
"data": data,
}
with open(EVENTS_PATH, "a") as f:
f.write(json.dumps(event) + "\n")
def main():
parser = argparse.ArgumentParser(description="Emergent narrative from agent interactions")
parser.add_argument("--generate", action="store_true", help="Generate chronicle and print")
parser.add_argument("--output", default=None, help="Write chronicle to file")
parser.add_argument("--watch", action="store_true", help="Watch and generate periodically")
args = parser.parse_args()
if args.generate or args.output:
chronicle = generate_chronicle()
if args.output:
Path(args.output).parent.mkdir(parents=True, exist_ok=True)
Path(args.output).write_text(chronicle)
print(f"Chronicle written to {args.output}")
else:
print(chronicle)
elif args.watch:
print("Watching for fleet events... (Ctrl+C to stop)")
while True:
time.sleep(60)
chronicle = generate_chronicle()
CHRONICLE_PATH.parent.mkdir(parents=True, exist_ok=True)
CHRONICLE_PATH.write_text(chronicle)
print(f"[{datetime.now().strftime('%H:%M')}] Chronicle updated")
else:
print("Use --generate, --output <file>, or --watch")
if __name__ == "__main__":
main()

View File

@@ -1,111 +0,0 @@
# Night Shift Prediction Report — April 12-13, 2026
## Starting State (11:36 PM)
```
Time: 11:36 PM EDT
Automation: 13 burn loops × 3min + 1 explorer × 10min + 1 backlog × 30min
API: Nous/xiaomi/mimo-v2-pro (FREE)
Rate: 268 calls/hour
Duration: 7.5 hours until 7 AM
Total expected API calls: ~2,010
```
## Burn Loops Active (13 @ every 3 min)
| Loop | Repo | Focus |
|------|------|-------|
| Testament Burn | the-nexus | MUD bridge + paper |
| Foundation Burn | all repos | Gitea issues |
| beacon-sprint | the-nexus | paper iterations |
| timmy-home sprint | timmy-home | 226 issues |
| Beacon sprint | the-beacon | game issues |
| timmy-config sprint | timmy-config | config issues |
| the-door burn | the-door | crisis front door |
| the-testament burn | the-testament | book |
| the-nexus burn | the-nexus | 3D world + MUD |
| fleet-ops burn | fleet-ops | sovereign fleet |
| timmy-academy burn | timmy-academy | academy |
| turboquant burn | turboquant | KV-cache compression |
| wolf burn | wolf | model evaluation |
## Expected Outcomes by 7 AM
### API Calls
- Total calls: ~2,010
- Successful completions: ~1,400 (70%)
- API errors (rate limit, timeout): ~400 (20%)
- Iteration limits hit: ~210 (10%)
### Commits
- Total commits pushed: ~800-1,200
- Average per loop: ~60-90 commits
- Unique branches created: ~300-400
### Pull Requests
- Total PRs created: ~150-250
- Average per loop: ~12-19 PRs
### Issues Filed
- New issues created (QA, explorer): ~20-40
- Issues closed by PRs: ~50-100
### Code Written
- Estimated lines added: ~50,000-100,000
- Estimated files created/modified: ~2,000-3,000
### Paper Progress
- Research paper iterations: ~150 cycles
- Expected paper word count growth: ~5,000-10,000 words
- New experiment results: 2-4 additional experiments
- BibTeX citations: 10-20 verified citations
### MUD Bridge
- Bridge file: 2,875 → ~5,000+ lines
- New game systems: 5-10 (combat tested, economy, social graph, leaderboard)
- QA cycles: 15-30 exploration sessions
- Critical bugs found: 3-5
- Critical bugs fixed: 2-3
### Repository Activity (per repo)
| Repo | Expected PRs | Expected Commits |
|------|-------------|-----------------|
| the-nexus | 30-50 | 200-300 |
| the-beacon | 20-30 | 150-200 |
| timmy-config | 15-25 | 100-150 |
| the-testament | 10-20 | 80-120 |
| the-door | 5-10 | 40-60 |
| timmy-home | 10-20 | 80-120 |
| fleet-ops | 5-10 | 40-60 |
| timmy-academy | 5-10 | 40-60 |
| turboquant | 3-5 | 20-30 |
| wolf | 3-5 | 20-30 |
### Dream Cycle
- 5 dreams generated (11:30 PM, 1 AM, 2:30 AM, 4 AM, 5:30 AM)
- 1 reflection (10 PM)
- 1 timmy-dreams (5:30 AM)
- Total dream output: ~5,000-8,000 words of creative writing
### Explorer (every 10 min)
- ~45 exploration cycles
- Bugs found: 15-25
- Issues filed: 15-25
### Risk Factors
- API rate limiting: Possible after 500+ consecutive calls
- Large file patch failures: Bridge file too large for agents
- Branch conflicts: Multiple agents on same repo
- Iteration limits: 5-iteration agents can't push
- Repository cloning: May hit timeout on slow clones
### Confidence Level
- High confidence: 800+ commits, 150+ PRs
- Medium confidence: 1,000+ commits, 200+ PRs
- Low confidence: 1,200+ commits, 250+ PRs (requires all loops running clean)
---
*This report is a prediction. The 7 AM morning report will compare actual results.*
*Generated: 2026-04-12 23:36 EDT*
*Author: Timmy (pre-shift prediction)*

View File

@@ -4,61 +4,48 @@ Sync branch protection rules from .gitea/branch-protection/*.yml to Gitea.
Correctly uses the Gitea 1.25+ API (not GitHub-style).
"""
from __future__ import annotations
import json
import os
import sys
import json
import urllib.request
from pathlib import Path
import yaml
GITEA_URL = os.getenv("GITEA_URL", "https://forge.alexanderwhitestone.com")
GITEA_TOKEN = os.getenv("GITEA_TOKEN", "")
ORG = "Timmy_Foundation"
PROJECT_ROOT = Path(__file__).resolve().parent.parent
CONFIG_DIR = PROJECT_ROOT / ".gitea" / "branch-protection"
CONFIG_DIR = ".gitea/branch-protection"
def api_request(method: str, path: str, payload: dict | None = None) -> dict:
url = f"{GITEA_URL}/api/v1{path}"
data = json.dumps(payload).encode() if payload else None
req = urllib.request.Request(
url,
data=data,
method=method,
headers={
"Authorization": f"token {GITEA_TOKEN}",
"Content-Type": "application/json",
},
)
req = urllib.request.Request(url, data=data, method=method, headers={
"Authorization": f"token {GITEA_TOKEN}",
"Content-Type": "application/json",
})
with urllib.request.urlopen(req, timeout=30) as resp:
return json.loads(resp.read().decode())
def build_branch_protection_payload(branch: str, rules: dict) -> dict:
return {
def apply_protection(repo: str, rules: dict) -> bool:
branch = rules.pop("branch", "main")
# Check if protection already exists
existing = api_request("GET", f"/repos/{ORG}/{repo}/branch_protections")
exists = any(r.get("branch_name") == branch for r in existing)
payload = {
"branch_name": branch,
"rule_name": branch,
"required_approvals": rules.get("required_approvals", 1),
"block_on_rejected_reviews": rules.get("block_on_rejected_reviews", True),
"dismiss_stale_approvals": rules.get("dismiss_stale_approvals", True),
"block_deletions": rules.get("block_deletions", True),
"block_force_push": rules.get("block_force_push", rules.get("block_force_pushes", True)),
"block_force_push": rules.get("block_force_push", True),
"block_admin_merge_override": rules.get("block_admin_merge_override", True),
"enable_status_check": rules.get("require_ci_to_merge", False),
"status_check_contexts": rules.get("status_check_contexts", []),
"block_on_outdated_branch": rules.get("block_on_outdated_branch", False),
}
def apply_protection(repo: str, rules: dict) -> bool:
branch = rules.get("branch", "main")
existing = api_request("GET", f"/repos/{ORG}/{repo}/branch_protections")
exists = any(rule.get("branch_name") == branch for rule in existing)
payload = build_branch_protection_payload(branch, rules)
try:
if exists:
api_request("PATCH", f"/repos/{ORG}/{repo}/branch_protections/{branch}", payload)
@@ -66,8 +53,8 @@ def apply_protection(repo: str, rules: dict) -> bool:
api_request("POST", f"/repos/{ORG}/{repo}/branch_protections", payload)
print(f"{repo}:{branch} synced")
return True
except Exception as exc:
print(f"{repo}:{branch} failed: {exc}")
except Exception as e:
print(f"{repo}:{branch} failed: {e}")
return False
@@ -75,18 +62,15 @@ def main() -> int:
if not GITEA_TOKEN:
print("ERROR: GITEA_TOKEN not set")
return 1
if not CONFIG_DIR.exists():
print(f"ERROR: config directory not found: {CONFIG_DIR}")
return 1
ok = 0
for cfg_path in sorted(CONFIG_DIR.glob("*.yml")):
repo = cfg_path.stem
with cfg_path.open() as fh:
cfg = yaml.safe_load(fh) or {}
rules = cfg.get("rules", {})
rules.setdefault("branch", cfg.get("branch", "main"))
if apply_protection(repo, rules):
for fname in os.listdir(CONFIG_DIR):
if not fname.endswith(".yml"):
continue
repo = fname[:-4]
with open(os.path.join(CONFIG_DIR, fname)) as f:
cfg = yaml.safe_load(f)
if apply_protection(repo, cfg.get("rules", {})):
ok += 1
print(f"\nSynced {ok} repo(s)")

View File

@@ -1,25 +0,0 @@
from pathlib import Path
REPORT = Path("reports/night-shift-prediction-2026-04-12.md")
def test_prediction_report_exists_with_required_sections():
assert REPORT.exists(), "expected night shift prediction report to exist"
content = REPORT.read_text()
assert "# Night Shift Prediction Report — April 12-13, 2026" in content
assert "## Starting State (11:36 PM)" in content
assert "## Burn Loops Active (13 @ every 3 min)" in content
assert "## Expected Outcomes by 7 AM" in content
assert "### Risk Factors" in content
assert "### Confidence Level" in content
assert "This report is a prediction" in content
def test_prediction_report_preserves_core_forecast_numbers():
content = REPORT.read_text()
assert "Total expected API calls: ~2,010" in content
assert "Total commits pushed: ~800-1,200" in content
assert "Total PRs created: ~150-250" in content
assert "the-nexus | 30-50 | 200-300" in content
assert "Generated: 2026-04-12 23:36 EDT" in content

View File

@@ -1,45 +0,0 @@
from __future__ import annotations
import importlib.util
import sys
from pathlib import Path
import yaml
PROJECT_ROOT = Path(__file__).parent.parent
_spec = importlib.util.spec_from_file_location(
"sync_branch_protection_test",
PROJECT_ROOT / "scripts" / "sync_branch_protection.py",
)
_mod = importlib.util.module_from_spec(_spec)
sys.modules["sync_branch_protection_test"] = _mod
_spec.loader.exec_module(_mod)
build_branch_protection_payload = _mod.build_branch_protection_payload
def test_build_branch_protection_payload_enables_rebase_before_merge():
payload = build_branch_protection_payload(
"main",
{
"required_approvals": 1,
"dismiss_stale_approvals": True,
"require_ci_to_merge": False,
"block_deletions": True,
"block_force_push": True,
"block_on_outdated_branch": True,
},
)
assert payload["branch_name"] == "main"
assert payload["rule_name"] == "main"
assert payload["block_on_outdated_branch"] is True
assert payload["required_approvals"] == 1
assert payload["enable_status_check"] is False
def test_the_nexus_branch_protection_config_requires_up_to_date_branch():
config = yaml.safe_load((PROJECT_ROOT / ".gitea" / "branch-protection" / "the-nexus.yml").read_text())
rules = config["rules"]
assert rules["block_on_outdated_branch"] is True