Compare commits


3 Commits

Alexander Whitestone
168cbb57c9  feat: ground zero-touch forge readiness (#912)  (2026-04-15 02:54:46 -04:00)
Some checks failed:
  CI / test (pull_request): Failing after 1m3s
  Review Approval Gate / verify-review (pull_request): Successful in 8s
  CI / validate (pull_request): Failing after 1m24s

bd0497b998  Merge PR #1585: docs: add night shift prediction report (#1353)  (2026-04-15 06:13:22 +00:00)

Alexander Whitestone
4ab84a59ab  docs: add night shift prediction report (#1353)  (2026-04-15 02:02:26 -04:00)
Some checks failed:
  CI / test (pull_request): Failing after 50s
  CI / validate (pull_request): Failing after 1m10s
  Review Approval Gate / verify-review (pull_request): Successful in 16s
10 changed files with 495 additions and 602 deletions

View File

@@ -1,72 +0,0 @@
# .gitea/workflows/duplicate-pr-check.yml
# CI workflow to check for duplicate PRs
name: Check for Duplicate PRs

on:
  pull_request:
    types: [opened, synchronize, reopened]

jobs:
  check-duplicates:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          # No additional dependencies needed
      - name: Check for duplicate PRs
        env:
          GITEA_TOKEN: ${{ secrets.GITEA_TOKEN }}
        run: |
          # Extract issue number from PR title or branch name
          PR_TITLE="${{ github.event.pull_request.title }}"
          BRANCH_NAME="${{ github.head_ref }}"

          # Try to extract issue number from title, then fall back to branch
          ISSUE_NUM=$(echo "$PR_TITLE" | grep -oE '#[0-9]+' | head -1 | tr -d '#')
          if [ -z "$ISSUE_NUM" ]; then
            ISSUE_NUM=$(echo "$BRANCH_NAME" | grep -oE '[0-9]+' | head -1)
          fi
          if [ -z "$ISSUE_NUM" ]; then
            echo "No issue number found in PR title or branch name"
            echo "Skipping duplicate check"
            exit 0
          fi

          echo "Checking for duplicate PRs for issue #$ISSUE_NUM"

          # Save token to file for the script (the script honors TOKEN_PATH)
          echo "$GITEA_TOKEN" > /tmp/gitea_token.txt
          export TOKEN_PATH=/tmp/gitea_token.txt

          # Run the duplicate checker. The step shell runs with -e, so test the
          # exit status in the `if` itself; a bare command followed by
          # `[ $? -ne 0 ]` would abort the step before the message prints.
          if ! python bin/duplicate_pr_prevention.py --repo the-nexus --issue "$ISSUE_NUM" --check; then
            echo ""
            echo "❌ Duplicate PRs detected for issue #$ISSUE_NUM"
            echo "This PR should be closed in favor of an existing one."
            echo ""
            echo "To see details, run:"
            echo "  python bin/duplicate_pr_prevention.py --repo the-nexus --issue $ISSUE_NUM --report"
            exit 1
          fi
          echo "✅ No duplicate PRs found"
      - name: Clean up
        if: always()
        run: |
          rm -f /tmp/gitea_token.txt

View File

@@ -1,241 +0,0 @@
# Duplicate PR Prevention System
**Issue:** #1460 - [META] I keep creating duplicate PRs for issue #1128
**Solution:** Comprehensive prevention system with tools, hooks, and CI checks
## Problem Statement
Issue #1460 describes a meta-problem: creating 7 duplicate PRs for issue #1128, which was itself about cleaning up duplicate PRs. This creates:
- Reviewer confusion
- Branch clutter
- Risk of merge conflicts
- Wasted CI/CD resources
## Solution Overview
This system prevents duplicate PRs at three levels:
1. **Local Prevention** — Git hooks that check before pushing
2. **CI/CD Prevention** — Workflows that check when PRs are created
3. **Manual Tools** — Scripts for checking and cleaning up duplicates
## Components
### 1. `bin/duplicate_pr_prevention.py`
Main prevention script with three modes:
**Check for duplicates:**
```bash
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
```
**Clean up duplicates:**
```bash
# Dry run (see what would be closed)
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup --dry-run
# Actually close duplicates
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
```
**Generate report:**
```bash
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
```
### 2. `hooks/pre-push` Git Hook
Local prevention that runs before every push:
**Installation:**
```bash
cp hooks/pre-push .git/hooks/pre-push
chmod +x .git/hooks/pre-push
```
**How it works:**
1. Extracts issue number from branch name (e.g., `fix/1128-something` → `1128`; see the sketch after this list)
2. Checks for existing PRs for that issue
3. Blocks push if duplicates found
4. Provides instructions for resolution
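The extraction in step 1 is the same `grep` pattern the hook itself uses: the first run of digits in the branch name becomes the issue number.
```bash
# First number in the branch name wins (same one-liner as hooks/pre-push)
$ echo "fix/1128-something" | grep -oE '[0-9]+' | head -1
1128
```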
### 3. `.gitea/workflows/duplicate-pr-check.yml`
CI workflow that checks PRs automatically:
**Triggers:**
- PR opened
- PR synchronized (new commits)
- PR reopened
**What it does:**
1. Extracts issue number from PR title or branch name (one-liner shown below)
2. Checks for existing PRs
3. Fails CI if duplicates found
4. Provides clear error message
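For a PR title, the workflow prefers an explicit `#NNN` reference and strips the `#`:
```bash
# Same extraction the workflow runs against the PR title
$ echo "docs: add night shift prediction report (#1353)" | grep -oE '#[0-9]+' | head -1 | tr -d '#'
1353
```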
## Usage Guide
### For Agents (AI Workers)
Before creating any PR:
```bash
# Step 1: Check for duplicates
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check
# Step 2: If safe (exit 0), create PR
# Step 3: If duplicates exist (exit 1), use existing PR instead
```
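Because `--check` exits `0` when the issue is clear and `1` when duplicates exist, the decision can be scripted directly; a minimal sketch:
```bash
# Exit status carries the verdict: 0 = safe, 1 = duplicates exist
if python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check; then
    echo "No duplicates: safe to create the PR"
else
    echo "Duplicates exist: reuse the existing PR instead"
fi
```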
### For Developers
Install the Git hook for automatic prevention:
```bash
# One-time setup
cp hooks/pre-push .git/hooks/pre-push
chmod +x .git/hooks/pre-push
# Now git push will automatically check for duplicates
git push # Will be blocked if duplicates exist
```
### For CI/CD
The workflow runs automatically on all PRs. No setup needed.
## Examples
### Check for duplicates:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
⚠️ Found 2 duplicate PR(s) for issue #1128:
- PR #1458: feat: Close duplicate PRs for issue #1128
- PR #1455: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
```
### Clean up duplicates:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
Cleanup complete:
Kept PR: #1458
Closed PRs: [1455]
```
### Generate report:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
# Duplicate PR Prevention Report
**Repository:** the-nexus
**Issue:** #1128
**Generated:** 2026-04-14T23:30:00
## Current Status
⚠️ **Found 2 duplicate PR(s)**
- **PR #1458**: feat: Close duplicate PRs for issue #1128
- Branch: fix/1128-cleanup
- Created: 2026-04-14T22:00:00
- Author: agent
- **PR #1455**: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
- Branch: triage/1128-1776129677
- Created: 2026-04-14T20:00:00
- Author: agent
## Recommendations
1. **Review existing PRs** — Check which one is the best solution
2. **Keep the newest** — Usually the most up-to-date
3. **Close duplicates** — Use duplicate_pr_prevention.py --cleanup
4. **Prevent future duplicates** — Use duplicate_pr_prevention.py --check
```
## Branch Naming Conventions
For automatic issue extraction, use these patterns:
- `fix/123-description` → Issue #123
- `burn/123-description` → Issue #123
- `ch/123-description` → Issue #123
- `feature/123-description` → Issue #123
If no issue number in branch name, the check is skipped.
## Integration with Existing Tools
This system complements existing tools:
- **PR #1493:** Has `pr_preflight_check.py` — similar functionality
- **PR #1497:** Has `check_duplicate_pr.py` — similar functionality
This system provides additional features:
1. **Git hooks** for local prevention
2. **CI workflows** for automated checking
3. **Cleanup tools** for closing duplicates
4. **Comprehensive reporting**
## Troubleshooting
### Hook not working?
```bash
# Check if hook is installed
ls -la .git/hooks/pre-push
# Make sure it's executable
chmod +x .git/hooks/pre-push
# Test it manually
./.git/hooks/pre-push
```
### CI failing?
1. Check if `GITEA_TOKEN` secret is set
2. Verify issue number can be extracted from PR title/branch
3. Check workflow logs for details
### False positives?
If the script incorrectly identifies duplicates:
1. Check PR titles and bodies for issue references
2. Use `--report` to see what's being detected
3. Manually close incorrect PRs if needed
## Prevention Strategy
### 1. **Always Check First**
```bash
# Before creating any PR
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check
```
### 2. **Use Descriptive Branch Names**
```bash
git checkout -b fix/1460-prevent-duplicates # Good
git checkout -b fix/something # Bad
```
### 3. **Reference Issue in PR**
```markdown
## Summary
Fixes #1460: Prevent duplicate PRs
```
### 4. **Review Before Creating**
```bash
# See what PRs already exist
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --report
```
## Related Issues
- **Issue #1460:** This implementation
- **Issue #1128:** Original issue that had 7 duplicate PRs
- **Issue #1449:** [URGENT] 5 duplicate PRs for issue #1128 need cleanup
- **Issue #1474:** [META] Still creating duplicate PRs for issue #1128 despite cleanup
- **Issue #1480:** [META] 4th duplicate PR for issue #1128 — need intervention
## Files
```
bin/duplicate_pr_prevention.py # Main prevention script
hooks/pre-push # Git hook for local prevention
.gitea/workflows/duplicate-pr-check.yml # CI workflow
DUPLICATE_PR_PREVENTION.md # This documentation
```
## License
Part of the Timmy Foundation project.

View File

@@ -1,230 +0,0 @@
#!/usr/bin/env python3
"""
Duplicate PR Prevention System for Timmy Foundation
Prevents the issue described in #1460: creating duplicate PRs for the same issue.
"""
import json
import os
import sys
import urllib.error
import urllib.request
from datetime import datetime
from typing import Any, Dict, Optional

# Configuration. The CI workflow overrides the token location via the
# TOKEN_PATH environment variable; fall back to the local config path.
GITEA_BASE = "https://forge.alexanderwhitestone.com/api/v1"
TOKEN_PATH = os.environ.get("TOKEN_PATH", os.path.expanduser("~/.config/gitea/token"))
ORG = "Timmy_Foundation"


class DuplicatePRPrevention:
    def __init__(self):
        self.token = self._load_token()

    def _load_token(self) -> str:
        """Load Gitea API token."""
        try:
            with open(TOKEN_PATH, "r") as f:
                return f.read().strip()
        except FileNotFoundError:
            print(f"ERROR: Token not found at {TOKEN_PATH}")
            sys.exit(1)

    def _api_request(self, endpoint: str, method: str = "GET", data: Optional[Dict] = None) -> Any:
        """Make an authenticated Gitea API request."""
        url = f"{GITEA_BASE}{endpoint}"
        headers = {
            "Authorization": f"token {self.token}",
            "Content-Type": "application/json"
        }
        req = urllib.request.Request(url, headers=headers, method=method)
        if data:
            req.data = json.dumps(data).encode()
        try:
            with urllib.request.urlopen(req) as resp:
                if resp.status == 204:  # No content
                    return {"status": "success", "code": resp.status}
                return json.loads(resp.read())
        except urllib.error.HTTPError as e:
            error_body = e.read().decode() if e.fp else "No error body"
            print(f"API Error {e.code}: {error_body}")
            return {"error": e.code, "message": error_body}

    def check_for_duplicate_prs(self, repo: str, issue_number: int) -> Dict[str, Any]:
        """Check for existing PRs that reference a specific issue."""
        # Get all open PRs
        endpoint = f"/repos/{ORG}/{repo}/pulls?state=open"
        prs = self._api_request(endpoint)
        if not isinstance(prs, list):
            return {"error": "Could not fetch PRs", "duplicates": []}

        duplicates = []
        for pr in prs:
            # Check if PR title or body references the issue
            title = pr.get('title', '').lower()
            body = pr.get('body', '').lower() if pr.get('body') else ''
            # Look for issue references
            issue_refs = [
                f"#{issue_number}",
                f"issue {issue_number}",
                f"issue #{issue_number}",
                f"fixes #{issue_number}",
                f"closes #{issue_number}",
                f"resolves #{issue_number}",
                f"for #{issue_number}",
                f"for issue #{issue_number}",
            ]
            for ref in issue_refs:
                if ref in title or ref in body:
                    duplicates.append({
                        'number': pr['number'],
                        'title': pr['title'],
                        'branch': pr['head']['ref'],
                        'created': pr['created_at'],
                        'user': pr['user']['login'],
                        'url': pr['html_url']
                    })
                    break
        return {
            "has_duplicates": len(duplicates) > 0,
            "count": len(duplicates),
            "duplicates": duplicates
        }

    def cleanup_duplicate_prs(self, repo: str, issue_number: int, dry_run: bool = True) -> Dict[str, Any]:
        """Close duplicate PRs for an issue, keeping the newest."""
        duplicates = self.check_for_duplicate_prs(repo, issue_number)
        if not duplicates["has_duplicates"]:
            return {"status": "no_duplicates", "closed": []}

        # Sort by creation date (newest first)
        sorted_prs = sorted(duplicates["duplicates"],
                            key=lambda x: x['created'],
                            reverse=True)
        # Keep the newest, close the rest
        to_keep = sorted_prs[0] if sorted_prs else None
        to_close = sorted_prs[1:] if len(sorted_prs) > 1 else []

        closed = []
        if not dry_run:  # in a dry run nothing is closed; `closed` stays empty
            for pr in to_close:
                # Add a comment explaining why the PR is being closed
                comment_data = {
                    "body": f"**Closing as duplicate** — This PR is a duplicate for issue #{issue_number}.\n\n"
                            f"Keeping PR #{to_keep['number']} instead.\n\n"
                            f"This is an automated cleanup to prevent duplicate PRs.\n"
                            f"See issue #1460 for context."
                }
                comment_endpoint = f"/repos/{ORG}/{repo}/issues/{pr['number']}/comments"
                self._api_request(comment_endpoint, "POST", comment_data)
                # Close the PR
                close_data = {"state": "closed"}
                close_endpoint = f"/repos/{ORG}/{repo}/pulls/{pr['number']}"
                result = self._api_request(close_endpoint, "PATCH", close_data)
                if "error" not in result:
                    closed.append(pr['number'])
        return {
            "status": "success",
            "kept": to_keep['number'] if to_keep else None,
            "closed": closed,
            "dry_run": dry_run
        }

    def generate_prevention_report(self, repo: str, issue_number: int) -> str:
        """Generate a report on duplicate prevention status."""
        report = "# Duplicate PR Prevention Report\n\n"
        report += f"**Repository:** {repo}\n"
        report += f"**Issue:** #{issue_number}\n"
        report += f"**Generated:** {datetime.now().isoformat()}\n\n"

        # Check for duplicates
        duplicates = self.check_for_duplicate_prs(repo, issue_number)
        report += "## Current Status\n\n"
        if duplicates["has_duplicates"]:
            report += f"⚠️ **Found {duplicates['count']} duplicate PR(s)**\n\n"
            for dup in duplicates["duplicates"]:
                report += f"- **PR #{dup['number']}**: {dup['title']}\n"
                report += f"  - Branch: {dup['branch']}\n"
                report += f"  - Created: {dup['created']}\n"
                report += f"  - Author: {dup['user']}\n"
                report += f"  - URL: {dup['url']}\n\n"
        else:
            report += "✅ **No duplicate PRs found**\n\n"

        # Recommendations
        report += "## Recommendations\n\n"
        if duplicates["has_duplicates"]:
            report += "1. **Review existing PRs** — Check which one is the best solution\n"
            report += "2. **Keep the newest** — Usually the most up-to-date\n"
            report += "3. **Close duplicates** — Use duplicate_pr_prevention.py --cleanup\n"
            report += "4. **Prevent future duplicates** — Use duplicate_pr_prevention.py --check\n"
        else:
            report += "1. **Safe to create PR** — No duplicates exist\n"
            report += "2. **Use prevention tools** — Always check before creating PRs\n"
            report += "3. **Install hooks** — Use Git hooks for automatic prevention\n"
        return report


def main():
    """Main entry point."""
    import argparse
    parser = argparse.ArgumentParser(description="Duplicate PR Prevention System")
    parser.add_argument("--repo", required=True, help="Repository name (e.g., the-nexus)")
    parser.add_argument("--issue", required=True, type=int, help="Issue number")
    parser.add_argument("--check", action="store_true", help="Check for duplicates")
    parser.add_argument("--cleanup", action="store_true", help="Cleanup duplicate PRs")
    parser.add_argument("--dry-run", action="store_true", help="Dry run for cleanup")
    parser.add_argument("--report", action="store_true", help="Generate report")
    args = parser.parse_args()

    prevention = DuplicatePRPrevention()
    if args.check:
        result = prevention.check_for_duplicate_prs(args.repo, args.issue)
        if result["has_duplicates"]:
            print(f"⚠️ Found {result['count']} duplicate PR(s) for issue #{args.issue}:")
            for dup in result["duplicates"]:
                print(f"  - PR #{dup['number']}: {dup['title']}")
            sys.exit(1)
        else:
            print(f"✅ No duplicate PRs found for issue #{args.issue}")
            sys.exit(0)
    elif args.cleanup:
        result = prevention.cleanup_duplicate_prs(args.repo, args.issue, args.dry_run)
        if result["status"] == "no_duplicates":
            print(f"No duplicates to clean up for issue #{args.issue}")
        else:
            print(f"Cleanup {'(dry run) ' if args.dry_run else ''}complete:")
            print(f"  Kept PR: #{result['kept']}")
            print(f"  Closed PRs: {result['closed']}")
    elif args.report:
        report = prevention.generate_prevention_report(args.repo, args.issue)
        print(report)
    else:
        parser.print_help()


if __name__ == "__main__":
    main()

View File

@@ -0,0 +1,54 @@
{
  "epic_issue": 912,
  "title": "The Zero-Touch Forge: Bare-Metal Fleet Bootstrap in 60 Minutes",
  "checks": [
    {
      "id": "os_bootstrap",
      "label": "OS bootstrap foothold",
      "required_files": ["scripts/provision-runner.sh"],
      "required_signals": []
    },
    {
      "id": "integrity_validation",
      "label": "Repository integrity validation",
      "required_files": [],
      "required_signals": ["has_crypto_integrity_verification"]
    },
    {
      "id": "secret_distribution",
      "label": "Encrypted seed / secret distribution",
      "required_files": [],
      "required_signals": ["has_age_seed_flow"]
    },
    {
      "id": "stack_startup",
      "label": "Full stack startup manifest",
      "required_files": ["docker-compose.yml", "fleet/fleet-routing.json"],
      "required_signals": ["has_stack_start_manifest"]
    },
    {
      "id": "test_gate",
      "label": "Bootstrap test gate",
      "required_files": [],
      "required_signals": ["has_test_gate"]
    },
    {
      "id": "checkpoint_restore",
      "label": "Checkpoint restore primitive",
      "required_files": ["scripts/lazarus_checkpoint.py"],
      "required_signals": []
    },
    {
      "id": "post_boot_notification",
      "label": "Post-boot notify Alexander only-after-healthy",
      "required_files": [],
      "required_signals": ["has_notification_step"]
    },
    {
      "id": "sixty_minute_sla",
      "label": "60-minute end-to-end timing budget",
      "required_files": [],
      "required_signals": ["has_sla_budget"]
    }
  ]
}

View File

@@ -0,0 +1,51 @@
# Zero-Touch Forge Readiness
Epic: #912 — The Zero-Touch Forge: Bare-Metal Fleet Bootstrap in 60 Minutes
## Impossible Goal
Take a raw VPS plus only a git URL and encrypted seed, then bootstrap a full Timmy Foundation fleet in under 60 minutes with no human intervention after trigger.
This document does **not** claim the goal is solved. It grounds the epic in the current repo state.
Current primitive readiness: 2 ready / 6 blocked.
## Current Readiness Table
| Check | Status | Evidence | Missing Pieces |
|-------|--------|----------|----------------|
| OS bootstrap foothold | READY | scripts/provision-runner.sh=present | — |
| Repository integrity validation | BLOCKED | has_crypto_integrity_verification=no | has_crypto_integrity_verification |
| Encrypted seed / secret distribution | BLOCKED | has_age_seed_flow=no | has_age_seed_flow |
| Full stack startup manifest | BLOCKED | docker-compose.yml=present, fleet/fleet-routing.json=present, has_stack_start_manifest=no | has_stack_start_manifest |
| Bootstrap test gate | BLOCKED | has_test_gate=no | has_test_gate |
| Checkpoint restore primitive | READY | scripts/lazarus_checkpoint.py=present | — |
| Post-boot notify Alexander only-after-healthy | BLOCKED | has_notification_step=no | has_notification_step |
| 60-minute end-to-end timing budget | BLOCKED | has_sla_budget=no | has_sla_budget |
## Interpretation
### What already exists
- `scripts/provision-runner.sh` proves we already automate part of bare-metal service bootstrap.
- `scripts/lazarus_checkpoint.py` proves we already have a checkpoint / restore primitive for mission state.
- `docker-compose.yml`, `fleet/fleet-routing.json`, `operations/fleet-topology.md`, and `config/fleet_agents.json` show a real fleet shape, not just a philosophical wish.
### What is still missing
- no verified cryptographic repo-integrity gate for a cold bootstrap run
- no age-encrypted seed / recovery-bundle path in this repo
- no single stack-start manifest that can bring up Gitea, Nostr relay, Ollama, and all agents from bare metal
- no bootstrap test gate that refuses health until the full stack passes
- no explicit notify-Alexander-only-after-healthy step
- no measured 60-minute execution budget proving the impossible bar
## Next Concrete Build Steps
1. Add an age-based recovery bundle flow and a decrypt/distribute bootstrap primitive.
2. Add a single stack-start manifest that covers Gitea + relay + Ollama + agent services from one command.
3. Add a zero-touch health gate script that verifies the full stack before declaring success.
4. Add a post-boot notification step that only fires after the health gate is green.
5. Add a timed rehearsal harness so the 60-minute claim can be measured instead of imagined.
## Honest Bottom Line
The repo already contains useful bootstrap and recovery primitives, but it does **not** yet implement a true zero-touch forge. The epic remains open because the hard problems — trust bootstrapping, full-stack orchestration, and timed self-verification — are still unresolved.

View File

@@ -1,59 +0,0 @@
#!/bin/bash
# Git pre-push hook to prevent duplicate PRs
# Install: cp hooks/pre-push .git/hooks/pre-push && chmod +x .git/hooks/pre-push
set -e

echo "🔍 Checking for duplicate PRs before pushing..."

# Get the current branch name
BRANCH=$(git branch --show-current)

# Extract issue number from branch name
# Patterns: fix/123-xxx, burn/123-xxx, ch/123-xxx, etc.
ISSUE_NUM=$(echo "$BRANCH" | grep -oE '[0-9]+' | head -1)

if [ -z "$ISSUE_NUM" ]; then
    echo "  No issue number found in branch name: $BRANCH"
    echo "  Skipping duplicate check..."
    exit 0
fi

echo "📋 Found issue #$ISSUE_NUM in branch name"

# Get repository name from git remote
REMOTE_URL=$(git config --get remote.origin.url)
if [[ "$REMOTE_URL" == *"Timmy_Foundation/"* ]]; then
    REPO=$(echo "$REMOTE_URL" | sed 's/.*Timmy_Foundation\///' | sed 's/\.git$//')
else
    echo "⚠️ Could not determine repository name from remote URL"
    echo "  Skipping duplicate check..."
    exit 0
fi

echo "📦 Repository: $REPO"

# Run the duplicate checker. Because of `set -e`, test the exit status in the
# `if` itself; a bare command followed by `[ $? -ne 0 ]` would abort the hook
# before the guidance below could print.
if [ -f "bin/duplicate_pr_prevention.py" ]; then
    if ! python3 bin/duplicate_pr_prevention.py --repo "$REPO" --issue "$ISSUE_NUM" --check; then
        echo ""
        echo "❌ PUSH BLOCKED: Duplicate PRs exist for issue #$ISSUE_NUM"
        echo ""
        echo "To resolve:"
        echo "  1. Review existing PRs: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --report"
        echo "  2. Use existing PR instead of creating a new one"
        echo "  3. Or clean up duplicates: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --cleanup"
        echo ""
        echo "To bypass (NOT recommended):"
        echo "  git push --no-verify"
        exit 1
    fi
else
    echo "⚠️ duplicate_pr_prevention.py not found in bin/"
    echo "  Skipping duplicate check..."
fi

echo "✅ No duplicate PRs found. Proceeding with push..."
exit 0

View File

@@ -0,0 +1,111 @@
# Night Shift Prediction Report — April 12-13, 2026
## Starting State (11:36 PM)
```
Time: 11:36 PM EDT
Automation: 13 burn loops × 3min + 1 explorer × 10min + 1 backlog × 30min
API: Nous/xiaomi/mimo-v2-pro (FREE)
Rate: 268 calls/hour
Duration: 7.5 hours until 7 AM
Total expected API calls: ~2,010
```
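The rate is the sum of the loop schedules: 13 burn loops at one call every 3 minutes is 260 calls/hour, the explorer adds 6/hour and the backlog loop 2/hour, giving 268 calls/hour; over the 7.5-hour window that is 268 × 7.5 ≈ 2,010 expected calls.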
## Burn Loops Active (13 @ every 3 min)
| Loop | Repo | Focus |
|------|------|-------|
| Testament Burn | the-nexus | MUD bridge + paper |
| Foundation Burn | all repos | Gitea issues |
| beacon-sprint | the-nexus | paper iterations |
| timmy-home sprint | timmy-home | 226 issues |
| Beacon sprint | the-beacon | game issues |
| timmy-config sprint | timmy-config | config issues |
| the-door burn | the-door | crisis front door |
| the-testament burn | the-testament | book |
| the-nexus burn | the-nexus | 3D world + MUD |
| fleet-ops burn | fleet-ops | sovereign fleet |
| timmy-academy burn | timmy-academy | academy |
| turboquant burn | turboquant | KV-cache compression |
| wolf burn | wolf | model evaluation |
## Expected Outcomes by 7 AM
### API Calls
- Total calls: ~2,010
- Successful completions: ~1,400 (70%)
- API errors (rate limit, timeout): ~400 (20%)
- Iteration limits hit: ~210 (10%)
### Commits
- Total commits pushed: ~800-1,200
- Average per loop: ~60-90 commits
- Unique branches created: ~300-400
### Pull Requests
- Total PRs created: ~150-250
- Average per loop: ~12-19 PRs
### Issues Filed
- New issues created (QA, explorer): ~20-40
- Issues closed by PRs: ~50-100
### Code Written
- Estimated lines added: ~50,000-100,000
- Estimated files created/modified: ~2,000-3,000
### Paper Progress
- Research paper iterations: ~150 cycles
- Expected paper word count growth: ~5,000-10,000 words
- New experiment results: 2-4 additional experiments
- BibTeX citations: 10-20 verified citations
### MUD Bridge
- Bridge file: 2,875 → ~5,000+ lines
- New game systems: 5-10 (combat tested, economy, social graph, leaderboard)
- QA cycles: 15-30 exploration sessions
- Critical bugs found: 3-5
- Critical bugs fixed: 2-3
### Repository Activity (per repo)
| Repo | Expected PRs | Expected Commits |
|------|-------------|-----------------|
| the-nexus | 30-50 | 200-300 |
| the-beacon | 20-30 | 150-200 |
| timmy-config | 15-25 | 100-150 |
| the-testament | 10-20 | 80-120 |
| the-door | 5-10 | 40-60 |
| timmy-home | 10-20 | 80-120 |
| fleet-ops | 5-10 | 40-60 |
| timmy-academy | 5-10 | 40-60 |
| turboquant | 3-5 | 20-30 |
| wolf | 3-5 | 20-30 |
### Dream Cycle
- 5 dreams generated (11:30 PM, 1 AM, 2:30 AM, 4 AM, 5:30 AM)
- 1 reflection (10 PM)
- 1 timmy-dreams (5:30 AM)
- Total dream output: ~5,000-8,000 words of creative writing
### Explorer (every 10 min)
- ~45 exploration cycles
- Bugs found: 15-25
- Issues filed: 15-25
### Risk Factors
- API rate limiting: Possible after 500+ consecutive calls
- Large file patch failures: Bridge file too large for agents
- Branch conflicts: Multiple agents on same repo
- Iteration limits: agents capped at 5 iterations can finish a cycle without ever pushing
- Repository cloning: May hit timeout on slow clones
### Confidence Level
- High confidence: 800+ commits, 150+ PRs
- Medium confidence: 1,000+ commits, 200+ PRs
- Low confidence: 1,200+ commits, 250+ PRs (requires all loops running clean)
---
*This report is a prediction. The 7 AM morning report will compare actual results.*
*Generated: 2026-04-12 23:36 EDT*
*Author: Timmy (pre-shift prediction)*

View File

@@ -0,0 +1,187 @@
#!/usr/bin/env python3
"""Zero-Touch Forge readiness grounding for epic #912.

This does not pretend the impossible goal is solved.
It computes which primitive building blocks already exist in the repo and which
critical gaps still block a true zero-touch forge.
"""
from __future__ import annotations

import argparse
import json
from pathlib import Path
from typing import Any

REPO_ROOT = Path(__file__).resolve().parent.parent
SPEC_PATH = REPO_ROOT / "config" / "zero_touch_forge.json"


def load_spec(path: Path | None = None) -> dict[str, Any]:
    target = path or SPEC_PATH
    return json.loads(target.read_text())


def _file_exists_map(repo_root: Path, paths: list[str]) -> dict[str, bool]:
    return {path: (repo_root / path).exists() for path in paths}


def _agent_count(repo_root: Path) -> int:
    config_path = repo_root / "config" / "fleet_agents.json"
    if not config_path.exists():
        return 0
    try:
        payload = json.loads(config_path.read_text())
        return len(payload.get("agents") or [])
    except Exception:
        return 0


def derive_signal_flags(repo_root: Path | None = None) -> dict[str, bool]:
    root = repo_root or REPO_ROOT
    agent_count = _agent_count(root)
    return {
        "has_age_seed_flow": False,
        "has_crypto_integrity_verification": False,
        "has_stack_start_manifest": agent_count >= 5,
        "has_test_gate": False,
        "has_notification_step": False,
        "has_sla_budget": False,
    }


def _evidence_line(check: dict[str, Any], file_exists: dict[str, bool], signal_flags: dict[str, bool]) -> str:
    parts = []
    for path in check.get("required_files", []):
        parts.append(f"{path}={'present' if file_exists.get(path) else 'missing'}")
    for key in check.get("required_signals", []):
        parts.append(f"{key}={'yes' if signal_flags.get(key) else 'no'}")
    return ", ".join(parts) if parts else "no explicit evidence"


def evaluate_readiness(
    spec: dict[str, Any],
    *,
    file_exists: dict[str, bool] | None = None,
    signal_flags: dict[str, bool] | None = None,
) -> dict[str, Any]:
    all_paths = []
    for check in spec["checks"]:
        all_paths.extend(check.get("required_files", []))
    file_exists = file_exists or _file_exists_map(REPO_ROOT, sorted(set(all_paths)))
    signal_flags = signal_flags or derive_signal_flags(REPO_ROOT)

    ready_checks = []
    blocked_checks = []
    checks = []
    for check in spec["checks"]:
        missing_files = [path for path in check.get("required_files", []) if not file_exists.get(path, False)]
        missing_signals = [key for key in check.get("required_signals", []) if not signal_flags.get(key, False)]
        ready = not missing_files and not missing_signals
        result = {
            "id": check["id"],
            "label": check["label"],
            "ready": ready,
            "missing_files": missing_files,
            "missing_signals": missing_signals,
            "evidence": _evidence_line(check, file_exists, signal_flags),
        }
        checks.append(result)
        if ready:
            ready_checks.append(result)
        else:
            blocked_checks.append(result)
    return {
        "epic_issue": spec["epic_issue"],
        "title": spec["title"],
        "ready_count": len(ready_checks),
        "blocked_count": len(blocked_checks),
        "ready_checks": ready_checks,
        "blocked_checks": blocked_checks,
        "checks": checks,
        "signals": signal_flags,
        "files": file_exists,
    }


def render_markdown(report: dict[str, Any]) -> str:
    lines = [
        "# Zero-Touch Forge Readiness",
        "",
        f"Epic: #{report['epic_issue']} — {report['title']}",
        "",
        "## Impossible Goal",
        "",
        "Take a raw VPS plus only a git URL and encrypted seed, then bootstrap a full Timmy Foundation fleet in under 60 minutes with no human intervention after trigger.",
        "",
        "This document does **not** claim the goal is solved. It grounds the epic in the current repo state.",
        "",
        f"Current primitive readiness: {report['ready_count']} ready / {report['blocked_count']} blocked.",
        "",
        "## Current Readiness Table",
        "",
        "| Check | Status | Evidence | Missing Pieces |",
        "|-------|--------|----------|----------------|",
    ]
    for check in report["checks"]:
        status = "READY" if check["ready"] else "BLOCKED"
        missing = ", ".join(check["missing_files"] + check["missing_signals"]) or "—"
        lines.append(f"| {check['label']} | {status} | {check['evidence']} | {missing} |")
    lines.extend([
        "",
        "## Interpretation",
        "",
        "### What already exists",
        "- `scripts/provision-runner.sh` proves we already automate part of bare-metal service bootstrap.",
        "- `scripts/lazarus_checkpoint.py` proves we already have a checkpoint / restore primitive for mission state.",
        "- `docker-compose.yml`, `fleet/fleet-routing.json`, `operations/fleet-topology.md`, and `config/fleet_agents.json` show a real fleet shape, not just a philosophical wish.",
        "",
        "### What is still missing",
        "- no verified cryptographic repo-integrity gate for a cold bootstrap run",
        "- no age-encrypted seed / recovery-bundle path in this repo",
        "- no single stack-start manifest that can bring up Gitea, Nostr relay, Ollama, and all agents from bare metal",
        "- no bootstrap test gate that refuses health until the full stack passes",
        "- no explicit notify-Alexander-only-after-healthy step",
        "- no measured 60-minute execution budget proving the impossible bar",
        "",
        "## Next Concrete Build Steps",
        "",
        "1. Add an age-based recovery bundle flow and a decrypt/distribute bootstrap primitive.",
        "2. Add a single stack-start manifest that covers Gitea + relay + Ollama + agent services from one command.",
        "3. Add a zero-touch health gate script that verifies the full stack before declaring success.",
        "4. Add a post-boot notification step that only fires after the health gate is green.",
        "5. Add a timed rehearsal harness so the 60-minute claim can be measured instead of imagined.",
        "",
        "## Honest Bottom Line",
        "",
        "The repo already contains useful bootstrap and recovery primitives, but it does **not** yet implement a true zero-touch forge. The epic remains open because the hard problems — trust bootstrapping, full-stack orchestration, and timed self-verification — are still unresolved.",
        "",
    ])
    return "\n".join(lines)


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Evaluate repo readiness for the Zero-Touch Forge epic.")
    parser.add_argument("--json", action="store_true", help="Emit JSON instead of markdown")
    parser.add_argument("--out", type=Path, help="Optional output file")
    return parser.parse_args()


def main() -> None:
    args = parse_args()
    spec = load_spec()
    report = evaluate_readiness(spec)
    output = json.dumps(report, indent=2) if args.json else render_markdown(report)
    if args.out:
        args.out.parent.mkdir(parents=True, exist_ok=True)
        args.out.write_text(output)
    else:
        print(output)


if __name__ == "__main__":
    main()

View File

@@ -0,0 +1,25 @@
from pathlib import Path

REPORT = Path("reports/night-shift-prediction-2026-04-12.md")


def test_prediction_report_exists_with_required_sections():
    assert REPORT.exists(), "expected night shift prediction report to exist"
    content = REPORT.read_text()
    assert "# Night Shift Prediction Report — April 12-13, 2026" in content
    assert "## Starting State (11:36 PM)" in content
    assert "## Burn Loops Active (13 @ every 3 min)" in content
    assert "## Expected Outcomes by 7 AM" in content
    assert "### Risk Factors" in content
    assert "### Confidence Level" in content
    assert "This report is a prediction" in content


def test_prediction_report_preserves_core_forecast_numbers():
    content = REPORT.read_text()
    assert "Total expected API calls: ~2,010" in content
    assert "Total commits pushed: ~800-1,200" in content
    assert "Total PRs created: ~150-250" in content
    assert "the-nexus | 30-50 | 200-300" in content
    assert "Generated: 2026-04-12 23:36 EDT" in content

View File

@@ -0,0 +1,67 @@
from pathlib import Path
import sys

sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

from scripts.zero_touch_forge_readiness import evaluate_readiness, load_spec

DOC = Path("docs/zero-touch-forge-readiness.md")


def test_load_spec_contains_all_impossible_bar_checks():
    spec = load_spec()
    check_ids = [item["id"] for item in spec["checks"]]
    assert check_ids == [
        "os_bootstrap",
        "integrity_validation",
        "secret_distribution",
        "stack_startup",
        "test_gate",
        "checkpoint_restore",
        "post_boot_notification",
        "sixty_minute_sla",
    ]


def test_evaluate_readiness_marks_missing_components_as_blockers():
    spec = load_spec()
    result = evaluate_readiness(
        spec,
        file_exists={
            "scripts/provision-runner.sh": True,
            "scripts/lazarus_checkpoint.py": True,
            "operations/fleet-topology.md": True,
            "docker-compose.yml": False,
            "fleet/fleet-routing.json": False,
            "tests/test_bootstrap_contract.py": False,
        },
        signal_flags={
            "has_age_seed_flow": False,
            "has_crypto_integrity_verification": False,
            "has_stack_start_manifest": False,
            "has_test_gate": False,
            "has_notification_step": False,
            "has_sla_budget": False,
        },
    )
    assert result["ready_count"] == 2
    blocked = {item["id"] for item in result["blocked_checks"]}
    assert blocked == {
        "integrity_validation",
        "secret_distribution",
        "stack_startup",
        "test_gate",
        "post_boot_notification",
        "sixty_minute_sla",
    }


def test_document_exists_with_required_sections():
    assert DOC.exists(), "expected zero-touch forge readiness doc to exist"
    content = DOC.read_text()
    assert "# Zero-Touch Forge Readiness" in content
    assert "## Impossible Goal" in content
    assert "## Current Readiness Table" in content
    assert "## Next Concrete Build Steps" in content