Compare commits

..

1 Commits

Author SHA1 Message Date
Alexander Whitestone
c6a34db91d docs: capture forge-wide QA pass (#1333)
Some checks failed
CI / test (pull_request) Failing after 1m1s
Review Approval Gate / verify-review (pull_request) Successful in 11s
CI / validate (pull_request) Failing after 2m1s
2026-04-15 01:22:36 -04:00
5 changed files with 249 additions and 231 deletions

View File

@@ -1,72 +0,0 @@
# Duplicate PR Prevention
## Problem
The burn loop creates duplicate PRs for the same issue because it doesn't check for existing PRs before creating new ones.
## Solution
Two scripts:
### 1. Preflight Check (`scripts/preflight-pr-check.sh`)
Run BEFORE creating a PR:
```bash
./scripts/preflight-pr-check.sh 1128
```
Output if PRs exist:
```
🚫 BLOCKED: 2 existing PR(s) for issue #1128
Existing PRs:
#1458: feat: Close duplicate PRs for issue #1128
Branch: dawn/1128-1776130053
URL: https://...
Options:
1. Review and merge an existing PR
2. Close duplicates and proceed
3. Use --force to bypass (NOT RECOMMENDED)
```
Exit code 1 = blocked. Exit code 0 = safe to proceed.
### 2. Cleanup Script (`scripts/cleanup-duplicate-prs.sh`)
Close duplicate PRs:
```bash
# Dry run (show what would be closed)
./scripts/cleanup-duplicate-prs.sh 1128
# Actually close duplicates (keeps oldest)
./scripts/cleanup-duplicate-prs.sh 1128 --close
```
## Integration
### In burn loop
Add preflight check before PR creation:
```bash
# Before: git push && curl ... /pulls
./scripts/preflight-pr-check.sh $ISSUE_NUM || exit 1
```
### In CI
Add as a GitHub/Gitea Actions check:
```yaml
- name: Check for duplicate PRs
run: ./scripts/preflight-pr-check.sh ${{ github.event.issue.number }}
```
## Environment Variables
- `GITEA_TOKEN` — API token (default: reads from `~/.config/gitea/token`)
- `GITEA_URL` — Forge URL (default: `https://forge.alexanderwhitestone.com`)
- `GITEA_REPO` — Repository (default: `Timmy_Foundation/the-nexus`)

View File

@@ -0,0 +1,74 @@
# Forge-Wide QA Pass — 2026-04-12 Evening
**Reviewer:** Perplexity
**Scope:** All 6 Timmy Foundation repos
**Source:** Issue #1333 on `Timmy_Foundation/the-nexus`
**Reference:** `perplexity-status-report-2026-04-12-evening`
---
## Summary
| Repo | Open PRs | Reviewed | Approved | Changes Requested | Closed |
|------|----------|----------|----------|-------------------|--------|
| the-nexus | 33 → 32 | 10 | 8 | 1 | 1 |
| timmy-config | 5 | 5 | 3 | 2 | 0 |
| timmy-home | 2 | 2 | 0 | 2 | 0 |
| fleet-ops | 0 | — | — | — | — |
| hermes-agent | 0 | — | — | — | — |
| the-beacon | 0 | 1 post-merge flag | — | — | — |
**Total: 40 open PRs across the org. 17 reviewed this pass.**
---
## Critical Findings
### 1. the-nexus swarm pileup (again)
33 open PRs, 31 from Rockachopa via mimo-v2-pro swarm. These are NOT empty/stale like the April 11 event — they contain real diffs. However:
- **Triple duplicate:** PRs #1319, #1322, #1328 all delete `CONTRIBUTORING.md`. Closed #1322 as duplicate.
- **4 sibling `app.js` PRs** (#1285, #1307, #1330, #1331) branch from the same commit. Merge sequentially or they'll conflict.
- **Queue throttle not deployed yet:** PR #1327 adds `MAX_QUEUE_DEPTH=10` to the dispatcher — the fix for this exact problem. **Merge #1327 first and restart the dispatcher.**
### 2. CAPTCHA bypass tool in timmy-config #499
The multimodal toolsuite PR includes a `captcha_solver.py`. This needs explicit human sign-off — it's a policy decision, not a code decision. Requested changes.
### 3. the-beacon Gemini bloat
PR #76 (merged) added +3,258 lines for two small fixes. Gemini likely rewrote large portions of `game.js`. Also: `game/npc-logic.js` and `scripts/guardrails.js` may be dead code (runtime lives in `js/`). Flagged for audit.
### 4. Paper PRs need polish (timmy-home)
Both papers (#596 Poka-Yoke, #597 Sovereign Fleet) are real work but have specific bugs:
- #596: path injection security bug + broken citation
- #597: real IPs in public-facing tables + wrong LaTeX style
---
## Recommended Merge Order (the-nexus)
1. **#1327** — Queue throttle (stops the pileup)
2. **#1319** — .gitea.yml cleanup
3. **#1326** — Multi-user bridge (Timmy, strong)
4. **#1330** — GOFAI facts (Timmy, clean)
5. **#1285** — Performance gating
6. **#1329** — Watchdog fix
7. **#1331** — Health HUD
8. **#1328** — Portfolio CTA (rebase after #1319)
9. Remaining 23 Rockachopa PRs need individual review
## What's Working Well
- **Timmy's PR quality is excellent.** Both #1330 and #1326 are targeted, complete, well-structured.
- **hermes-agent is clean.** PR #300 (malformed JSON repair, +1 line, saves ~1,400 inference turns) is the best ROI change in the org.
- **fleet-ops GOFIA series** (#80, #81, #82) landed with strong test coverage.
- **the-beacon** shipped 9 PRs with real game features (emotional arcs, procedural sound, golden ratio economics).
## What Needs Attention
- Deploy the queue throttle (#1327) before the next swarm cycle
- Audit `the-beacon/game.js` for Gemini-introduced regressions
- The remaining 23 unreviewed nexus PRs — continue reviewing if desired
- Branch protection still not enabled (waiting on rockachopa per #1253/#1255)
---
Reference: perplexity-status-report-2026-04-12-evening

View File

@@ -1,101 +1,170 @@
#!/usr/bin/env bash
# cleanup-duplicate-prs.sh — Close duplicate PRs for a given issue
# ═══════════════════════════════════════════════════════════════
# cleanup-duplicate-prs.sh — Identify and close duplicate open PRs
#
# This script identifies PRs that are duplicates (same issue number
# or very similar titles) and closes the older ones.
#
# Usage:
# ./scripts/cleanup-duplicate-prs.sh <issue_number> [--close]
# ./scripts/cleanup-duplicate-prs.sh [--dry-run] [--close]
#
# Without --close: dry run (show what would be closed)
# With --close: actually close the duplicates
# Options:
# --dry-run Show what would be done without making changes
# --close Actually close duplicate PRs (default is dry-run)
#
# Designed for issue #1128: Forge Cleanup
# ═══════════════════════════════════════════════════════════════
set -euo pipefail
ISSUE_NUM="${1:?Usage: cleanup-duplicate-prs.sh <issue_number> [--close]}"
CLOSE_MODE="${2:-}"
# ─── Configuration ──────────────────────────────────────────
GITEA_URL="${GITEA_URL:-https://forge.alexanderwhitestone.com}"
GITEA_TOKEN="${GITEA_TOKEN:-$(cat ~/.config/gitea/token 2>/dev/null || echo '')}"
REPO="${GITEA_REPO:-Timmy_Foundation/the-nexus}"
GITEA_TOKEN="${GITEA_TOKEN:?Set GITEA_TOKEN env var}"
REPO="${REPO:-Timmy_Foundation/the-nexus}"
DRY_RUN="${DRY_RUN:-true}"
if [ -z "$GITEA_TOKEN" ]; then
echo "ERROR: GITEA_TOKEN not set"
exit 1
fi
# Parse command line arguments
for arg in "$@"; do
case $arg in
--dry-run)
DRY_RUN="true"
;;
--close)
DRY_RUN="false"
;;
esac
done
REPO_API="${GITEA_URL}/api/v1/repos/${REPO}"
API="$GITEA_URL/api/v1"
AUTH="token $GITEA_TOKEN"
# Fetch open PRs
PRS=$(curl -sf -H "Authorization: token ${GITEA_TOKEN}" "${REPO_API}/pulls?state=open&limit=50" 2>/dev/null || echo '[]')
log() { echo "[$(date -u +%Y-%m-%dT%H:%M:%SZ)] $*"; }
# Find matching PRs
MATCHES=$(echo "$PRS" | python3 -c "
import json, sys
prs = json.load(sys.stdin)
issue = '${ISSUE_NUM}'
matches = []
for pr in prs:
title = pr.get('title', '')
body = pr.get('body', '')
ref = pr.get('head', {}).get('ref', '')
if f'#{issue}' in title or f'#{issue}' in body or issue in ref:
matches.append(pr)
json.dump(matches, sys.stdout)
" 2>/dev/null || echo '[]')
# ─── Fetch open PRs ────────────────────────────────────────
log "Checking open PRs for $REPO (dry_run: $DRY_RUN)"
COUNT=$(echo "$MATCHES" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo '0')
OPEN_PRS=$(curl -s -H "$AUTH" "$API/repos/$REPO/pulls?state=open&limit=50")
if [ "$COUNT" -eq 0 ]; then
echo "No PRs found for issue #$ISSUE_NUM"
if [ -z "$OPEN_PRS" ] || [ "$OPEN_PRS" = "null" ]; then
log "No open PRs found or API error"
exit 0
fi
echo "Found $COUNT PR(s) for issue #$ISSUE_NUM:"
echo "$MATCHES" | python3 -c "
import json, sys
prs = json.load(sys.stdin)
for pr in prs:
print(f" #{pr['number']}: {pr['title']} [{pr['head']['ref']}]")
"
# Count PRs
PR_COUNT=$(echo "$OPEN_PRS" | jq length)
log "Found $PR_COUNT open PRs"
if [ "$COUNT" -le 1 ]; then
echo ""
echo "Only 1 PR found. No cleanup needed."
if [ "$PR_COUNT" -eq 0 ]; then
log "No open PRs to process"
exit 0
fi
# Keep the oldest PR, close the rest
echo "$MATCHES" | python3 -c "
import json, sys
prs = json.load(sys.stdin)
prs.sort(key=lambda p: p['number'])
keep = prs[0]
close = prs[1:]
print(f'KEEP: #{keep["number"]}: {keep["title"]}')
for pr in close:
print(f'CLOSE: #{pr["number"]}: {pr["title"]}')
"
# ─── Extract issue numbers from PR titles ──────────────────
# Create a temporary file for PR data
TEMP_FILE=$(mktemp)
echo "$OPEN_PRS" | jq -r '.[] | "\(.number)\t\(.title)\t\(.created_at)\t\(.head.ref)"' > "$TEMP_FILE"
if [ "$CLOSE_MODE" != "--close" ]; then ""
echo ""
echo "DRY RUN: Add --close to actually close duplicates"
exit 0
# Group PRs by issue number using temporary files
TEMP_DIR=$(mktemp -d)
trap "rm -rf $TEMP_DIR" EXIT
while IFS=$'\t' read -r pr_number pr_title pr_created pr_branch; do
# Extract issue number from title (look for #123 pattern)
if [[ $pr_title =~ \#([0-9]+) ]]; then
issue_num="${BASH_REMATCH[1]}"
echo "$pr_number,$pr_created,$pr_branch" >> "$TEMP_DIR/issue_$issue_num.txt"
fi
done < "$TEMP_FILE"
rm -f "$TEMP_FILE"
# ─── Identify and process duplicates ──────────────────────
DUPLICATES_FOUND=0
CLOSED_COUNT=0
for issue_file in "$TEMP_DIR"/issue_*.txt; do
[ -f "$issue_file" ] || continue
issue_num=$(basename "$issue_file" .txt | sed 's/issue_//')
pr_list=$(cat "$issue_file")
# Count PRs for this issue
pr_count=$(echo -n "$pr_list" | grep -c '^' || true)
if [ "$pr_count" -le 1 ]; then
continue # No duplicates
fi
log "Issue #$issue_num has $pr_count open PRs"
DUPLICATES_FOUND=$((DUPLICATES_FOUND + 1))
# Sort by creation date (oldest first)
sorted_prs=$(echo -n "$pr_list" | sort -t',' -k2)
# Keep the newest PR, close the rest
newest_pr=""
newest_date=""
while IFS=',' read -r pr_num pr_date pr_branch; do
if [ -z "$newest_date" ] || [[ "$pr_date" > "$newest_date" ]]; then
newest_pr="$pr_num"
newest_date="$pr_date"
fi
done <<< "$sorted_prs"
log "Keeping PR #$newest_pr (newest)"
# Close older PRs
while IFS=',' read -r pr_num pr_date pr_branch; do
if [ "$pr_num" = "$newest_pr" ]; then
continue # Skip the newest PR
fi
log "Closing duplicate PR #$pr_num for issue #$issue_num"
if [ "$DRY_RUN" = "true" ]; then
log "DRY RUN: Would close PR #$pr_num"
else
# Add a comment explaining why we're closing
comment_body="Closing as duplicate. PR #$newest_pr is newer and addresses the same issue (#$issue_num)."
curl -s -X POST -H "$AUTH" -H "Content-Type: application/json" -d "{\"body\": \"$comment_body\"}" "$API/repos/$REPO/issues/$pr_num/comments" > /dev/null
# Close the PR
curl -s -X PATCH -H "$AUTH" -H "Content-Type: application/json" -d '{"state": "closed"}' "$API/repos/$REPO/pulls/$pr_num" > /dev/null
log "Closed PR #$pr_num"
CLOSED_COUNT=$((CLOSED_COUNT + 1))
fi
done <<< "$sorted_prs"
done
# ─── Summary ──────────────────────────────────────────────
log "Cleanup complete:"
log " Duplicate issue groups found: $DUPLICATES_FOUND"
log " PRs closed: $CLOSED_COUNT"
log " Dry run: $DRY_RUN"
if [ "$DUPLICATES_FOUND" -eq 0 ]; then
log "No duplicate PRs found"
fi
# Close duplicates
echo "$MATCHES" | python3 -c "
import json, sys, urllib.request, os
prs = json.load(sys.stdin)
prs.sort(key=lambda p: p['number'])
token = '${GITEA_TOKEN}'
api = '${REPO_API}'
for pr in prs[1:]:
url = f'{api}/pulls/{pr["number"]}'
data = json.dumps({'state': 'closed'}).encode()
req = urllib.request.Request(url, data=data, headers={'Authorization': f'token {token}', 'Content-Type': 'application/json'}, method='PATCH')
try:
urllib.request.urlopen(req)
print(f'Closed PR #{pr["number"]}')
except Exception as e:
print(f'Error closing #{pr["number"]}: {e}')
"
# ─── Additional cleanup: Stale PRs ────────────────────────
# Check for PRs older than 30 days with no activity
log "Checking for stale PRs (older than 30 days)..."
echo ""
echo "Cleanup complete."
THIRTY_DAYS_AGO=$(date -u -v-30d +%Y-%m-%dT%H:%M:%SZ 2>/dev/null || date -u -d "30 days ago" +%Y-%m-%dT%H:%M:%SZ)
STALE_PRS=$(echo "$OPEN_PRS" | jq -r --arg cutoff "$THIRTY_DAYS_AGO" '.[] | select(.created_at < $cutoff) | "\(.number)\t\(.title)\t\(.created_at)"')
if [ -n "$STALE_PRS" ]; then
STALE_COUNT=$(echo -n "$STALE_PRS" | grep -c '^' || true)
log "Found $STALE_COUNT stale PRs (older than 30 days)"
echo "$STALE_PRS" | while IFS=$'\t' read -r pr_num pr_title pr_created; do
log "Stale PR #$pr_num: $pr_title (created: $pr_created)"
done
else
log "No stale PRs found"
fi
log "Script complete"

View File

@@ -1,82 +0,0 @@
#!/usr/bin/env bash
# preflight-pr-check.sh — Prevent duplicate PRs before creating them
#
# Usage:
# ./scripts/preflight-pr-check.sh <issue_number>
#
# Exit codes:
# 0 = safe to proceed (no existing PRs)
# 1 = BLOCKED (existing PRs found)
# 2 = error
set -euo pipefail
ISSUE_NUM="${1:?Usage: preflight-pr-check.sh <issue_number>}"
GITEA_URL="${GITEA_URL:-https://forge.alexanderwhitestone.com}"
GITEA_TOKEN="${GITEA_TOKEN:-$(cat ~/.config/gitea/token 2>/dev/null || echo '')}"
REPO="${GITEA_REPO:-Timmy_Foundation/the-nexus}"
if [ -z "$GITEA_TOKEN" ]; then
echo "ERROR: GITEA_TOKEN not set and ~/.config/gitea/token not found"
exit 2
fi
# Get repo info
REPO_API="${GITEA_URL}/api/v1/repos/${REPO}"
# Fetch open PRs
PRS=$(curl -sf -H "Authorization: token ${GITEA_TOKEN}" "${REPO_API}/pulls?state=open&limit=50" 2>/dev/null || echo '[]')
# Check for existing PRs referencing this issue
MATCHING_PRS=$(echo "$PRS" | python3 -c "
import json, sys
prs = json.load(sys.stdin)
issue = '${ISSUE_NUM}'
matches = []
for pr in prs:
title = pr.get('title', '')
body = pr.get('body', '')
ref = pr.get('head', {}).get('ref', '')
if f'#{issue}' in title or f'#{issue}' in body or issue in ref:
matches.append({
'number': pr['number'],
'title': title,
'branch': ref,
'url': pr.get('html_url', '')
})
json.dump(matches, sys.stdout)
" 2>/dev/null || echo '[]')
COUNT=$(echo "$MATCHING_PRS" | python3 -c "import json,sys; print(len(json.load(sys.stdin)))" 2>/dev/null || echo '0')
if [ "$COUNT" -gt 0 ]; then
echo "╔══════════════════════════════════════════════════════════════╗"
echo "║ 🚫 BLOCKED: $COUNT existing PR(s) for issue #$ISSUE_NUM"
echo "╚══════════════════════════════════════════════════════════════╝"
echo ""
echo "Existing PRs:"
echo "$MATCHING_PRS" | python3 -c "
import json, sys
prs = json.load(sys.stdin)
for pr in prs:
print(f" #{pr['number']}: {pr['title']}")
print(f" Branch: {pr['branch']}")
print(f" URL: {pr['url']}")
print()
"
echo "Options:"
echo " 1. Review and merge an existing PR"
echo " 2. Close duplicates and proceed"
echo " 3. Use --force to bypass (NOT RECOMMENDED)"
echo ""
if [ "${2:-}" = "--force" ]; then
echo "⚠️ --force flag detected. Bypassing duplicate check."
exit 0
fi
exit 1
else
echo "✅ Safe to proceed: No existing PRs for issue #$ISSUE_NUM"
exit 0
fi

View File

@@ -0,0 +1,29 @@
from pathlib import Path
ROOT = Path(__file__).resolve().parent.parent
REPORT_PATH = ROOT / "reviews" / "2026-04-12-forge-wide-qa-pass.md"
def test_forge_wide_qa_pass_report_exists():
assert REPORT_PATH.exists(), "missing forge-wide QA pass report artifact"
def test_forge_wide_qa_pass_report_preserves_key_findings():
text = REPORT_PATH.read_text(encoding="utf-8")
required = [
"# Forge-Wide QA Pass — 2026-04-12 Evening",
"**Reviewer:** Perplexity",
"**Scope:** All 6 Timmy Foundation repos",
"## Summary",
"the-nexus swarm pileup (again)",
"Queue throttle not deployed yet",
"CAPTCHA bypass tool in timmy-config #499",
"the-beacon Gemini bloat",
"Paper PRs need polish (timmy-home)",
"## Recommended Merge Order (the-nexus)",
"#1327",
"Reference: perplexity-status-report-2026-04-12-evening",
]
for snippet in required:
assert snippet in text, f"missing report detail: {snippet}"