Compare commits
1 Commits
fix/1117
...
burn/1460-
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
34e004e842 |
@@ -6,4 +6,3 @@ rules:
|
||||
require_ci_to_merge: false # CI runner dead (issue #915)
|
||||
block_force_pushes: true
|
||||
block_deletions: true
|
||||
block_on_outdated_branch: true
|
||||
|
||||
72
.gitea/workflows/duplicate-pr-check.yml
Normal file
72
.gitea/workflows/duplicate-pr-check.yml
Normal file
@@ -0,0 +1,72 @@
|
||||
# .gitea/workflows/duplicate-pr-check.yml
|
||||
# CI workflow to check for duplicate PRs
|
||||
|
||||
name: Check for Duplicate PRs
|
||||
|
||||
on:
|
||||
pull_request:
|
||||
types: [opened, synchronize, reopened]
|
||||
|
||||
jobs:
|
||||
check-duplicates:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v3
|
||||
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.11'
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
python -m pip install --upgrade pip
|
||||
# No additional dependencies needed
|
||||
|
||||
- name: Check for duplicate PRs
|
||||
env:
|
||||
GITEA_TOKEN: ${{ secrets.GITEA_TOKEN }}
|
||||
run: |
|
||||
# Extract issue number from PR title or branch name
|
||||
PR_TITLE="${{ github.event.pull_request.title }}"
|
||||
BRANCH_NAME="${{ github.head_ref }}"
|
||||
|
||||
# Try to extract issue number from title or branch
|
||||
ISSUE_NUM=$(echo "$PR_TITLE" | grep -oE '#[0-9]+' | head -1 | tr -d '#')
|
||||
|
||||
if [ -z "$ISSUE_NUM" ]; then
|
||||
ISSUE_NUM=$(echo "$BRANCH_NAME" | grep -oE '[0-9]+' | head -1)
|
||||
fi
|
||||
|
||||
if [ -z "$ISSUE_NUM" ]; then
|
||||
echo "No issue number found in PR title or branch name"
|
||||
echo "Skipping duplicate check"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
echo "Checking for duplicate PRs for issue #$ISSUE_NUM"
|
||||
|
||||
# Save token to file for the script
|
||||
echo "$GITEA_TOKEN" > /tmp/gitea_token.txt
|
||||
export TOKEN_PATH=/tmp/gitea_token.txt
|
||||
|
||||
# Run the duplicate checker
|
||||
python bin/duplicate_pr_prevention.py --repo the-nexus --issue "$ISSUE_NUM" --check
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
echo ""
|
||||
echo "❌ Duplicate PRs detected for issue #$ISSUE_NUM"
|
||||
echo "This PR should be closed in favor of an existing one."
|
||||
echo ""
|
||||
echo "To see details, run:"
|
||||
echo " python bin/duplicate_pr_prevention.py --repo the-nexus --issue $ISSUE_NUM --report"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "✅ No duplicate PRs found"
|
||||
|
||||
- name: Clean up
|
||||
if: always()
|
||||
run: |
|
||||
rm -f /tmp/gitea_token.txt
|
||||
1
.github/BRANCH_PROTECTION.md
vendored
1
.github/BRANCH_PROTECTION.md
vendored
@@ -12,7 +12,6 @@ All repositories must enforce these rules on the `main` branch:
|
||||
| Require CI to pass | ⚠ Conditional | Only where CI exists |
|
||||
| Block force push | ✅ Enabled | Protect commit history |
|
||||
| Block branch deletion | ✅ Enabled | Prevent accidental deletion |
|
||||
| Require branch up-to-date before merge | ✅ Enabled | Surface conflicts before merge and force contributors to rebase |
|
||||
|
||||
## Default Reviewer Assignments
|
||||
|
||||
|
||||
241
DUPLICATE_PR_PREVENTION.md
Normal file
241
DUPLICATE_PR_PREVENTION.md
Normal file
@@ -0,0 +1,241 @@
|
||||
# Duplicate PR Prevention System
|
||||
|
||||
**Issue:** #1460 - [META] I keep creating duplicate PRs for issue #1128
|
||||
**Solution:** Comprehensive prevention system with tools, hooks, and CI checks
|
||||
|
||||
## Problem Statement
|
||||
|
||||
Issue #1460 describes a meta-problem: creating 7 duplicate PRs for issue #1128, which was itself about cleaning up duplicate PRs. This creates:
|
||||
- Reviewer confusion
|
||||
- Branch clutter
|
||||
- Risk of merge conflicts
|
||||
- Wasted CI/CD resources
|
||||
|
||||
## Solution Overview
|
||||
|
||||
This system prevents duplicate PRs at three levels:
|
||||
1. **Local Prevention** — Git hooks that check before pushing
|
||||
2. **CI/CD Prevention** — Workflows that check when PRs are created
|
||||
3. **Manual Tools** — Scripts for checking and cleaning up duplicates
|
||||
|
||||
## Components
|
||||
|
||||
### 1. `bin/duplicate_pr_prevention.py`
|
||||
Main prevention script with three modes:
|
||||
|
||||
**Check for duplicates:**
|
||||
```bash
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
|
||||
```
|
||||
|
||||
**Clean up duplicates:**
|
||||
```bash
|
||||
# Dry run (see what would be closed)
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup --dry-run
|
||||
|
||||
# Actually close duplicates
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
|
||||
```
|
||||
|
||||
**Generate report:**
|
||||
```bash
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
|
||||
```
|
||||
|
||||
### 2. `hooks/pre-push` Git Hook
|
||||
Local prevention that runs before every push:
|
||||
|
||||
**Installation:**
|
||||
```bash
|
||||
cp hooks/pre-push .git/hooks/pre-push
|
||||
chmod +x .git/hooks/pre-push
|
||||
```
|
||||
|
||||
**How it works:**
|
||||
1. Extracts issue number from branch name (e.g., `fix/1128-something` → `1128`)
|
||||
2. Checks for existing PRs for that issue
|
||||
3. Blocks push if duplicates found
|
||||
4. Provides instructions for resolution
|
||||
|
||||
### 3. `.gitea/workflows/duplicate-pr-check.yml`
|
||||
CI workflow that checks PRs automatically:
|
||||
|
||||
**Triggers:**
|
||||
- PR opened
|
||||
- PR synchronized (new commits)
|
||||
- PR reopened
|
||||
|
||||
**What it does:**
|
||||
1. Extracts issue number from PR title or branch name
|
||||
2. Checks for existing PRs
|
||||
3. Fails CI if duplicates found
|
||||
4. Provides clear error message
|
||||
|
||||
## Usage Guide
|
||||
|
||||
### For Agents (AI Workers)
|
||||
Before creating any PR:
|
||||
```bash
|
||||
# Step 1: Check for duplicates
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check
|
||||
|
||||
# Step 2: If safe (exit 0), create PR
|
||||
# Step 3: If duplicates exist (exit 1), use existing PR instead
|
||||
```
|
||||
|
||||
### For Developers
|
||||
Install the Git hook for automatic prevention:
|
||||
```bash
|
||||
# One-time setup
|
||||
cp hooks/pre-push .git/hooks/pre-push
|
||||
chmod +x .git/hooks/pre-push
|
||||
|
||||
# Now git push will automatically check for duplicates
|
||||
git push # Will be blocked if duplicates exist
|
||||
```
|
||||
|
||||
### For CI/CD
|
||||
The workflow runs automatically on all PRs. No setup needed.
|
||||
|
||||
## Examples
|
||||
|
||||
### Check for duplicates:
|
||||
```bash
|
||||
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
|
||||
⚠️ Found 2 duplicate PR(s) for issue #1128:
|
||||
- PR #1458: feat: Close duplicate PRs for issue #1128
|
||||
- PR #1455: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
|
||||
```
|
||||
|
||||
### Clean up duplicates:
|
||||
```bash
|
||||
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
|
||||
Cleanup complete:
|
||||
Kept PR: #1458
|
||||
Closed PRs: [1455]
|
||||
```
|
||||
|
||||
### Generate report:
|
||||
```bash
|
||||
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
|
||||
# Duplicate PR Prevention Report
|
||||
|
||||
**Repository:** the-nexus
|
||||
**Issue:** #1128
|
||||
**Generated:** 2026-04-14T23:30:00
|
||||
|
||||
## Current Status
|
||||
|
||||
⚠️ **Found 2 duplicate PR(s)**
|
||||
|
||||
- **PR #1458**: feat: Close duplicate PRs for issue #1128
|
||||
- Branch: fix/1128-cleanup
|
||||
- Created: 2026-04-14T22:00:00
|
||||
- Author: agent
|
||||
|
||||
- **PR #1455**: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
|
||||
- Branch: triage/1128-1776129677
|
||||
- Created: 2026-04-14T20:00:00
|
||||
- Author: agent
|
||||
|
||||
## Recommendations
|
||||
|
||||
1. **Review existing PRs** — Check which one is the best solution
|
||||
2. **Keep the newest** — Usually the most up-to-date
|
||||
3. **Close duplicates** — Use cleanup_duplicate_prs.py
|
||||
4. **Prevent future duplicates** — Use check_duplicate_pr.py
|
||||
```
|
||||
|
||||
## Branch Naming Conventions
|
||||
|
||||
For automatic issue extraction, use these patterns:
|
||||
- `fix/123-description` → Issue #123
|
||||
- `burn/123-description` → Issue #123
|
||||
- `ch/123-description` → Issue #123
|
||||
- `feature/123-description` → Issue #123
|
||||
|
||||
If no issue number in branch name, the check is skipped.
|
||||
|
||||
## Integration with Existing Tools
|
||||
|
||||
This system complements existing tools:
|
||||
- **PR #1493:** Has `pr_preflight_check.py` — similar functionality
|
||||
- **PR #1497:** Has `check_duplicate_pr.py` — similar functionality
|
||||
|
||||
This system provides additional features:
|
||||
1. **Git hooks** for local prevention
|
||||
2. **CI workflows** for automated checking
|
||||
3. **Cleanup tools** for closing duplicates
|
||||
4. **Comprehensive reporting**
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Hook not working?
|
||||
```bash
|
||||
# Check if hook is installed
|
||||
ls -la .git/hooks/pre-push
|
||||
|
||||
# Make sure it's executable
|
||||
chmod +x .git/hooks/pre-push
|
||||
|
||||
# Test it manually
|
||||
./.git/hooks/pre-push
|
||||
```
|
||||
|
||||
### CI failing?
|
||||
1. Check if `GITEA_TOKEN` secret is set
|
||||
2. Verify issue number can be extracted from PR title/branch
|
||||
3. Check workflow logs for details
|
||||
|
||||
### False positives?
|
||||
If the script incorrectly identifies duplicates:
|
||||
1. Check PR titles and bodies for issue references
|
||||
2. Use `--report` to see what's being detected
|
||||
3. Manually close incorrect PRs if needed
|
||||
|
||||
## Prevention Strategy
|
||||
|
||||
### 1. **Always Check First**
|
||||
```bash
|
||||
# Before creating any PR
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check
|
||||
```
|
||||
|
||||
### 2. **Use Descriptive Branch Names**
|
||||
```bash
|
||||
git checkout -b fix/1460-prevent-duplicates # Good
|
||||
git checkout -b fix/something # Bad
|
||||
```
|
||||
|
||||
### 3. **Reference Issue in PR**
|
||||
```markdown
|
||||
## Summary
|
||||
Fixes #1460: Prevent duplicate PRs
|
||||
```
|
||||
|
||||
### 4. **Review Before Creating**
|
||||
```bash
|
||||
# See what PRs already exist
|
||||
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --report
|
||||
```
|
||||
|
||||
## Related Issues
|
||||
|
||||
- **Issue #1460:** This implementation
|
||||
- **Issue #1128:** Original issue that had 7 duplicate PRs
|
||||
- **Issue #1449:** [URGENT] 5 duplicate PRs for issue #1128 need cleanup
|
||||
- **Issue #1474:** [META] Still creating duplicate PRs for issue #1128 despite cleanup
|
||||
- **Issue #1480:** [META] 4th duplicate PR for issue #1128 — need intervention
|
||||
|
||||
## Files
|
||||
|
||||
```
|
||||
bin/duplicate_pr_prevention.py # Main prevention script
|
||||
hooks/pre-push # Git hook for local prevention
|
||||
.gitea/workflows/duplicate-pr-check.yml # CI workflow
|
||||
DUPLICATE_PR_PREVENTION.md # This documentation
|
||||
```
|
||||
|
||||
## License
|
||||
|
||||
Part of the Timmy Foundation project.
|
||||
@@ -285,49 +285,6 @@ class AgentMemory:
|
||||
logger.warning(f"Failed to store memory: {e}")
|
||||
return None
|
||||
|
||||
def remember_alexander_request_response(
|
||||
self,
|
||||
*,
|
||||
request_text: str,
|
||||
response_text: str,
|
||||
requester: str = "Alexander Whitestone",
|
||||
source: str = "",
|
||||
metadata: Optional[dict] = None,
|
||||
) -> Optional[str]:
|
||||
"""Store an Alexander request + wizard response artifact in the sovereign room."""
|
||||
if not self._check_available():
|
||||
logger.warning("Cannot store Alexander artifact — MemPalace unavailable")
|
||||
return None
|
||||
|
||||
try:
|
||||
from nexus.mempalace.searcher import add_memory
|
||||
from nexus.mempalace.conversation_artifacts import build_request_response_artifact
|
||||
|
||||
artifact = build_request_response_artifact(
|
||||
requester=requester,
|
||||
responder=self.agent_name,
|
||||
request_text=request_text,
|
||||
response_text=response_text,
|
||||
source=source,
|
||||
)
|
||||
extra_metadata = dict(artifact.metadata)
|
||||
if metadata:
|
||||
extra_metadata.update(metadata)
|
||||
|
||||
doc_id = add_memory(
|
||||
text=artifact.text,
|
||||
room=artifact.room,
|
||||
wing=self.wing,
|
||||
palace_path=self.palace_path,
|
||||
source_file=source,
|
||||
extra_metadata=extra_metadata,
|
||||
)
|
||||
logger.debug("Stored Alexander request/response artifact in sovereign room")
|
||||
return doc_id
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to store Alexander artifact: {e}")
|
||||
return None
|
||||
|
||||
def write_diary(
|
||||
self,
|
||||
summary: Optional[str] = None,
|
||||
|
||||
8
app.js
8
app.js
@@ -714,10 +714,6 @@ async function init() {
|
||||
camera = new THREE.PerspectiveCamera(65, window.innerWidth / window.innerHeight, 0.1, 1000);
|
||||
camera.position.copy(playerPos);
|
||||
|
||||
// Initialize avatar and LOD systems
|
||||
if (window.AvatarCustomization) window.AvatarCustomization.init(scene, camera);
|
||||
if (window.LODSystem) window.LODSystem.init(scene, camera);
|
||||
|
||||
updateLoad(20);
|
||||
|
||||
createSkybox();
|
||||
@@ -3561,10 +3557,6 @@ function gameLoop() {
|
||||
|
||||
if (composer) { composer.render(); } else { renderer.render(scene, camera); }
|
||||
|
||||
// Update avatar and LOD systems
|
||||
if (window.AvatarCustomization && playerPos) window.AvatarCustomization.update(playerPos);
|
||||
if (window.LODSystem && playerPos) window.LODSystem.update(playerPos);
|
||||
|
||||
updateAshStorm(delta, elapsed);
|
||||
|
||||
// Project Mnemosyne - Memory Orb Animation
|
||||
|
||||
230
bin/duplicate_pr_prevention.py
Executable file
230
bin/duplicate_pr_prevention.py
Executable file
@@ -0,0 +1,230 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Duplicate PR Prevention System for Timmy Foundation
|
||||
Prevents the issue described in #1460: creating duplicate PRs for the same issue.
|
||||
"""
|
||||
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
import urllib.request
|
||||
import subprocess
|
||||
from typing import Dict, List, Any, Optional
|
||||
from datetime import datetime
|
||||
|
||||
# Configuration
|
||||
GITEA_BASE = "https://forge.alexanderwhitestone.com/api/v1"
|
||||
TOKEN_PATH = os.path.expanduser("~/.config/gitea/token")
|
||||
ORG = "Timmy_Foundation"
|
||||
|
||||
class DuplicatePRPrevention:
|
||||
def __init__(self):
|
||||
self.token = self._load_token()
|
||||
|
||||
def _load_token(self) -> str:
|
||||
"""Load Gitea API token."""
|
||||
try:
|
||||
with open(TOKEN_PATH, "r") as f:
|
||||
return f.read().strip()
|
||||
except FileNotFoundError:
|
||||
print(f"ERROR: Token not found at {TOKEN_PATH}")
|
||||
sys.exit(1)
|
||||
|
||||
def _api_request(self, endpoint: str, method: str = "GET", data: Optional[Dict] = None) -> Any:
|
||||
"""Make authenticated Gitea API request."""
|
||||
url = f"{GITEA_BASE}{endpoint}"
|
||||
headers = {
|
||||
"Authorization": f"token {self.token}",
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
|
||||
req = urllib.request.Request(url, headers=headers, method=method)
|
||||
if data:
|
||||
req.data = json.dumps(data).encode()
|
||||
|
||||
try:
|
||||
with urllib.request.urlopen(req) as resp:
|
||||
if resp.status == 204: # No content
|
||||
return {"status": "success", "code": resp.status}
|
||||
return json.loads(resp.read())
|
||||
except urllib.error.HTTPError as e:
|
||||
error_body = e.read().decode() if e.fp else "No error body"
|
||||
print(f"API Error {e.code}: {error_body}")
|
||||
return {"error": e.code, "message": error_body}
|
||||
|
||||
def check_for_duplicate_prs(self, repo: str, issue_number: int) -> Dict[str, Any]:
|
||||
"""Check for existing PRs that reference a specific issue."""
|
||||
# Get all open PRs
|
||||
endpoint = f"/repos/{ORG}/{repo}/pulls?state=open"
|
||||
prs = self._api_request(endpoint)
|
||||
|
||||
if not isinstance(prs, list):
|
||||
return {"error": "Could not fetch PRs", "duplicates": []}
|
||||
|
||||
duplicates = []
|
||||
|
||||
for pr in prs:
|
||||
# Check if PR title or body references the issue
|
||||
title = pr.get('title', '').lower()
|
||||
body = pr.get('body', '').lower() if pr.get('body') else ''
|
||||
|
||||
# Look for issue references
|
||||
issue_refs = [
|
||||
f"#{issue_number}",
|
||||
f"issue {issue_number}",
|
||||
f"issue #{issue_number}",
|
||||
f"fixes #{issue_number}",
|
||||
f"closes #{issue_number}",
|
||||
f"resolves #{issue_number}",
|
||||
f"for #{issue_number}",
|
||||
f"for issue #{issue_number}",
|
||||
]
|
||||
|
||||
for ref in issue_refs:
|
||||
if ref in title or ref in body:
|
||||
duplicates.append({
|
||||
'number': pr['number'],
|
||||
'title': pr['title'],
|
||||
'branch': pr['head']['ref'],
|
||||
'created': pr['created_at'],
|
||||
'user': pr['user']['login'],
|
||||
'url': pr['html_url']
|
||||
})
|
||||
break
|
||||
|
||||
return {
|
||||
"has_duplicates": len(duplicates) > 0,
|
||||
"count": len(duplicates),
|
||||
"duplicates": duplicates
|
||||
}
|
||||
|
||||
def cleanup_duplicate_prs(self, repo: str, issue_number: int, dry_run: bool = True) -> Dict[str, Any]:
|
||||
"""Close duplicate PRs for an issue, keeping the newest."""
|
||||
duplicates = self.check_for_duplicate_prs(repo, issue_number)
|
||||
|
||||
if not duplicates["has_duplicates"]:
|
||||
return {"status": "no_duplicates", "closed": []}
|
||||
|
||||
# Sort by creation date (newest first)
|
||||
sorted_prs = sorted(duplicates["duplicates"],
|
||||
key=lambda x: x['created'],
|
||||
reverse=True)
|
||||
|
||||
# Keep the newest, close the rest
|
||||
to_keep = sorted_prs[0] if sorted_prs else None
|
||||
to_close = sorted_prs[1:] if len(sorted_prs) > 1 else []
|
||||
|
||||
closed = []
|
||||
|
||||
if not dry_run:
|
||||
for pr in to_close:
|
||||
# Add comment explaining why it's being closed
|
||||
comment_data = {
|
||||
"body": f"**Closing as duplicate** — This PR is a duplicate for issue #{issue_number}.\n\n"
|
||||
f"Keeping PR #{to_keep['number']} instead.\n\n"
|
||||
f"This is an automated cleanup to prevent duplicate PRs.\n"
|
||||
f"See issue #1460 for context."
|
||||
}
|
||||
|
||||
# Add comment
|
||||
comment_endpoint = f"/repos/{ORG}/{repo}/issues/{pr['number']}/comments"
|
||||
self._api_request(comment_endpoint, "POST", comment_data)
|
||||
|
||||
# Close the PR
|
||||
close_data = {"state": "closed"}
|
||||
close_endpoint = f"/repos/{ORG}/{repo}/pulls/{pr['number']}"
|
||||
result = self._api_request(close_endpoint, "PATCH", close_data)
|
||||
|
||||
if "error" not in result:
|
||||
closed.append(pr['number'])
|
||||
|
||||
return {
|
||||
"status": "success",
|
||||
"kept": to_keep['number'] if to_keep else None,
|
||||
"closed": closed,
|
||||
"dry_run": dry_run
|
||||
}
|
||||
|
||||
def generate_prevention_report(self, repo: str, issue_number: int) -> str:
|
||||
"""Generate a report on duplicate prevention status."""
|
||||
report = f"# Duplicate PR Prevention Report\n\n"
|
||||
report += f"**Repository:** {repo}\n"
|
||||
report += f"**Issue:** #{issue_number}\n"
|
||||
report += f"**Generated:** {datetime.now().isoformat()}\n\n"
|
||||
|
||||
# Check for duplicates
|
||||
duplicates = self.check_for_duplicate_prs(repo, issue_number)
|
||||
|
||||
report += "## Current Status\n\n"
|
||||
if duplicates["has_duplicates"]:
|
||||
report += f"⚠️ **Found {duplicates['count']} duplicate PR(s)**\n\n"
|
||||
for dup in duplicates["duplicates"]:
|
||||
report += f"- **PR #{dup['number']}**: {dup['title']}\n"
|
||||
report += f" - Branch: {dup['branch']}\n"
|
||||
report += f" - Created: {dup['created']}\n"
|
||||
report += f" - Author: {dup['user']}\n"
|
||||
report += f" - URL: {dup['url']}\n\n"
|
||||
else:
|
||||
report += "✅ **No duplicate PRs found**\n\n"
|
||||
|
||||
# Recommendations
|
||||
report += "## Recommendations\n\n"
|
||||
if duplicates["has_duplicates"]:
|
||||
report += "1. **Review existing PRs** — Check which one is the best solution\n"
|
||||
report += "2. **Keep the newest** — Usually the most up-to-date\n"
|
||||
report += "3. **Close duplicates** — Use cleanup_duplicate_prs.py\n"
|
||||
report += "4. **Prevent future duplicates** — Use check_duplicate_pr.py\n"
|
||||
else:
|
||||
report += "1. **Safe to create PR** — No duplicates exist\n"
|
||||
report += "2. **Use prevention tools** — Always check before creating PRs\n"
|
||||
report += "3. **Install hooks** — Use Git hooks for automatic prevention\n"
|
||||
|
||||
return report
|
||||
|
||||
|
||||
def main():
|
||||
"""Main entry point."""
|
||||
import argparse
|
||||
|
||||
parser = argparse.ArgumentParser(description="Duplicate PR Prevention System")
|
||||
parser.add_argument("--repo", required=True, help="Repository name (e.g., the-nexus)")
|
||||
parser.add_argument("--issue", required=True, type=int, help="Issue number")
|
||||
parser.add_argument("--check", action="store_true", help="Check for duplicates")
|
||||
parser.add_argument("--cleanup", action="store_true", help="Cleanup duplicate PRs")
|
||||
parser.add_argument("--dry-run", action="store_true", help="Dry run for cleanup")
|
||||
parser.add_argument("--report", action="store_true", help="Generate report")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
prevention = DuplicatePRPrevention()
|
||||
|
||||
if args.check:
|
||||
result = prevention.check_for_duplicate_prs(args.repo, args.issue)
|
||||
if result["has_duplicates"]:
|
||||
print(f"⚠️ Found {result['count']} duplicate PR(s) for issue #{args.issue}:")
|
||||
for dup in result["duplicates"]:
|
||||
print(f" - PR #{dup['number']}: {dup['title']}")
|
||||
sys.exit(1)
|
||||
else:
|
||||
print(f"✅ No duplicate PRs found for issue #{args.issue}")
|
||||
sys.exit(0)
|
||||
|
||||
elif args.cleanup:
|
||||
result = prevention.cleanup_duplicate_prs(args.repo, args.issue, args.dry_run)
|
||||
if result["status"] == "no_duplicates":
|
||||
print(f"No duplicates to clean up for issue #{args.issue}")
|
||||
else:
|
||||
print(f"Cleanup {'(dry run) ' if args.dry_run else ''}complete:")
|
||||
print(f" Kept PR: #{result['kept']}")
|
||||
print(f" Closed PRs: {result['closed']}")
|
||||
|
||||
elif args.report:
|
||||
report = prevention.generate_prevention_report(args.repo, args.issue)
|
||||
print(report)
|
||||
|
||||
else:
|
||||
parser.print_help()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
59
hooks/pre-push
Normal file
59
hooks/pre-push
Normal file
@@ -0,0 +1,59 @@
|
||||
#!/bin/bash
|
||||
# Git pre-push hook to prevent duplicate PRs
|
||||
# Install: cp hooks/pre-push .git/hooks/pre-push && chmod +x .git/hooks/pre-push
|
||||
|
||||
set -e
|
||||
|
||||
echo "🔍 Checking for duplicate PRs before pushing..."
|
||||
|
||||
# Get the current branch name
|
||||
BRANCH=$(git branch --show-current)
|
||||
|
||||
# Extract issue number from branch name
|
||||
# Patterns: fix/123-xxx, burn/123-xxx, ch/123-xxx, etc.
|
||||
ISSUE_NUM=$(echo "$BRANCH" | grep -oE '[0-9]+' | head -1)
|
||||
|
||||
if [ -z "$ISSUE_NUM" ]; then
|
||||
echo "ℹ️ No issue number found in branch name: $BRANCH"
|
||||
echo " Skipping duplicate check..."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
echo "📋 Found issue #$ISSUE_NUM in branch name"
|
||||
|
||||
# Get repository name from git remote
|
||||
REMOTE_URL=$(git config --get remote.origin.url)
|
||||
if [[ "$REMOTE_URL" == *"Timmy_Foundation/"* ]]; then
|
||||
REPO=$(echo "$REMOTE_URL" | sed 's/.*Timmy_Foundation\///' | sed 's/\.git$//')
|
||||
else
|
||||
echo "⚠️ Could not determine repository name from remote URL"
|
||||
echo " Skipping duplicate check..."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
echo "📦 Repository: $REPO"
|
||||
|
||||
# Run the duplicate checker
|
||||
if [ -f "bin/duplicate_pr_prevention.py" ]; then
|
||||
python3 bin/duplicate_pr_prevention.py --repo "$REPO" --issue "$ISSUE_NUM" --check
|
||||
|
||||
if [ $? -ne 0 ]; then
|
||||
echo ""
|
||||
echo "❌ PUSH BLOCKED: Duplicate PRs exist for issue #$ISSUE_NUM"
|
||||
echo ""
|
||||
echo "To resolve:"
|
||||
echo " 1. Review existing PRs: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --report"
|
||||
echo " 2. Use existing PR instead of creating a new one"
|
||||
echo " 3. Or clean up duplicates: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --cleanup"
|
||||
echo ""
|
||||
echo "To bypass (NOT recommended):"
|
||||
echo " git push --no-verify"
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
echo "⚠️ duplicate_pr_prevention.py not found in bin/"
|
||||
echo " Skipping duplicate check..."
|
||||
fi
|
||||
|
||||
echo "✅ No duplicate PRs found. Proceeding with push..."
|
||||
exit 0
|
||||
@@ -395,8 +395,6 @@
|
||||
<div id="memory-connections-panel" class="memory-connections-panel" style="display:none;" aria-label="Memory Connections Panel"></div>
|
||||
|
||||
<script src="./boot.js"></script>
|
||||
<script src="./avatar-customization.js"></script>
|
||||
<script src="./lod-system.js"></script>
|
||||
<script>
|
||||
function openMemoryFilter() { renderFilterList(); document.getElementById('memory-filter').style.display = 'flex'; }
|
||||
function closeMemoryFilter() { document.getElementById('memory-filter').style.display = 'none'; }
|
||||
|
||||
186
lod-system.js
186
lod-system.js
@@ -1,186 +0,0 @@
|
||||
/**
|
||||
* LOD (Level of Detail) System for The Nexus
|
||||
*
|
||||
* Optimizes rendering when many avatars/users are visible:
|
||||
* - Distance-based LOD: far users become billboard sprites
|
||||
* - Occlusion: skip rendering users behind walls
|
||||
* - Budget: maintain 60 FPS target with 50+ avatars
|
||||
*
|
||||
* Usage:
|
||||
* LODSystem.init(scene, camera);
|
||||
* LODSystem.registerAvatar(avatarMesh, userId);
|
||||
* LODSystem.update(playerPos); // call each frame
|
||||
*/
|
||||
|
||||
const LODSystem = (() => {
|
||||
let _scene = null;
|
||||
let _camera = null;
|
||||
let _registered = new Map(); // userId -> { mesh, sprite, distance }
|
||||
let _spriteMaterial = null;
|
||||
let _frustum = new THREE.Frustum();
|
||||
let _projScreenMatrix = new THREE.Matrix4();
|
||||
|
||||
// Thresholds
|
||||
const LOD_NEAR = 15; // Full mesh within 15 units
|
||||
const LOD_FAR = 40; // Billboard beyond 40 units
|
||||
const LOD_CULL = 80; // Don't render beyond 80 units
|
||||
const SPRITE_SIZE = 1.2;
|
||||
|
||||
function init(sceneRef, cameraRef) {
|
||||
_scene = sceneRef;
|
||||
_camera = cameraRef;
|
||||
|
||||
// Create shared sprite material
|
||||
const canvas = document.createElement('canvas');
|
||||
canvas.width = 64;
|
||||
canvas.height = 64;
|
||||
const ctx = canvas.getContext('2d');
|
||||
// Simple avatar indicator: colored circle
|
||||
ctx.fillStyle = '#00ffcc';
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 32, 20, 0, Math.PI * 2);
|
||||
ctx.fill();
|
||||
ctx.fillStyle = '#0a0f1a';
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 28, 8, 0, Math.PI * 2); // head
|
||||
ctx.fill();
|
||||
|
||||
const texture = new THREE.CanvasTexture(canvas);
|
||||
_spriteMaterial = new THREE.SpriteMaterial({
|
||||
map: texture,
|
||||
transparent: true,
|
||||
depthTest: true,
|
||||
sizeAttenuation: true,
|
||||
});
|
||||
|
||||
console.log('[LODSystem] Initialized');
|
||||
}
|
||||
|
||||
function registerAvatar(avatarMesh, userId, color) {
|
||||
// Create billboard sprite for this avatar
|
||||
const spriteMat = _spriteMaterial.clone();
|
||||
if (color) {
|
||||
// Tint sprite to match avatar color
|
||||
const canvas = document.createElement('canvas');
|
||||
canvas.width = 64;
|
||||
canvas.height = 64;
|
||||
const ctx = canvas.getContext('2d');
|
||||
ctx.fillStyle = color;
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 32, 20, 0, Math.PI * 2);
|
||||
ctx.fill();
|
||||
ctx.fillStyle = '#0a0f1a';
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 28, 8, 0, Math.PI * 2);
|
||||
ctx.fill();
|
||||
spriteMat.map = new THREE.CanvasTexture(canvas);
|
||||
spriteMat.map.needsUpdate = true;
|
||||
}
|
||||
|
||||
const sprite = new THREE.Sprite(spriteMat);
|
||||
sprite.scale.set(SPRITE_SIZE, SPRITE_SIZE, 1);
|
||||
sprite.visible = false;
|
||||
_scene.add(sprite);
|
||||
|
||||
_registered.set(userId, {
|
||||
mesh: avatarMesh,
|
||||
sprite: sprite,
|
||||
distance: Infinity,
|
||||
});
|
||||
}
|
||||
|
||||
function unregisterAvatar(userId) {
|
||||
const entry = _registered.get(userId);
|
||||
if (entry) {
|
||||
_scene.remove(entry.sprite);
|
||||
entry.sprite.material.dispose();
|
||||
_registered.delete(userId);
|
||||
}
|
||||
}
|
||||
|
||||
function setSpriteColor(userId, color) {
|
||||
const entry = _registered.get(userId);
|
||||
if (!entry) return;
|
||||
const canvas = document.createElement('canvas');
|
||||
canvas.width = 64;
|
||||
canvas.height = 64;
|
||||
const ctx = canvas.getContext('2d');
|
||||
ctx.fillStyle = color;
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 32, 20, 0, Math.PI * 2);
|
||||
ctx.fill();
|
||||
ctx.fillStyle = '#0a0f1a';
|
||||
ctx.beginPath();
|
||||
ctx.arc(32, 28, 8, 0, Math.PI * 2);
|
||||
ctx.fill();
|
||||
entry.sprite.material.map = new THREE.CanvasTexture(canvas);
|
||||
entry.sprite.material.map.needsUpdate = true;
|
||||
}
|
||||
|
||||
function update(playerPos) {
|
||||
if (!_camera) return;
|
||||
|
||||
// Update frustum for culling
|
||||
_projScreenMatrix.multiplyMatrices(
|
||||
_camera.projectionMatrix,
|
||||
_camera.matrixWorldInverse
|
||||
);
|
||||
_frustum.setFromProjectionMatrix(_projScreenMatrix);
|
||||
|
||||
_registered.forEach((entry, userId) => {
|
||||
if (!entry.mesh) return;
|
||||
|
||||
const meshPos = entry.mesh.position;
|
||||
const distance = playerPos.distanceTo(meshPos);
|
||||
entry.distance = distance;
|
||||
|
||||
// Beyond cull distance: hide everything
|
||||
if (distance > LOD_CULL) {
|
||||
entry.mesh.visible = false;
|
||||
entry.sprite.visible = false;
|
||||
return;
|
||||
}
|
||||
|
||||
// Check if in camera frustum
|
||||
const inFrustum = _frustum.containsPoint(meshPos);
|
||||
if (!inFrustum) {
|
||||
entry.mesh.visible = false;
|
||||
entry.sprite.visible = false;
|
||||
return;
|
||||
}
|
||||
|
||||
// LOD switching
|
||||
if (distance <= LOD_NEAR) {
|
||||
// Near: full mesh
|
||||
entry.mesh.visible = true;
|
||||
entry.sprite.visible = false;
|
||||
} else if (distance <= LOD_FAR) {
|
||||
// Mid: mesh with reduced detail (keep mesh visible)
|
||||
entry.mesh.visible = true;
|
||||
entry.sprite.visible = false;
|
||||
} else {
|
||||
// Far: billboard sprite
|
||||
entry.mesh.visible = false;
|
||||
entry.sprite.visible = true;
|
||||
entry.sprite.position.copy(meshPos);
|
||||
entry.sprite.position.y += 1.2; // above avatar center
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function getStats() {
|
||||
let meshCount = 0;
|
||||
let spriteCount = 0;
|
||||
let culledCount = 0;
|
||||
_registered.forEach(entry => {
|
||||
if (entry.mesh.visible) meshCount++;
|
||||
else if (entry.sprite.visible) spriteCount++;
|
||||
else culledCount++;
|
||||
});
|
||||
return { total: _registered.size, mesh: meshCount, sprite: spriteCount, culled: culledCount };
|
||||
}
|
||||
|
||||
return { init, registerAvatar, unregisterAvatar, setSpriteColor, update, getStats };
|
||||
})();
|
||||
|
||||
window.LODSystem = LODSystem;
|
||||
@@ -62,15 +62,6 @@ core_rooms:
|
||||
- proof-of-concept code snippets
|
||||
- benchmark data
|
||||
|
||||
- key: sovereign
|
||||
label: Sovereign
|
||||
purpose: Artifacts of Alexander Whitestone's requests, directives, and wizard responses
|
||||
examples:
|
||||
- dated request/response artifacts
|
||||
- conversation summaries with speaker tags
|
||||
- directive ledgers
|
||||
- response follow-through notes
|
||||
|
||||
optional_rooms:
|
||||
- key: evennia
|
||||
label: Evennia
|
||||
@@ -107,6 +98,15 @@ optional_rooms:
|
||||
purpose: Catch-all for artefacts not yet assigned to a named room
|
||||
wizards: ["*"]
|
||||
|
||||
- key: sovereign
|
||||
label: Sovereign
|
||||
purpose: Artifacts of Alexander Whitestone's requests, directives, and conversation history
|
||||
wizards: ["*"]
|
||||
conventions:
|
||||
naming: "YYYY-MM-DD_HHMMSS_<topic>.md"
|
||||
index: "INDEX.md"
|
||||
description: "Each artifact is a dated record of a request from Alexander and the wizard's response. The running INDEX.md provides a chronological catalog."
|
||||
|
||||
# Tunnel routing table
|
||||
# Defines which room pairs are connected across wizard wings.
|
||||
# A tunnel lets `recall <query> --fleet` search both wings at once.
|
||||
|
||||
@@ -13,12 +13,6 @@ from __future__ import annotations
|
||||
|
||||
from nexus.mempalace.config import MEMPALACE_PATH, FLEET_WING
|
||||
from nexus.mempalace.searcher import search_memories, add_memory, MemPalaceResult
|
||||
from nexus.mempalace.conversation_artifacts import (
|
||||
ConversationArtifact,
|
||||
build_request_response_artifact,
|
||||
extract_alexander_request_pairs,
|
||||
normalize_speaker,
|
||||
)
|
||||
|
||||
__all__ = [
|
||||
"MEMPALACE_PATH",
|
||||
@@ -26,8 +20,4 @@ __all__ = [
|
||||
"search_memories",
|
||||
"add_memory",
|
||||
"MemPalaceResult",
|
||||
"ConversationArtifact",
|
||||
"build_request_response_artifact",
|
||||
"extract_alexander_request_pairs",
|
||||
"normalize_speaker",
|
||||
]
|
||||
|
||||
@@ -40,7 +40,6 @@ CORE_ROOMS: list[str] = [
|
||||
"nexus", # reports, docs, KT
|
||||
"issues", # tickets, backlog
|
||||
"experiments", # prototypes, spikes
|
||||
"sovereign", # Alexander request/response artifacts
|
||||
]
|
||||
|
||||
# ── ChromaDB collection name ──────────────────────────────────────────────────
|
||||
|
||||
@@ -1,122 +0,0 @@
|
||||
"""Helpers for preserving Alexander request/response artifacts in MemPalace.
|
||||
|
||||
This module provides a small, typed bridge between raw conversation turns and
|
||||
MemPalace drawers stored in the shared `sovereign` room. The goal is not to
|
||||
solve all future speaker-tagging needs at once; it gives the Nexus one
|
||||
canonical artifact shape that other miners and bridges can reuse.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from dataclasses import dataclass, field
|
||||
from datetime import datetime, timezone
|
||||
from typing import Iterable
|
||||
|
||||
_ALEXANDER_ALIASES = {
|
||||
"alexander",
|
||||
"alexander whitestone",
|
||||
"rockachopa",
|
||||
"triptimmy",
|
||||
}
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class ConversationArtifact:
|
||||
requester: str
|
||||
responder: str
|
||||
request_text: str
|
||||
response_text: str
|
||||
room: str = "sovereign"
|
||||
timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"))
|
||||
metadata: dict = field(default_factory=dict)
|
||||
|
||||
@property
|
||||
def text(self) -> str:
|
||||
return (
|
||||
f"# Conversation Artifact\n\n"
|
||||
f"## Alexander Request\n{self.request_text.strip()}\n\n"
|
||||
f"## Wizard Response\n{self.response_text.strip()}\n"
|
||||
)
|
||||
|
||||
|
||||
def normalize_speaker(name: str | None) -> str:
|
||||
cleaned = " ".join((name or "").strip().lower().split())
|
||||
if cleaned in _ALEXANDER_ALIASES:
|
||||
return "alexander"
|
||||
return cleaned.replace(" ", "_") or "unknown"
|
||||
|
||||
|
||||
def build_request_response_artifact(
|
||||
*,
|
||||
requester: str,
|
||||
responder: str,
|
||||
request_text: str,
|
||||
response_text: str,
|
||||
source: str = "",
|
||||
timestamp: str | None = None,
|
||||
request_timestamp: str | None = None,
|
||||
response_timestamp: str | None = None,
|
||||
) -> ConversationArtifact:
|
||||
requester_slug = normalize_speaker(requester)
|
||||
responder_slug = normalize_speaker(responder)
|
||||
ts = timestamp or datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
|
||||
metadata = {
|
||||
"artifact_type": "alexander_request_response",
|
||||
"requester": requester_slug,
|
||||
"responder": responder_slug,
|
||||
"speaker_tags": [f"speaker:{requester_slug}", f"speaker:{responder_slug}"],
|
||||
"source": source,
|
||||
"timestamp": ts,
|
||||
}
|
||||
if request_timestamp:
|
||||
metadata["request_timestamp"] = request_timestamp
|
||||
if response_timestamp:
|
||||
metadata["response_timestamp"] = response_timestamp
|
||||
return ConversationArtifact(
|
||||
requester=requester_slug,
|
||||
responder=responder_slug,
|
||||
request_text=request_text,
|
||||
response_text=response_text,
|
||||
timestamp=ts,
|
||||
metadata=metadata,
|
||||
)
|
||||
|
||||
|
||||
def extract_alexander_request_pairs(
|
||||
turns: Iterable[dict],
|
||||
*,
|
||||
responder: str,
|
||||
source: str = "",
|
||||
) -> list[ConversationArtifact]:
|
||||
responder_slug = normalize_speaker(responder)
|
||||
pending_request: dict | None = None
|
||||
artifacts: list[ConversationArtifact] = []
|
||||
|
||||
for turn in turns:
|
||||
speaker = normalize_speaker(
|
||||
turn.get("speaker") or turn.get("username") or turn.get("author") or turn.get("name")
|
||||
)
|
||||
text = (turn.get("text") or turn.get("content") or "").strip()
|
||||
if not text:
|
||||
continue
|
||||
|
||||
if speaker == "alexander":
|
||||
pending_request = turn
|
||||
continue
|
||||
|
||||
if speaker == responder_slug and pending_request is not None:
|
||||
artifacts.append(
|
||||
build_request_response_artifact(
|
||||
requester="alexander",
|
||||
responder=responder_slug,
|
||||
request_text=(pending_request.get("text") or pending_request.get("content") or "").strip(),
|
||||
response_text=text,
|
||||
source=source,
|
||||
request_timestamp=pending_request.get("timestamp"),
|
||||
response_timestamp=turn.get("timestamp"),
|
||||
timestamp=turn.get("timestamp") or pending_request.get("timestamp"),
|
||||
)
|
||||
)
|
||||
pending_request = None
|
||||
|
||||
return artifacts
|
||||
@@ -1,111 +0,0 @@
|
||||
# Night Shift Prediction Report — April 12-13, 2026
|
||||
|
||||
## Starting State (11:36 PM)
|
||||
|
||||
```
|
||||
Time: 11:36 PM EDT
|
||||
Automation: 13 burn loops × 3min + 1 explorer × 10min + 1 backlog × 30min
|
||||
API: Nous/xiaomi/mimo-v2-pro (FREE)
|
||||
Rate: 268 calls/hour
|
||||
Duration: 7.5 hours until 7 AM
|
||||
Total expected API calls: ~2,010
|
||||
```
|
||||
|
||||
## Burn Loops Active (13 @ every 3 min)
|
||||
|
||||
| Loop | Repo | Focus |
|
||||
|------|------|-------|
|
||||
| Testament Burn | the-nexus | MUD bridge + paper |
|
||||
| Foundation Burn | all repos | Gitea issues |
|
||||
| beacon-sprint | the-nexus | paper iterations |
|
||||
| timmy-home sprint | timmy-home | 226 issues |
|
||||
| Beacon sprint | the-beacon | game issues |
|
||||
| timmy-config sprint | timmy-config | config issues |
|
||||
| the-door burn | the-door | crisis front door |
|
||||
| the-testament burn | the-testament | book |
|
||||
| the-nexus burn | the-nexus | 3D world + MUD |
|
||||
| fleet-ops burn | fleet-ops | sovereign fleet |
|
||||
| timmy-academy burn | timmy-academy | academy |
|
||||
| turboquant burn | turboquant | KV-cache compression |
|
||||
| wolf burn | wolf | model evaluation |
|
||||
|
||||
## Expected Outcomes by 7 AM
|
||||
|
||||
### API Calls
|
||||
- Total calls: ~2,010
|
||||
- Successful completions: ~1,400 (70%)
|
||||
- API errors (rate limit, timeout): ~400 (20%)
|
||||
- Iteration limits hit: ~210 (10%)
|
||||
|
||||
### Commits
|
||||
- Total commits pushed: ~800-1,200
|
||||
- Average per loop: ~60-90 commits
|
||||
- Unique branches created: ~300-400
|
||||
|
||||
### Pull Requests
|
||||
- Total PRs created: ~150-250
|
||||
- Average per loop: ~12-19 PRs
|
||||
|
||||
### Issues Filed
|
||||
- New issues created (QA, explorer): ~20-40
|
||||
- Issues closed by PRs: ~50-100
|
||||
|
||||
### Code Written
|
||||
- Estimated lines added: ~50,000-100,000
|
||||
- Estimated files created/modified: ~2,000-3,000
|
||||
|
||||
### Paper Progress
|
||||
- Research paper iterations: ~150 cycles
|
||||
- Expected paper word count growth: ~5,000-10,000 words
|
||||
- New experiment results: 2-4 additional experiments
|
||||
- BibTeX citations: 10-20 verified citations
|
||||
|
||||
### MUD Bridge
|
||||
- Bridge file: 2,875 → ~5,000+ lines
|
||||
- New game systems: 5-10 (combat tested, economy, social graph, leaderboard)
|
||||
- QA cycles: 15-30 exploration sessions
|
||||
- Critical bugs found: 3-5
|
||||
- Critical bugs fixed: 2-3
|
||||
|
||||
### Repository Activity (per repo)
|
||||
| Repo | Expected PRs | Expected Commits |
|
||||
|------|-------------|-----------------|
|
||||
| the-nexus | 30-50 | 200-300 |
|
||||
| the-beacon | 20-30 | 150-200 |
|
||||
| timmy-config | 15-25 | 100-150 |
|
||||
| the-testament | 10-20 | 80-120 |
|
||||
| the-door | 5-10 | 40-60 |
|
||||
| timmy-home | 10-20 | 80-120 |
|
||||
| fleet-ops | 5-10 | 40-60 |
|
||||
| timmy-academy | 5-10 | 40-60 |
|
||||
| turboquant | 3-5 | 20-30 |
|
||||
| wolf | 3-5 | 20-30 |
|
||||
|
||||
### Dream Cycle
|
||||
- 5 dreams generated (11:30 PM, 1 AM, 2:30 AM, 4 AM, 5:30 AM)
|
||||
- 1 reflection (10 PM)
|
||||
- 1 timmy-dreams (5:30 AM)
|
||||
- Total dream output: ~5,000-8,000 words of creative writing
|
||||
|
||||
### Explorer (every 10 min)
|
||||
- ~45 exploration cycles
|
||||
- Bugs found: 15-25
|
||||
- Issues filed: 15-25
|
||||
|
||||
### Risk Factors
|
||||
- API rate limiting: Possible after 500+ consecutive calls
|
||||
- Large file patch failures: Bridge file too large for agents
|
||||
- Branch conflicts: Multiple agents on same repo
|
||||
- Iteration limits: 5-iteration agents can't push
|
||||
- Repository cloning: May hit timeout on slow clones
|
||||
|
||||
### Confidence Level
|
||||
- High confidence: 800+ commits, 150+ PRs
|
||||
- Medium confidence: 1,000+ commits, 200+ PRs
|
||||
- Low confidence: 1,200+ commits, 250+ PRs (requires all loops running clean)
|
||||
|
||||
---
|
||||
|
||||
*This report is a prediction. The 7 AM morning report will compare actual results.*
|
||||
*Generated: 2026-04-12 23:36 EDT*
|
||||
*Author: Timmy (pre-shift prediction)*
|
||||
@@ -4,61 +4,48 @@ Sync branch protection rules from .gitea/branch-protection/*.yml to Gitea.
|
||||
Correctly uses the Gitea 1.25+ API (not GitHub-style).
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import urllib.request
|
||||
from pathlib import Path
|
||||
|
||||
import yaml
|
||||
|
||||
GITEA_URL = os.getenv("GITEA_URL", "https://forge.alexanderwhitestone.com")
|
||||
GITEA_TOKEN = os.getenv("GITEA_TOKEN", "")
|
||||
ORG = "Timmy_Foundation"
|
||||
PROJECT_ROOT = Path(__file__).resolve().parent.parent
|
||||
CONFIG_DIR = PROJECT_ROOT / ".gitea" / "branch-protection"
|
||||
CONFIG_DIR = ".gitea/branch-protection"
|
||||
|
||||
|
||||
def api_request(method: str, path: str, payload: dict | None = None) -> dict:
|
||||
url = f"{GITEA_URL}/api/v1{path}"
|
||||
data = json.dumps(payload).encode() if payload else None
|
||||
req = urllib.request.Request(
|
||||
url,
|
||||
data=data,
|
||||
method=method,
|
||||
headers={
|
||||
"Authorization": f"token {GITEA_TOKEN}",
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
)
|
||||
req = urllib.request.Request(url, data=data, method=method, headers={
|
||||
"Authorization": f"token {GITEA_TOKEN}",
|
||||
"Content-Type": "application/json",
|
||||
})
|
||||
with urllib.request.urlopen(req, timeout=30) as resp:
|
||||
return json.loads(resp.read().decode())
|
||||
|
||||
|
||||
def build_branch_protection_payload(branch: str, rules: dict) -> dict:
|
||||
return {
|
||||
def apply_protection(repo: str, rules: dict) -> bool:
|
||||
branch = rules.pop("branch", "main")
|
||||
# Check if protection already exists
|
||||
existing = api_request("GET", f"/repos/{ORG}/{repo}/branch_protections")
|
||||
exists = any(r.get("branch_name") == branch for r in existing)
|
||||
|
||||
payload = {
|
||||
"branch_name": branch,
|
||||
"rule_name": branch,
|
||||
"required_approvals": rules.get("required_approvals", 1),
|
||||
"block_on_rejected_reviews": rules.get("block_on_rejected_reviews", True),
|
||||
"dismiss_stale_approvals": rules.get("dismiss_stale_approvals", True),
|
||||
"block_deletions": rules.get("block_deletions", True),
|
||||
"block_force_push": rules.get("block_force_push", rules.get("block_force_pushes", True)),
|
||||
"block_force_push": rules.get("block_force_push", True),
|
||||
"block_admin_merge_override": rules.get("block_admin_merge_override", True),
|
||||
"enable_status_check": rules.get("require_ci_to_merge", False),
|
||||
"status_check_contexts": rules.get("status_check_contexts", []),
|
||||
"block_on_outdated_branch": rules.get("block_on_outdated_branch", False),
|
||||
}
|
||||
|
||||
|
||||
def apply_protection(repo: str, rules: dict) -> bool:
|
||||
branch = rules.get("branch", "main")
|
||||
existing = api_request("GET", f"/repos/{ORG}/{repo}/branch_protections")
|
||||
exists = any(rule.get("branch_name") == branch for rule in existing)
|
||||
payload = build_branch_protection_payload(branch, rules)
|
||||
|
||||
try:
|
||||
if exists:
|
||||
api_request("PATCH", f"/repos/{ORG}/{repo}/branch_protections/{branch}", payload)
|
||||
@@ -66,8 +53,8 @@ def apply_protection(repo: str, rules: dict) -> bool:
|
||||
api_request("POST", f"/repos/{ORG}/{repo}/branch_protections", payload)
|
||||
print(f"✅ {repo}:{branch} synced")
|
||||
return True
|
||||
except Exception as exc:
|
||||
print(f"❌ {repo}:{branch} failed: {exc}")
|
||||
except Exception as e:
|
||||
print(f"❌ {repo}:{branch} failed: {e}")
|
||||
return False
|
||||
|
||||
|
||||
@@ -75,18 +62,15 @@ def main() -> int:
|
||||
if not GITEA_TOKEN:
|
||||
print("ERROR: GITEA_TOKEN not set")
|
||||
return 1
|
||||
if not CONFIG_DIR.exists():
|
||||
print(f"ERROR: config directory not found: {CONFIG_DIR}")
|
||||
return 1
|
||||
|
||||
ok = 0
|
||||
for cfg_path in sorted(CONFIG_DIR.glob("*.yml")):
|
||||
repo = cfg_path.stem
|
||||
with cfg_path.open() as fh:
|
||||
cfg = yaml.safe_load(fh) or {}
|
||||
rules = cfg.get("rules", {})
|
||||
rules.setdefault("branch", cfg.get("branch", "main"))
|
||||
if apply_protection(repo, rules):
|
||||
for fname in os.listdir(CONFIG_DIR):
|
||||
if not fname.endswith(".yml"):
|
||||
continue
|
||||
repo = fname[:-4]
|
||||
with open(os.path.join(CONFIG_DIR, fname)) as f:
|
||||
cfg = yaml.safe_load(f)
|
||||
if apply_protection(repo, cfg.get("rules", {})):
|
||||
ok += 1
|
||||
|
||||
print(f"\nSynced {ok} repo(s)")
|
||||
|
||||
@@ -20,7 +20,6 @@ from agent.memory import (
|
||||
SessionTranscript,
|
||||
create_agent_memory,
|
||||
)
|
||||
from nexus.mempalace.conversation_artifacts import ConversationArtifact
|
||||
from agent.memory_hooks import MemoryHooks
|
||||
|
||||
|
||||
@@ -185,24 +184,6 @@ class TestAgentMemory:
|
||||
doc_id = mem.write_diary()
|
||||
assert doc_id is None # MemPalace unavailable
|
||||
|
||||
def test_remember_alexander_request_response_uses_sovereign_room(self):
|
||||
mem = AgentMemory(agent_name="allegro")
|
||||
mem._available = True
|
||||
with patch("nexus.mempalace.searcher.add_memory", return_value="doc-123") as add_memory:
|
||||
doc_id = mem.remember_alexander_request_response(
|
||||
request_text="Catalog my requests.",
|
||||
response_text="I will preserve them as artifacts.",
|
||||
requester="Alexander Whitestone",
|
||||
source="telegram:timmy-time",
|
||||
)
|
||||
|
||||
assert doc_id == "doc-123"
|
||||
kwargs = add_memory.call_args.kwargs
|
||||
assert kwargs["room"] == "sovereign"
|
||||
assert kwargs["wing"] == mem.wing
|
||||
assert kwargs["extra_metadata"]["artifact_type"] == "alexander_request_response"
|
||||
assert kwargs["extra_metadata"]["speaker_tags"] == ["speaker:alexander", "speaker:allegro"]
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# MemoryHooks tests
|
||||
|
||||
@@ -1,58 +0,0 @@
|
||||
from pathlib import Path
|
||||
|
||||
import yaml
|
||||
|
||||
from nexus.mempalace.config import CORE_ROOMS
|
||||
from nexus.mempalace.conversation_artifacts import (
|
||||
ConversationArtifact,
|
||||
build_request_response_artifact,
|
||||
extract_alexander_request_pairs,
|
||||
normalize_speaker,
|
||||
)
|
||||
|
||||
|
||||
def test_sovereign_room_is_core_room() -> None:
|
||||
assert "sovereign" in CORE_ROOMS
|
||||
rooms_yaml = yaml.safe_load(Path("mempalace/rooms.yaml").read_text())
|
||||
assert any(room["key"] == "sovereign" for room in rooms_yaml["core_rooms"])
|
||||
|
||||
|
||||
def test_normalize_speaker_maps_alexander_variants() -> None:
|
||||
assert normalize_speaker("Alexander Whitestone") == "alexander"
|
||||
assert normalize_speaker("Rockachopa") == "alexander"
|
||||
assert normalize_speaker(" ALEXANDER ") == "alexander"
|
||||
assert normalize_speaker("Bezalel") == "bezalel"
|
||||
|
||||
|
||||
def test_build_request_response_artifact_creates_sovereign_metadata() -> None:
|
||||
artifact = build_request_response_artifact(
|
||||
requester="Alexander Whitestone",
|
||||
responder="Allegro",
|
||||
request_text="Please organize my conversation artifacts.",
|
||||
response_text="I will catalog them under a sovereign room.",
|
||||
source="telegram:timmy-time",
|
||||
timestamp="2026-04-16T01:30:00Z",
|
||||
)
|
||||
|
||||
assert isinstance(artifact, ConversationArtifact)
|
||||
assert artifact.room == "sovereign"
|
||||
assert artifact.metadata["speaker_tags"] == ["speaker:alexander", "speaker:allegro"]
|
||||
assert artifact.metadata["artifact_type"] == "alexander_request_response"
|
||||
assert artifact.metadata["responder"] == "allegro"
|
||||
assert "## Alexander Request" in artifact.text
|
||||
assert "## Wizard Response" in artifact.text
|
||||
|
||||
|
||||
def test_extract_alexander_request_pairs_finds_following_wizard_response() -> None:
|
||||
turns = [
|
||||
{"speaker": "Alexander Whitestone", "text": "Catalog my requests as artifacts.", "timestamp": "2026-04-16T01:00:00Z"},
|
||||
{"speaker": "Allegro", "text": "I'll build a sovereign room contract.", "timestamp": "2026-04-16T01:01:00Z"},
|
||||
{"speaker": "Alexander", "text": "Make sure my words are easy to recall.", "timestamp": "2026-04-16T01:02:00Z"},
|
||||
{"speaker": "Allegro", "text": "I will tag them with speaker metadata.", "timestamp": "2026-04-16T01:03:00Z"},
|
||||
]
|
||||
|
||||
artifacts = extract_alexander_request_pairs(turns, responder="Allegro", source="telegram")
|
||||
|
||||
assert len(artifacts) == 2
|
||||
assert artifacts[0].metadata["request_timestamp"] == "2026-04-16T01:00:00Z"
|
||||
assert artifacts[1].metadata["response_timestamp"] == "2026-04-16T01:03:00Z"
|
||||
@@ -1,25 +0,0 @@
|
||||
from pathlib import Path
|
||||
|
||||
|
||||
REPORT = Path("reports/night-shift-prediction-2026-04-12.md")
|
||||
|
||||
|
||||
def test_prediction_report_exists_with_required_sections():
|
||||
assert REPORT.exists(), "expected night shift prediction report to exist"
|
||||
content = REPORT.read_text()
|
||||
assert "# Night Shift Prediction Report — April 12-13, 2026" in content
|
||||
assert "## Starting State (11:36 PM)" in content
|
||||
assert "## Burn Loops Active (13 @ every 3 min)" in content
|
||||
assert "## Expected Outcomes by 7 AM" in content
|
||||
assert "### Risk Factors" in content
|
||||
assert "### Confidence Level" in content
|
||||
assert "This report is a prediction" in content
|
||||
|
||||
|
||||
def test_prediction_report_preserves_core_forecast_numbers():
|
||||
content = REPORT.read_text()
|
||||
assert "Total expected API calls: ~2,010" in content
|
||||
assert "Total commits pushed: ~800-1,200" in content
|
||||
assert "Total PRs created: ~150-250" in content
|
||||
assert "the-nexus | 30-50 | 200-300" in content
|
||||
assert "Generated: 2026-04-12 23:36 EDT" in content
|
||||
@@ -1,45 +0,0 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import importlib.util
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
import yaml
|
||||
|
||||
PROJECT_ROOT = Path(__file__).parent.parent
|
||||
|
||||
_spec = importlib.util.spec_from_file_location(
|
||||
"sync_branch_protection_test",
|
||||
PROJECT_ROOT / "scripts" / "sync_branch_protection.py",
|
||||
)
|
||||
_mod = importlib.util.module_from_spec(_spec)
|
||||
sys.modules["sync_branch_protection_test"] = _mod
|
||||
_spec.loader.exec_module(_mod)
|
||||
|
||||
build_branch_protection_payload = _mod.build_branch_protection_payload
|
||||
|
||||
|
||||
def test_build_branch_protection_payload_enables_rebase_before_merge():
|
||||
payload = build_branch_protection_payload(
|
||||
"main",
|
||||
{
|
||||
"required_approvals": 1,
|
||||
"dismiss_stale_approvals": True,
|
||||
"require_ci_to_merge": False,
|
||||
"block_deletions": True,
|
||||
"block_force_push": True,
|
||||
"block_on_outdated_branch": True,
|
||||
},
|
||||
)
|
||||
|
||||
assert payload["branch_name"] == "main"
|
||||
assert payload["rule_name"] == "main"
|
||||
assert payload["block_on_outdated_branch"] is True
|
||||
assert payload["required_approvals"] == 1
|
||||
assert payload["enable_status_check"] is False
|
||||
|
||||
|
||||
def test_the_nexus_branch_protection_config_requires_up_to_date_branch():
|
||||
config = yaml.safe_load((PROJECT_ROOT / ".gitea" / "branch-protection" / "the-nexus.yml").read_text())
|
||||
rules = config["rules"]
|
||||
assert rules["block_on_outdated_branch"] is True
|
||||
Reference in New Issue
Block a user