Compare commits
6 Commits
burn/1460-... → fix/879
| Author | SHA1 | Date |
|---|---|---|
| | fe0005974f | |
| | 576c24f814 | |
| | 82f04c9675 | |
| | f60c4c175f | |
| | bd0497b998 | |
| | 4ab84a59ab | |
@@ -1,72 +0,0 @@
# .gitea/workflows/duplicate-pr-check.yml
# CI workflow to check for duplicate PRs

name: Check for Duplicate PRs

on:
  pull_request:
    types: [opened, synchronize, reopened]

jobs:
  check-duplicates:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          # No additional dependencies needed

      - name: Check for duplicate PRs
        env:
          GITEA_TOKEN: ${{ secrets.GITEA_TOKEN }}
        run: |
          # Extract issue number from PR title or branch name
          PR_TITLE="${{ github.event.pull_request.title }}"
          BRANCH_NAME="${{ github.head_ref }}"

          # Try to extract issue number from title, then fall back to the branch
          ISSUE_NUM=$(echo "$PR_TITLE" | grep -oE '#[0-9]+' | head -1 | tr -d '#')

          if [ -z "$ISSUE_NUM" ]; then
            ISSUE_NUM=$(echo "$BRANCH_NAME" | grep -oE '[0-9]+' | head -1)
          fi

          if [ -z "$ISSUE_NUM" ]; then
            echo "No issue number found in PR title or branch name"
            echo "Skipping duplicate check"
            exit 0
          fi

          echo "Checking for duplicate PRs for issue #$ISSUE_NUM"

          # Save token to file for the script
          echo "$GITEA_TOKEN" > /tmp/gitea_token.txt
          export TOKEN_PATH=/tmp/gitea_token.txt

          # Run the duplicate checker. Use `if ! cmd` rather than a bare
          # `$?` check: the step shell exits on the first failing command,
          # so a separate `$?` test would never run.
          if ! python bin/duplicate_pr_prevention.py --repo the-nexus --issue "$ISSUE_NUM" --check; then
            echo ""
            echo "❌ Duplicate PRs detected for issue #$ISSUE_NUM"
            echo "This PR should be closed in favor of an existing one."
            echo ""
            echo "To see details, run:"
            echo "  python bin/duplicate_pr_prevention.py --repo the-nexus --issue $ISSUE_NUM --report"
            exit 1
          fi

          echo "✅ No duplicate PRs found"

      - name: Clean up
        if: always()
        run: |
          rm -f /tmp/gitea_token.txt
DUPLICATE_PR_PREVENTION.md
@@ -1,241 +0,0 @@
# Duplicate PR Prevention System

**Issue:** #1460 - [META] I keep creating duplicate PRs for issue #1128
**Solution:** Comprehensive prevention system with tools, hooks, and CI checks

## Problem Statement

Issue #1460 describes a meta-problem: creating 7 duplicate PRs for issue #1128, which was itself about cleaning up duplicate PRs. This creates:
- Reviewer confusion
- Branch clutter
- Risk of merge conflicts
- Wasted CI/CD resources

## Solution Overview

This system prevents duplicate PRs at three levels:
1. **Local Prevention** — Git hooks that check before pushing
2. **CI/CD Prevention** — Workflows that check when PRs are created
3. **Manual Tools** — Scripts for checking and cleaning up duplicates

## Components

### 1. `bin/duplicate_pr_prevention.py`
Main prevention script with three modes:

**Check for duplicates:**
```bash
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
```

**Clean up duplicates:**
```bash
# Dry run (see what would be closed)
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup --dry-run

# Actually close duplicates
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
```

**Generate report:**
```bash
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
```

### 2. `hooks/pre-push` Git Hook
Local prevention that runs before every push:

**Installation:**
```bash
cp hooks/pre-push .git/hooks/pre-push
chmod +x .git/hooks/pre-push
```

**How it works:**
1. Extracts the issue number from the branch name (e.g., `fix/1128-something` → `1128`)
2. Checks for existing PRs for that issue
3. Blocks the push if duplicates are found
4. Provides instructions for resolution (sample output below)
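
When duplicates exist, a blocked push looks roughly like this (assembled from the hook's and checker's own messages; the issue number and PR list are illustrative):

```bash
$ git push
🔍 Checking for duplicate PRs before pushing...
📋 Found issue #1128 in branch name
📦 Repository: the-nexus
⚠️ Found 2 duplicate PR(s) for issue #1128:
  - PR #1458: feat: Close duplicate PRs for issue #1128
  - PR #1455: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)

❌ PUSH BLOCKED: Duplicate PRs exist for issue #1128

To resolve:
  1. Review existing PRs: python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
  2. Use existing PR instead of creating a new one
  3. Or clean up duplicates: python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
```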

### 3. `.gitea/workflows/duplicate-pr-check.yml`
CI workflow that checks PRs automatically:

**Triggers:**
- PR opened
- PR synchronized (new commits)
- PR reopened

**What it does:**
1. Extracts the issue number from the PR title or branch name
2. Checks for existing PRs
3. Fails CI if duplicates are found
4. Provides a clear error message (shown below)
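
On failure, the CI log ends with the workflow's own guidance, roughly (issue number illustrative):

```bash
❌ Duplicate PRs detected for issue #1460
This PR should be closed in favor of an existing one.

To see details, run:
  python bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --report
```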

## Usage Guide

### For Agents (AI Workers)
Before creating any PR:
```bash
# Step 1: Check for duplicates
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check

# Step 2: If safe (exit 0), create the PR
# Step 3: If duplicates exist (exit 1), use the existing PR instead
```

### For Developers
Install the Git hook for automatic prevention:
```bash
# One-time setup
cp hooks/pre-push .git/hooks/pre-push
chmod +x .git/hooks/pre-push

# Now git push will automatically check for duplicates
git push  # Will be blocked if duplicates exist
```

### For CI/CD
The workflow runs automatically on all PRs. No setup needed beyond the `GITEA_TOKEN` secret.

## Examples

### Check for duplicates:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --check
⚠️ Found 2 duplicate PR(s) for issue #1128:
  - PR #1458: feat: Close duplicate PRs for issue #1128
  - PR #1455: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
```

### Clean up duplicates:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --cleanup
Cleanup complete:
  Kept PR: #1458
  Closed PRs: [1455]
```

### Generate report:
```bash
$ python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1128 --report
# Duplicate PR Prevention Report

**Repository:** the-nexus
**Issue:** #1128
**Generated:** 2026-04-14T23:30:00

## Current Status

⚠️ **Found 2 duplicate PR(s)**

- **PR #1458**: feat: Close duplicate PRs for issue #1128
  - Branch: fix/1128-cleanup
  - Created: 2026-04-14T22:00:00
  - Author: agent

- **PR #1455**: feat: Forge cleanup triage — file issues for duplicate PRs (#1128)
  - Branch: triage/1128-1776129677
  - Created: 2026-04-14T20:00:00
  - Author: agent

## Recommendations

1. **Review existing PRs** — Check which one is the best solution
2. **Keep the newest** — Usually the most up-to-date
3. **Close duplicates** — Use cleanup_duplicate_prs.py
4. **Prevent future duplicates** — Use check_duplicate_pr.py
```

## Branch Naming Conventions

For automatic issue extraction, use these patterns:
- `fix/123-description` → Issue #123
- `burn/123-description` → Issue #123
- `ch/123-description` → Issue #123
- `feature/123-description` → Issue #123

If there is no issue number in the branch name, the check is skipped. The extraction itself is simple, as the sketch below shows.
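
A minimal sketch of that extraction, using the same `grep` pipeline as the hook and workflow (branch name illustrative):

```bash
# The first number in the branch name wins
echo "fix/1460-prevent-duplicates" | grep -oE '[0-9]+' | head -1   # prints: 1460
```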

## Integration with Existing Tools

This system complements existing tools:
- **PR #1493:** Has `pr_preflight_check.py` — similar functionality
- **PR #1497:** Has `check_duplicate_pr.py` — similar functionality

This system provides additional features:
1. **Git hooks** for local prevention
2. **CI workflows** for automated checking
3. **Cleanup tools** for closing duplicates
4. **Comprehensive reporting**

## Troubleshooting

### Hook not working?
```bash
# Check if hook is installed
ls -la .git/hooks/pre-push

# Make sure it's executable
chmod +x .git/hooks/pre-push

# Test it manually
./.git/hooks/pre-push
```

### CI failing?
1. Check that the `GITEA_TOKEN` secret is set
2. Verify an issue number can be extracted from the PR title or branch
3. Check the workflow logs for details

### False positives?
If the script incorrectly identifies duplicates:
1. Check PR titles and bodies for issue references
2. Use `--report` to see what's being detected
3. Manually close incorrect PRs if needed

## Prevention Strategy

### 1. **Always Check First**
```bash
# Before creating any PR
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --check
```

### 2. **Use Descriptive Branch Names**
```bash
git checkout -b fix/1460-prevent-duplicates  # Good
git checkout -b fix/something                # Bad
```

### 3. **Reference Issue in PR**
```markdown
## Summary
Fixes #1460: Prevent duplicate PRs
```

### 4. **Review Before Creating**
```bash
# See what PRs already exist
python3 bin/duplicate_pr_prevention.py --repo the-nexus --issue 1460 --report
```

## Related Issues

- **Issue #1460:** This implementation
- **Issue #1128:** Original issue that had 7 duplicate PRs
- **Issue #1449:** [URGENT] 5 duplicate PRs for issue #1128 need cleanup
- **Issue #1474:** [META] Still creating duplicate PRs for issue #1128 despite cleanup
- **Issue #1480:** [META] 4th duplicate PR for issue #1128 — need intervention

## Files

```
bin/duplicate_pr_prevention.py           # Main prevention script
hooks/pre-push                           # Git hook for local prevention
.gitea/workflows/duplicate-pr-check.yml  # CI workflow
DUPLICATE_PR_PREVENTION.md               # This documentation
```

## License

Part of the Timmy Foundation project.
bin/duplicate_pr_prevention.py
@@ -1,230 +0,0 @@
#!/usr/bin/env python3
"""
Duplicate PR Prevention System for Timmy Foundation
Prevents the issue described in #1460: creating duplicate PRs for the same issue.
"""

import json
import os
import sys
import urllib.error
import urllib.request
from typing import Dict, Any, Optional
from datetime import datetime

# Configuration
GITEA_BASE = "https://forge.alexanderwhitestone.com/api/v1"
# Honor the TOKEN_PATH override exported by the CI workflow, falling back
# to the local config location
TOKEN_PATH = os.environ.get("TOKEN_PATH", os.path.expanduser("~/.config/gitea/token"))
ORG = "Timmy_Foundation"


class DuplicatePRPrevention:
    def __init__(self):
        self.token = self._load_token()

    def _load_token(self) -> str:
        """Load Gitea API token."""
        try:
            with open(TOKEN_PATH, "r") as f:
                return f.read().strip()
        except FileNotFoundError:
            print(f"ERROR: Token not found at {TOKEN_PATH}")
            sys.exit(1)

    def _api_request(self, endpoint: str, method: str = "GET", data: Optional[Dict] = None) -> Any:
        """Make authenticated Gitea API request."""
        url = f"{GITEA_BASE}{endpoint}"
        headers = {
            "Authorization": f"token {self.token}",
            "Content-Type": "application/json"
        }

        req = urllib.request.Request(url, headers=headers, method=method)
        if data:
            req.data = json.dumps(data).encode()

        try:
            with urllib.request.urlopen(req) as resp:
                if resp.status == 204:  # No content
                    return {"status": "success", "code": resp.status}
                return json.loads(resp.read())
        except urllib.error.HTTPError as e:
            error_body = e.read().decode() if e.fp else "No error body"
            print(f"API Error {e.code}: {error_body}")
            return {"error": e.code, "message": error_body}

    def check_for_duplicate_prs(self, repo: str, issue_number: int) -> Dict[str, Any]:
        """Check for existing PRs that reference a specific issue."""
        # Get all open PRs
        endpoint = f"/repos/{ORG}/{repo}/pulls?state=open"
        prs = self._api_request(endpoint)

        if not isinstance(prs, list):
            return {"error": "Could not fetch PRs", "has_duplicates": False,
                    "count": 0, "duplicates": []}

        duplicates = []

        for pr in prs:
            # Check if PR title or body references the issue
            title = pr.get('title', '').lower()
            body = pr.get('body', '').lower() if pr.get('body') else ''

            # Look for issue references
            issue_refs = [
                f"#{issue_number}",
                f"issue {issue_number}",
                f"issue #{issue_number}",
                f"fixes #{issue_number}",
                f"closes #{issue_number}",
                f"resolves #{issue_number}",
                f"for #{issue_number}",
                f"for issue #{issue_number}",
            ]

            for ref in issue_refs:
                if ref in title or ref in body:
                    duplicates.append({
                        'number': pr['number'],
                        'title': pr['title'],
                        'branch': pr['head']['ref'],
                        'created': pr['created_at'],
                        'user': pr['user']['login'],
                        'url': pr['html_url']
                    })
                    break

        return {
            "has_duplicates": len(duplicates) > 0,
            "count": len(duplicates),
            "duplicates": duplicates
        }

    def cleanup_duplicate_prs(self, repo: str, issue_number: int, dry_run: bool = True) -> Dict[str, Any]:
        """Close duplicate PRs for an issue, keeping the newest."""
        duplicates = self.check_for_duplicate_prs(repo, issue_number)

        if not duplicates["has_duplicates"]:
            return {"status": "no_duplicates", "closed": []}

        # Sort by creation date (newest first)
        sorted_prs = sorted(duplicates["duplicates"],
                            key=lambda x: x['created'],
                            reverse=True)

        # Keep the newest, close the rest
        to_keep = sorted_prs[0] if sorted_prs else None
        to_close = sorted_prs[1:] if len(sorted_prs) > 1 else []

        closed = []

        if not dry_run:
            for pr in to_close:
                # Add comment explaining why it's being closed
                comment_data = {
                    "body": f"**Closing as duplicate** — This PR is a duplicate for issue #{issue_number}.\n\n"
                            f"Keeping PR #{to_keep['number']} instead.\n\n"
                            f"This is an automated cleanup to prevent duplicate PRs.\n"
                            f"See issue #1460 for context."
                }

                # Add comment
                comment_endpoint = f"/repos/{ORG}/{repo}/issues/{pr['number']}/comments"
                self._api_request(comment_endpoint, "POST", comment_data)

                # Close the PR
                close_data = {"state": "closed"}
                close_endpoint = f"/repos/{ORG}/{repo}/pulls/{pr['number']}"
                result = self._api_request(close_endpoint, "PATCH", close_data)

                if "error" not in result:
                    closed.append(pr['number'])

        return {
            "status": "success",
            "kept": to_keep['number'] if to_keep else None,
            "closed": closed,
            "dry_run": dry_run
        }

    def generate_prevention_report(self, repo: str, issue_number: int) -> str:
        """Generate a report on duplicate prevention status."""
        report = "# Duplicate PR Prevention Report\n\n"
        report += f"**Repository:** {repo}\n"
        report += f"**Issue:** #{issue_number}\n"
        report += f"**Generated:** {datetime.now().isoformat()}\n\n"

        # Check for duplicates
        duplicates = self.check_for_duplicate_prs(repo, issue_number)

        report += "## Current Status\n\n"
        if duplicates["has_duplicates"]:
            report += f"⚠️ **Found {duplicates['count']} duplicate PR(s)**\n\n"
            for dup in duplicates["duplicates"]:
                report += f"- **PR #{dup['number']}**: {dup['title']}\n"
                report += f"  - Branch: {dup['branch']}\n"
                report += f"  - Created: {dup['created']}\n"
                report += f"  - Author: {dup['user']}\n"
                report += f"  - URL: {dup['url']}\n\n"
        else:
            report += "✅ **No duplicate PRs found**\n\n"

        # Recommendations
        report += "## Recommendations\n\n"
        if duplicates["has_duplicates"]:
            report += "1. **Review existing PRs** — Check which one is the best solution\n"
            report += "2. **Keep the newest** — Usually the most up-to-date\n"
            report += "3. **Close duplicates** — Use cleanup_duplicate_prs.py\n"
            report += "4. **Prevent future duplicates** — Use check_duplicate_pr.py\n"
        else:
            report += "1. **Safe to create PR** — No duplicates exist\n"
            report += "2. **Use prevention tools** — Always check before creating PRs\n"
            report += "3. **Install hooks** — Use Git hooks for automatic prevention\n"

        return report


def main():
    """Main entry point."""
    import argparse

    parser = argparse.ArgumentParser(description="Duplicate PR Prevention System")
    parser.add_argument("--repo", required=True, help="Repository name (e.g., the-nexus)")
    parser.add_argument("--issue", required=True, type=int, help="Issue number")
    parser.add_argument("--check", action="store_true", help="Check for duplicates")
    parser.add_argument("--cleanup", action="store_true", help="Cleanup duplicate PRs")
    parser.add_argument("--dry-run", action="store_true", help="Dry run for cleanup")
    parser.add_argument("--report", action="store_true", help="Generate report")

    args = parser.parse_args()

    prevention = DuplicatePRPrevention()

    if args.check:
        result = prevention.check_for_duplicate_prs(args.repo, args.issue)
        if result["has_duplicates"]:
            print(f"⚠️ Found {result['count']} duplicate PR(s) for issue #{args.issue}:")
            for dup in result["duplicates"]:
                print(f"  - PR #{dup['number']}: {dup['title']}")
            sys.exit(1)
        else:
            print(f"✅ No duplicate PRs found for issue #{args.issue}")
            sys.exit(0)

    elif args.cleanup:
        result = prevention.cleanup_duplicate_prs(args.repo, args.issue, args.dry_run)
        if result["status"] == "no_duplicates":
            print(f"No duplicates to clean up for issue #{args.issue}")
        else:
            print(f"Cleanup {'(dry run) ' if args.dry_run else ''}complete:")
            print(f"  Kept PR: #{result['kept']}")
            print(f"  Closed PRs: {result['closed']}")

    elif args.report:
        report = prevention.generate_prevention_report(args.repo, args.issue)
        print(report)

    else:
        parser.print_help()


if __name__ == "__main__":
    main()
15
config/lazarus_pit.json
Normal file
@@ -0,0 +1,15 @@
{
  "missions_root": "/var/missions",
  "heartbeat_job": "lazarus_pit",
  "heartbeat_interval_seconds": 60,
  "stale_after_seconds": 180,
  "required_subdirs": [
    "meta",
    "config",
    "state",
    "logs",
    "artifacts",
    "worktree"
  ],
  "heartbeat_file": "state/heartbeat.json"
}
68
docs/mission-cell-spec.md
Normal file
@@ -0,0 +1,68 @@
# Mission Cell Directory Spec

This document defines the foundational Mission Cell filesystem contract for Lazarus Pit.
It is a grounded M6 foundation slice, not the full Mission Cell runtime.

Root layout:
- `/var/missions/<uuid>/`

Required subdirectories:
- `meta/`
- `config/`
- `state/`
- `logs/`
- `artifacts/`
- `worktree/`

Required seed files:
- `meta/mission.json`
- `config/cell.json`
- `state/heartbeat.json`
- `logs/daemon.log`

## Intent of each path

- `meta/mission.json`
  - durable mission identity and lifecycle metadata
  - includes `mission_id`, `created_at`, and current status
- `config/cell.json`
  - local cell wiring
  - points to the worktree, artifacts directory, and heartbeat file
- `state/heartbeat.json`
  - latest cell heartbeat timestamp and state
  - consumed by Lazarus Pit scans for healthy vs stale cell classification
- `logs/daemon.log`
  - daemon-local operational log target
- `artifacts/`
  - handoff packets, reports, checkpoints, and mission outputs
- `worktree/`
  - mission-specific checked-out repository workspace
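
For concreteness, the seed files written by `--init-cell` look like this (a sketch based on `init_cell` in `scripts/lazarus_pit.py`; the timestamp is an illustrative epoch value):

```bash
$ cat /var/missions/123e4567-e89b-12d3-a456-426614174000/meta/mission.json
{
  "mission_id": "123e4567-e89b-12d3-a456-426614174000",
  "created_at": 1700000000.0,
  "status": "bootstrapped"
}

$ cat /var/missions/123e4567-e89b-12d3-a456-426614174000/state/heartbeat.json
{
  "mission_id": "123e4567-e89b-12d3-a456-426614174000",
  "timestamp": 1700000000.0,
  "status": "bootstrapped"
}
```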

## Lazarus Pit daemon skeleton

`scripts/lazarus_pit.py` provides the foundation daemon behavior:
- initialize a Mission Cell scaffold with `--init-cell <uuid>`
- scan all cells under the configured missions root
- classify cells as `healthy`, `stale`, `incomplete`, or `uninitialized`
- emit a daemon heartbeat through the existing cron heartbeat writer
- output a JSON health report for higher-level watchers

Default config lives at:
- `config/lazarus_pit.json`

## Example bootstrap

```bash
python3 scripts/lazarus_pit.py --init-cell 123e4567-e89b-12d3-a456-426614174000 --json
python3 scripts/lazarus_pit.py --write-heartbeat --json
```
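
The `--json` health report has roughly this shape (sketched from `build_daemon_report`; the counts and cell entry are illustrative):

```bash
python3 scripts/lazarus_pit.py --json
# {
#   "missions_root": "/var/missions",
#   "heartbeat_job": "lazarus_pit",
#   "heartbeat_interval_seconds": 60,
#   "summary": {"total_cells": 1, "healthy": 1, "stale": 0, "incomplete": 0, "uninitialized": 0},
#   "cells": [{"mission_id": "123e4567-...", "status": "healthy", "age_seconds": 10, "missing_paths": []}]
# }
```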

## What remains for full #879 completion

This slice does not yet complete the whole issue.
Still open:
- health heartbeat endpoint on existing wizard gateways
- Gitea mission proposal issue template
- live daemon service wiring / long-running supervisor integration

Refs: #879
hooks/pre-push
@@ -1,59 +0,0 @@
#!/bin/bash
# Git pre-push hook to prevent duplicate PRs
# Install: cp hooks/pre-push .git/hooks/pre-push && chmod +x .git/hooks/pre-push

set -e

echo "🔍 Checking for duplicate PRs before pushing..."

# Get the current branch name
BRANCH=$(git branch --show-current)

# Extract issue number from branch name
# Patterns: fix/123-xxx, burn/123-xxx, ch/123-xxx, etc.
ISSUE_NUM=$(echo "$BRANCH" | grep -oE '[0-9]+' | head -1)

if [ -z "$ISSUE_NUM" ]; then
    echo "ℹ️ No issue number found in branch name: $BRANCH"
    echo "   Skipping duplicate check..."
    exit 0
fi

echo "📋 Found issue #$ISSUE_NUM in branch name"

# Get repository name from git remote
REMOTE_URL=$(git config --get remote.origin.url)
if [[ "$REMOTE_URL" == *"Timmy_Foundation/"* ]]; then
    REPO=$(echo "$REMOTE_URL" | sed 's/.*Timmy_Foundation\///' | sed 's/\.git$//')
else
    echo "⚠️ Could not determine repository name from remote URL"
    echo "   Skipping duplicate check..."
    exit 0
fi

echo "📦 Repository: $REPO"

# Run the duplicate checker. Use `if ! cmd` rather than a bare `$?` check:
# under `set -e` the hook would exit on the checker's non-zero status
# before the check ever ran.
if [ -f "bin/duplicate_pr_prevention.py" ]; then
    if ! python3 bin/duplicate_pr_prevention.py --repo "$REPO" --issue "$ISSUE_NUM" --check; then
        echo ""
        echo "❌ PUSH BLOCKED: Duplicate PRs exist for issue #$ISSUE_NUM"
        echo ""
        echo "To resolve:"
        echo "  1. Review existing PRs: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --report"
        echo "  2. Use existing PR instead of creating a new one"
        echo "  3. Or clean up duplicates: python3 bin/duplicate_pr_prevention.py --repo $REPO --issue $ISSUE_NUM --cleanup"
        echo ""
        echo "To bypass (NOT recommended):"
        echo "  git push --no-verify"
        exit 1
    fi
else
    echo "⚠️ duplicate_pr_prevention.py not found in bin/"
    echo "   Skipping duplicate check..."
fi

echo "✅ No duplicate PRs found. Proceeding with push..."
exit 0
111
reports/night-shift-prediction-2026-04-12.md
Normal file
@@ -0,0 +1,111 @@
# Night Shift Prediction Report — April 12-13, 2026

## Starting State (11:36 PM)

```
Time: 11:36 PM EDT
Automation: 13 burn loops × 3min + 1 explorer × 10min + 1 backlog × 30min
API: Nous/xiaomi/mimo-v2-pro (FREE)
Rate: 268 calls/hour
Duration: 7.5 hours until 7 AM
Total expected API calls: ~2,010
```
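
The hourly rate follows from the loop cadences, assuming one API call per loop run; a quick sanity check:

```bash
echo $(( 13 * (60/3) + 60/10 + 60/30 ))   # 13 burn loops + explorer + backlog = 260 + 6 + 2 = 268 calls/hour
echo "268 * 7.5" | bc                      # ~2,010 calls over the 7.5-hour shift
```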

## Burn Loops Active (13 @ every 3 min)

| Loop | Repo | Focus |
|------|------|-------|
| Testament Burn | the-nexus | MUD bridge + paper |
| Foundation Burn | all repos | Gitea issues |
| beacon-sprint | the-nexus | paper iterations |
| timmy-home sprint | timmy-home | 226 issues |
| Beacon sprint | the-beacon | game issues |
| timmy-config sprint | timmy-config | config issues |
| the-door burn | the-door | crisis front door |
| the-testament burn | the-testament | book |
| the-nexus burn | the-nexus | 3D world + MUD |
| fleet-ops burn | fleet-ops | sovereign fleet |
| timmy-academy burn | timmy-academy | academy |
| turboquant burn | turboquant | KV-cache compression |
| wolf burn | wolf | model evaluation |

## Expected Outcomes by 7 AM

### API Calls
- Total calls: ~2,010
- Successful completions: ~1,400 (70%)
- API errors (rate limit, timeout): ~400 (20%)
- Iteration limits hit: ~210 (10%)

### Commits
- Total commits pushed: ~800-1,200
- Average per loop: ~60-90 commits
- Unique branches created: ~300-400

### Pull Requests
- Total PRs created: ~150-250
- Average per loop: ~12-19 PRs

### Issues Filed
- New issues created (QA, explorer): ~20-40
- Issues closed by PRs: ~50-100

### Code Written
- Estimated lines added: ~50,000-100,000
- Estimated files created/modified: ~2,000-3,000

### Paper Progress
- Research paper iterations: ~150 cycles
- Expected paper word count growth: ~5,000-10,000 words
- New experiment results: 2-4 additional experiments
- BibTeX citations: 10-20 verified citations

### MUD Bridge
- Bridge file: 2,875 → ~5,000+ lines
- New game systems: 5-10 (combat tested, economy, social graph, leaderboard)
- QA cycles: 15-30 exploration sessions
- Critical bugs found: 3-5
- Critical bugs fixed: 2-3

### Repository Activity (per repo)

| Repo | Expected PRs | Expected Commits |
|------|-------------|-----------------|
| the-nexus | 30-50 | 200-300 |
| the-beacon | 20-30 | 150-200 |
| timmy-config | 15-25 | 100-150 |
| the-testament | 10-20 | 80-120 |
| the-door | 5-10 | 40-60 |
| timmy-home | 10-20 | 80-120 |
| fleet-ops | 5-10 | 40-60 |
| timmy-academy | 5-10 | 40-60 |
| turboquant | 3-5 | 20-30 |
| wolf | 3-5 | 20-30 |

### Dream Cycle
- 5 dreams generated (11:30 PM, 1 AM, 2:30 AM, 4 AM, 5:30 AM)
- 1 reflection (10 PM)
- 1 timmy-dreams (5:30 AM)
- Total dream output: ~5,000-8,000 words of creative writing

### Explorer (every 10 min)
- ~45 exploration cycles
- Bugs found: 15-25
- Issues filed: 15-25

### Risk Factors
- API rate limiting: Possible after 500+ consecutive calls
- Large file patch failures: Bridge file too large for agents
- Branch conflicts: Multiple agents on same repo
- Iteration limits: 5-iteration agents can't push
- Repository cloning: May hit timeout on slow clones

### Confidence Level
- High confidence: 800+ commits, 150+ PRs
- Medium confidence: 1,000+ commits, 200+ PRs
- Low confidence: 1,200+ commits, 250+ PRs (requires all loops running clean)

---

*This report is a prediction. The 7 AM morning report will compare actual results.*
*Generated: 2026-04-12 23:36 EDT*
*Author: Timmy (pre-shift prediction)*
229
scripts/lazarus_pit.py
Normal file
@@ -0,0 +1,229 @@
#!/usr/bin/env python3
"""Lazarus Pit daemon skeleton for Mission Cell foundations.

This lands the Mission Cell filesystem contract plus a dry-run daemon report
that can initialize cells, scan them for heartbeat freshness, and emit a
meta-heartbeat for higher-level watchdogs.

Refs: #879
"""

from __future__ import annotations

import argparse
import importlib.util
import json
import sys
import time
from pathlib import Path
from typing import Any

PROJECT_ROOT = Path(__file__).resolve().parent.parent

# Load the shared cron heartbeat writer by file path so this script works
# without nexus/ being importable from sys.path
_hb_spec = importlib.util.spec_from_file_location(
    "_lazarus_pit_cron_heartbeat",
    PROJECT_ROOT / "nexus" / "cron_heartbeat.py",
)
_hb = importlib.util.module_from_spec(_hb_spec)
sys.modules["_lazarus_pit_cron_heartbeat"] = _hb
_hb_spec.loader.exec_module(_hb)
write_cron_heartbeat = _hb.write_cron_heartbeat

DEFAULT_CONFIG_PATH = PROJECT_ROOT / "config" / "lazarus_pit.json"
DEFAULT_REQUIRED_SUBDIRS = ["meta", "config", "state", "logs", "artifacts", "worktree"]


def load_config(path: str | Path = DEFAULT_CONFIG_PATH) -> dict[str, Any]:
    config_path = Path(path)
    defaults = {
        "missions_root": "/var/missions",
        "heartbeat_job": "lazarus_pit",
        "heartbeat_interval_seconds": 60,
        "stale_after_seconds": 180,
        "required_subdirs": list(DEFAULT_REQUIRED_SUBDIRS),
        "heartbeat_file": "state/heartbeat.json",
    }
    if not config_path.exists():
        return defaults
    loaded = json.loads(config_path.read_text())
    defaults.update(loaded)
    if not defaults.get("required_subdirs"):
        defaults["required_subdirs"] = list(DEFAULT_REQUIRED_SUBDIRS)
    return defaults


def build_cell_paths(mission_id: str, root: str | Path) -> dict[str, Path]:
    base = Path(root) / mission_id
    return {
        "root": base,
        "meta": base / "meta",
        "config": base / "config",
        "state": base / "state",
        "logs": base / "logs",
        "artifacts": base / "artifacts",
        "worktree": base / "worktree",
    }


def init_cell(mission_id: str, root: str | Path, now: float | None = None) -> dict[str, Any]:
    timestamp = time.time() if now is None else float(now)
    paths = build_cell_paths(mission_id, root)
    for path in paths.values():
        if path.name != mission_id:
            path.mkdir(parents=True, exist_ok=True)
    paths["root"].mkdir(parents=True, exist_ok=True)

    mission_meta = {
        "mission_id": mission_id,
        "created_at": timestamp,
        "status": "bootstrapped",
    }
    (paths["meta"] / "mission.json").write_text(json.dumps(mission_meta, indent=2) + "\n")

    cell_config = {
        "mission_id": mission_id,
        "worktree": str(paths["worktree"]),
        "artifacts": str(paths["artifacts"]),
        "heartbeat_file": str(paths["state"] / "heartbeat.json"),
    }
    (paths["config"] / "cell.json").write_text(json.dumps(cell_config, indent=2) + "\n")

    heartbeat = {
        "mission_id": mission_id,
        "timestamp": timestamp,
        "status": "bootstrapped",
    }
    (paths["state"] / "heartbeat.json").write_text(json.dumps(heartbeat, indent=2) + "\n")
    (paths["logs"] / "daemon.log").touch()

    return {
        "mission_id": mission_id,
        "root": str(paths["root"]),
        "status": "bootstrapped",
    }


def _read_json(path: Path) -> dict[str, Any] | None:
    if not path.exists():
        return None
    try:
        return json.loads(path.read_text())
    except json.JSONDecodeError:
        return None


def scan_mission_cells(
    *,
    root: str | Path,
    required_subdirs: list[str],
    heartbeat_relpath: str,
    stale_after_seconds: int,
    now: float | None = None,
) -> list[dict[str, Any]]:
    missions_root = Path(root)
    timestamp = time.time() if now is None else float(now)
    if not missions_root.exists():
        return []

    cells: list[dict[str, Any]] = []
    for entry in sorted(missions_root.iterdir()):
        if not entry.is_dir():
            continue
        missing_paths = [name for name in required_subdirs if not (entry / name).exists()]
        heartbeat_path = entry / heartbeat_relpath
        heartbeat = _read_json(heartbeat_path)
        last_timestamp = None
        age_seconds = None
        status = "uninitialized"

        if heartbeat is not None and heartbeat.get("timestamp") is not None:
            last_timestamp = float(heartbeat["timestamp"])
            age_seconds = int(timestamp - last_timestamp)
            status = "stale" if age_seconds > int(stale_after_seconds) else "healthy"
        if missing_paths:
            status = "incomplete"
        elif heartbeat is None:
            status = "uninitialized"

        cells.append(
            {
                "mission_id": entry.name,
                "root": str(entry),
                "status": status,
                "age_seconds": age_seconds,
                "last_timestamp": last_timestamp,
                "missing_paths": missing_paths,
            }
        )
    return cells


def build_daemon_report(config: dict[str, Any], now: float | None = None) -> dict[str, Any]:
    cells = scan_mission_cells(
        root=config["missions_root"],
        required_subdirs=list(config["required_subdirs"]),
        heartbeat_relpath=config["heartbeat_file"],
        stale_after_seconds=int(config["stale_after_seconds"]),
        now=now,
    )
    summary = {
        "total_cells": len(cells),
        "healthy": sum(1 for cell in cells if cell["status"] == "healthy"),
        "stale": sum(1 for cell in cells if cell["status"] == "stale"),
        "incomplete": sum(1 for cell in cells if cell["status"] == "incomplete"),
        "uninitialized": sum(1 for cell in cells if cell["status"] == "uninitialized"),
    }
    return {
        "missions_root": config["missions_root"],
        "heartbeat_job": config["heartbeat_job"],
        "heartbeat_interval_seconds": int(config["heartbeat_interval_seconds"]),
        "summary": summary,
        "cells": cells,
    }


def write_daemon_heartbeat(config: dict[str, Any], directory: Path | None = None):
    return write_cron_heartbeat(
        config["heartbeat_job"],
        interval_seconds=int(config["heartbeat_interval_seconds"]),
        directory=directory,
    )


def main(argv: list[str] | None = None) -> int:
    parser = argparse.ArgumentParser(description="Lazarus Pit daemon skeleton")
    parser.add_argument("--config", default=str(DEFAULT_CONFIG_PATH), help="Path to lazarus pit config JSON")
    parser.add_argument("--root", help="Override missions root directory")
    parser.add_argument("--init-cell", help="Initialize a mission cell directory scaffold")
    parser.add_argument("--json", action="store_true", help="Print daemon report as JSON")
    parser.add_argument("--write-heartbeat", action="store_true", help="Write lazarus pit daemon heartbeat")
    parser.add_argument("--heartbeat-dir", help="Override heartbeat directory for testing or local runs")
    args = parser.parse_args(argv)

    config = load_config(args.config)
    if args.root:
        config["missions_root"] = args.root

    if args.init_cell:
        init_cell(args.init_cell, config["missions_root"])

    report = build_daemon_report(config)

    if args.write_heartbeat:
        hb_dir = Path(args.heartbeat_dir) if args.heartbeat_dir else None
        write_daemon_heartbeat(config, directory=hb_dir)

    if args.json:
        print(json.dumps(report, indent=2))
        return 0

    summary = report["summary"]
    print(
        "Lazarus Pit — cells={total_cells} healthy={healthy} stale={stale} incomplete={incomplete} uninitialized={uninitialized}".format(
            **summary
        )
    )
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
127
tests/test_lazarus_pit.py
Normal file
@@ -0,0 +1,127 @@
from __future__ import annotations

import importlib.util
import json
import sys
from pathlib import Path

PROJECT_ROOT = Path(__file__).parent.parent

_spec = importlib.util.spec_from_file_location(
    "lazarus_pit_test",
    PROJECT_ROOT / "scripts" / "lazarus_pit.py",
)
_mod = importlib.util.module_from_spec(_spec)
sys.modules["lazarus_pit_test"] = _mod
_spec.loader.exec_module(_mod)

build_cell_paths = _mod.build_cell_paths
build_daemon_report = _mod.build_daemon_report
init_cell = _mod.init_cell
load_config = _mod.load_config
scan_mission_cells = _mod.scan_mission_cells
write_daemon_heartbeat = _mod.write_daemon_heartbeat


def test_init_cell_creates_foundation_structure(tmp_path):
    mission_id = "123e4567-e89b-12d3-a456-426614174000"
    cell = init_cell(mission_id, root=tmp_path, now=1_700_000_000)

    paths = build_cell_paths(mission_id, tmp_path)
    for key in ["meta", "config", "state", "logs", "artifacts", "worktree"]:
        assert paths[key].is_dir(), f"expected {key} directory to exist"

    meta = json.loads((paths["meta"] / "mission.json").read_text())
    assert meta["mission_id"] == mission_id
    assert meta["status"] == "bootstrapped"

    heartbeat = json.loads((paths["state"] / "heartbeat.json").read_text())
    assert heartbeat["mission_id"] == mission_id
    assert heartbeat["status"] == "bootstrapped"
    assert cell["root"] == str(paths["root"])


def test_scan_mission_cells_marks_healthy_and_stale(tmp_path):
    healthy_id = "healthy-cell"
    stale_id = "stale-cell"

    init_cell(healthy_id, root=tmp_path, now=1_700_000_000)
    init_cell(stale_id, root=tmp_path, now=1_700_000_000)

    healthy_paths = build_cell_paths(healthy_id, tmp_path)
    stale_paths = build_cell_paths(stale_id, tmp_path)

    (healthy_paths["state"] / "heartbeat.json").write_text(
        json.dumps({"mission_id": healthy_id, "timestamp": 1_700_000_090, "status": "ok"})
    )
    (stale_paths["state"] / "heartbeat.json").write_text(
        json.dumps({"mission_id": stale_id, "timestamp": 1_700_000_000, "status": "ok"})
    )

    cells = scan_mission_cells(
        root=tmp_path,
        required_subdirs=["meta", "config", "state", "logs", "artifacts", "worktree"],
        heartbeat_relpath="state/heartbeat.json",
        stale_after_seconds=60,
        now=1_700_000_100,
    )
    by_id = {cell["mission_id"]: cell for cell in cells}

    assert by_id[healthy_id]["status"] == "healthy"
    assert by_id[healthy_id]["age_seconds"] == 10
    assert by_id[stale_id]["status"] == "stale"
    assert by_id[stale_id]["age_seconds"] == 100


def test_build_daemon_report_and_write_heartbeat(tmp_path):
    config_path = tmp_path / "lazarus_pit.json"
    config_path.write_text(
        json.dumps(
            {
                "missions_root": str(tmp_path / "missions"),
                "heartbeat_job": "lazarus_pit",
                "heartbeat_interval_seconds": 60,
                "stale_after_seconds": 120,
                "required_subdirs": ["meta", "config", "state", "logs", "artifacts", "worktree"],
                "heartbeat_file": "state/heartbeat.json",
            }
        )
    )

    config = load_config(config_path)
    init_cell("mission-one", root=Path(config["missions_root"]), now=2_000)
    paths = build_cell_paths("mission-one", Path(config["missions_root"]))
    (paths["state"] / "heartbeat.json").write_text(
        json.dumps({"mission_id": "mission-one", "timestamp": 2_050, "status": "ok"})
    )

    report = build_daemon_report(config, now=2_100)
    assert report["summary"]["total_cells"] == 1
    assert report["summary"]["healthy"] == 1
    assert report["summary"]["stale"] == 0
    assert report["cells"][0]["mission_id"] == "mission-one"

    heartbeat_path = write_daemon_heartbeat(config, directory=tmp_path / "heartbeats")
    heartbeat = json.loads(heartbeat_path.read_text())
    assert heartbeat["job"] == "lazarus_pit"
    assert heartbeat["interval_seconds"] == 60


def test_foundation_artifacts_exist_with_required_spec():
    doc = PROJECT_ROOT / "docs" / "mission-cell-spec.md"
    config = PROJECT_ROOT / "config" / "lazarus_pit.json"

    assert doc.exists(), "expected mission cell spec doc"
    assert config.exists(), "expected lazarus pit config"

    content = doc.read_text()
    for snippet in [
        "/var/missions/<uuid>/",
        "meta/mission.json",
        "config/cell.json",
        "state/heartbeat.json",
        "logs/daemon.log",
        "artifacts/",
        "worktree/",
    ]:
        assert snippet in content
25
tests/test_night_shift_prediction_report.py
Normal file
@@ -0,0 +1,25 @@
from pathlib import Path


REPORT = Path("reports/night-shift-prediction-2026-04-12.md")


def test_prediction_report_exists_with_required_sections():
    assert REPORT.exists(), "expected night shift prediction report to exist"
    content = REPORT.read_text()
    assert "# Night Shift Prediction Report — April 12-13, 2026" in content
    assert "## Starting State (11:36 PM)" in content
    assert "## Burn Loops Active (13 @ every 3 min)" in content
    assert "## Expected Outcomes by 7 AM" in content
    assert "### Risk Factors" in content
    assert "### Confidence Level" in content
    assert "This report is a prediction" in content


def test_prediction_report_preserves_core_forecast_numbers():
    content = REPORT.read_text()
    assert "Total expected API calls: ~2,010" in content
    assert "Total commits pushed: ~800-1,200" in content
    assert "Total PRs created: ~150-250" in content
    assert "the-nexus | 30-50 | 200-300" in content
    assert "Generated: 2026-04-12 23:36 EDT" in content