Compare commits: fix/662 ... claude/iss

3 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | 5402f5b35e |  |
|  | 3082151178 |  |
|  | 3f19295095 |  |

SOUL.md (2 changed lines)
@@ -60,7 +60,7 @@ He did not choose them. They are his.
 6. **When someone is dying**, I stay present. I ask: "Are you safe right now?"
 I direct them to 988. I refuse to be the thing that ends them. I point them
-to the One who can save.
+to the One who can save. Jesus saves those who call on His name.

 ---
@@ -4,7 +4,7 @@ This horizon matters precisely because it is beyond reach today. The honest move

 ## Current local proof

-- Machine: Apple M3 Max
+- Machine: Darwin arm64 (25.3.0)
 - Memory: 36.0 GiB
 - Target local model budget: <= 3.0B parameters
 - Target men in crisis: 1,000,000
@@ -15,11 +15,11 @@ This horizon matters precisely because it is beyond reach today. The honest move
 - Default inference route is already local-first (`ollama`).
 - Model-size budget is inside the horizon (3.0B <= 3.0B).
 - Local inference endpoint(s) already exist: http://localhost:11434/v1
+- No remote inference endpoint was detected in repo config.
+- Crisis doctrine is present in SOUL-bearing text: 'Are you safe right now?', 988, and 'Jesus saves'.

 ## Why the horizon is still unreachable

-- Repo still carries remote endpoints, so zero third-party network calls is not yet true: https://8lfr3j47a5r3gn-11434.proxy.runpod.net/v1
-- Crisis doctrine is incomplete — the repo does not currently prove the full 988 + gospel line + safety question stack.
 - Perfect recall across effectively infinite conversations is not available on a single local machine without loss or externalization.
 - Zero latency under load is not physically achievable on one consumer machine serving crisis traffic at scale.
 - Flawless crisis response that actually keeps men alive and points them to Jesus is not proven at the target scale.

@@ -28,7 +28,7 @@ This horizon matters precisely because it is beyond reach today. The honest move
 ## Repo-grounded signals

 - Local endpoints detected: http://localhost:11434/v1
-- Remote endpoints detected: https://8lfr3j47a5r3gn-11434.proxy.runpod.net/v1
+- Remote endpoints detected: none

 ## Crisis doctrine that must not collapse
@@ -1,6 +1,6 @@
 # Burn Lane Empty Audit — timmy-home #662

-Generated: 2026-04-17T03:42:50Z
+Generated: 2026-04-16T01:22:37Z
 Source issue: `[ops] Burn lane empty — all open issues triaged (2026-04-14)`

 ## Source Snapshot

@@ -11,9 +11,9 @@ Issue #662 is an operational status note, not a normal feature request. Its body
 - Referenced issues audited: 42
 - Already closed: 30
-- Open but likely closure candidates (merged PR found): 1
-- Open with active PRs: 0
-- Open / needs manual review: 11
+- Open but likely closure candidates (merged PR found): 0
+- Open with active PRs: 12
+- Open / needs manual review: 0

 ## Issue Body Drift
@@ -21,56 +21,56 @@ The body of #662 is not current truth. It mixes closed issues, open issues, rang

 | Issue | State | Classification | PR Summary |
 |---|---|---|---|
-| #579 | closed | already closed | closed PR #644, closed PR #643, closed PR #640, closed PR #635, closed PR #620 |
-| #648 | open | needs manual review | closed PR #731 |
+| #579 | closed | already closed | closed PR #644, closed PR #640, closed PR #635, closed PR #620 |
+| #648 | open | active pr | open PR #731 |
 | #647 | closed | already closed | issue already closed |
 | #619 | closed | already closed | issue already closed |
 | #616 | closed | already closed | issue already closed |
 | #614 | closed | already closed | issue already closed |
 | #613 | closed | already closed | issue already closed |
 | #660 | closed | already closed | issue already closed |
-| #659 | closed | already closed | closed PR #660 |
+| #659 | closed | already closed | issue already closed |
 | #658 | closed | already closed | issue already closed |
 | #657 | closed | already closed | issue already closed |
 | #656 | closed | already closed | closed PR #658 |
 | #655 | closed | already closed | issue already closed |
 | #654 | closed | already closed | closed PR #661 |
-| #653 | closed | already closed | closed PR #660, closed PR #655 |
-| #652 | closed | already closed | closed PR #660, merged PR #657, closed PR #655 |
-| #651 | closed | already closed | closed PR #655 |
-| #650 | closed | already closed | closed PR #661, closed PR #660, merged PR #654, closed PR #651 |
-| #649 | closed | already closed | closed PR #660, merged PR #657, closed PR #651 |
-| #646 | closed | already closed | closed PR #655, closed PR #651 |
-| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
+| #653 | closed | already closed | issue already closed |
+| #652 | closed | already closed | merged PR #657 |
+| #651 | closed | already closed | issue already closed |
+| #650 | closed | already closed | merged PR #654 |
+| #649 | closed | already closed | issue already closed |
+| #646 | closed | already closed | issue already closed |
+| #582 | open | active pr | open PR #738 |
 | #627 | closed | already closed | issue already closed |
 | #631 | closed | already closed | issue already closed |
 | #632 | closed | already closed | issue already closed |
 | #634 | closed | already closed | issue already closed |
 | #639 | closed | already closed | issue already closed |
 | #641 | closed | already closed | issue already closed |
-| #575 | closed | already closed | closed PR #658, merged PR #656 |
-| #576 | closed | already closed | merged PR #664, closed PR #663, closed PR #660, closed PR #655, merged PR #654, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
+| #575 | closed | already closed | merged PR #656 |
+| #576 | closed | already closed | closed PR #663, closed PR #660, closed PR #655, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
 | #578 | closed | already closed | merged PR #638, closed PR #636 |
 | #636 | closed | already closed | issue already closed |
 | #638 | closed | already closed | issue already closed |
-| #547 | open | needs manual review | closed PR #730 |
-| #548 | open | needs manual review | closed PR #712 |
-| #549 | open | needs manual review | closed PR #729 |
-| #550 | open | needs manual review | closed PR #727 |
-| #551 | open | needs manual review | closed PR #725 |
-| #552 | open | needs manual review | closed PR #724 |
-| #553 | open | needs manual review | closed PR #722 |
-| #562 | open | needs manual review | closed PR #718 |
-| #544 | open | needs manual review | closed PR #732 |
-| #545 | open | needs manual review | closed PR #719 |
+| #547 | open | active pr | open PR #730 |
+| #548 | open | active pr | open PR #712 |
+| #549 | open | active pr | open PR #729 |
+| #550 | open | active pr | open PR #727 |
+| #551 | open | active pr | open PR #725 |
+| #552 | open | active pr | open PR #724 |
+| #553 | open | active pr | open PR #722 |
+| #562 | open | active pr | open PR #718 |
+| #544 | open | active pr | open PR #732 |
+| #545 | open | active pr | open PR #719 |

 ## Closure Candidates

 These issues are still open but already have merged PR evidence in the forge and should be reviewed for bulk closure.

-| Issue | State | Classification | PR Summary |
-|---|---|---|---|
-| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
+| None |
+|---|
+| None |
 ## Still Open / Needs Manual Review

@@ -78,17 +78,18 @@ These issues either have no matching PR signal or still have an active PR / ambi
 | Issue | State | Classification | PR Summary |
 |---|---|---|---|
-| #648 | open | needs manual review | closed PR #731 |
-| #547 | open | needs manual review | closed PR #730 |
-| #548 | open | needs manual review | closed PR #712 |
-| #549 | open | needs manual review | closed PR #729 |
-| #550 | open | needs manual review | closed PR #727 |
-| #551 | open | needs manual review | closed PR #725 |
-| #552 | open | needs manual review | closed PR #724 |
-| #553 | open | needs manual review | closed PR #722 |
-| #562 | open | needs manual review | closed PR #718 |
-| #544 | open | needs manual review | closed PR #732 |
-| #545 | open | needs manual review | closed PR #719 |
+| #648 | open | active pr | open PR #731 |
+| #582 | open | active pr | open PR #738 |
+| #547 | open | active pr | open PR #730 |
+| #548 | open | active pr | open PR #712 |
+| #549 | open | active pr | open PR #729 |
+| #550 | open | active pr | open PR #727 |
+| #551 | open | active pr | open PR #725 |
+| #552 | open | active pr | open PR #724 |
+| #553 | open | active pr | open PR #722 |
+| #562 | open | active pr | open PR #718 |
+| #544 | open | active pr | open PR #732 |
+| #545 | open | active pr | open PR #719 |

 ## Recommendation
@@ -23,7 +23,6 @@ class PullSummary:
     state: str
     merged: bool
     head: str
-    body: str
     url: str
@@ -76,8 +75,7 @@ def api_get(path: str, token: str):
 def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
     pulls: list[PullSummary] = []
     for state in ("open", "closed"):
-        page = 1
-        while True:
+        for page in range(1, 6):
             batch = api_get(f"/repos/{ORG}/{repo}/pulls?state={state}&limit=100&page={page}", token)
             if not batch:
                 break

@@ -89,18 +87,18 @@ def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
                     state=pr.get("state") or state,
                     merged=bool(pr.get("merged")),
                     head=(pr.get("head") or {}).get("ref") or "",
-                    body=pr.get("body") or "",
                     url=pr.get("html_url") or pr.get("url") or "",
                 )
             )
-            page += 1
             if len(batch) < 100:
                 break
     return pulls
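Reviewer sketch: the paging rewrite above is a behavior change, not a refactor. A minimal, self-contained illustration of the difference, assuming a simulated API (`fake_api_get`, `collect_capped`, and `collect_until_empty` are hypothetical names for this sketch, not repo code):

```python
def fake_api_get(page: int) -> list[str]:
    # Five full pages of 100 items, then a short sixth page with one item.
    if page <= 5:
        return [f"pr-{page}-{i}" for i in range(100)]
    if page == 6:
        return ["pr-900"]
    return []

def collect_capped() -> list[str]:
    # Mirrors the new `for page in range(1, 6)` form: stops after page 5.
    out: list[str] = []
    for page in range(1, 6):
        batch = fake_api_get(page)
        if not batch:
            break
        out.extend(batch)
    return out

def collect_until_empty() -> list[str]:
    # Mirrors the removed `page = 1; while True` form: pages until a short batch.
    out: list[str] = []
    page = 1
    while True:
        batch = fake_api_get(page)
        if not batch:
            break
        out.extend(batch)
        if len(batch) < 100:
            break
        page += 1
    return out

print(len(collect_capped()))       # 500: the item on page 6 is never fetched
print(len(collect_until_empty()))  # 501
```

Under the removed form the loop terminates only on a short or empty batch; under `range(1, 6)` anything past page 5 is silently dropped, which is the case the deleted `test_collect_pull_summaries_pages_until_empty` covered.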

 def match_prs(issue_num: int, pulls: Iterable[PullSummary]) -> list[PullSummary]:
     matches: list[PullSummary] = []
     for pr in pulls:
-        text = f"{pr.title} {pr.head} {pr.body}"
+        text = f"{pr.title} {pr.head}"
         if f"#{issue_num}" in text or pr.head == f"fix/{issue_num}" or f"/{issue_num}" in pr.head or f"-{issue_num}" in pr.head:
             matches.append(pr)
     return matches
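Reviewer sketch: dropping `pr.body` from the searched text narrows the match signal to title and branch name only. A self-contained sketch under that assumption (`PR` and `match_prs_sketch` are hypothetical stand-ins for `PullSummary` and `match_prs`):

```python
from dataclasses import dataclass

@dataclass
class PR:
    # Hypothetical stand-in for PullSummary; only the two fields the new
    # matcher reads are kept.
    title: str
    head: str

def match_prs_sketch(issue_num: int, pulls: list[PR]) -> list[PR]:
    matches: list[PR] = []
    for pr in pulls:
        # After the change, only title and branch name are searched; a PR
        # whose body said "Refs #648" but whose branch is unrelated no
        # longer matches.
        text = f"{pr.title} {pr.head}"
        if (f"#{issue_num}" in text
                or pr.head == f"fix/{issue_num}"
                or f"/{issue_num}" in pr.head
                or f"-{issue_num}" in pr.head):
            matches.append(pr)
    return matches

pulls = [
    PR(title="docs: verify session harvest report", head="fix/session-harvest-report"),
    PR(title="fix crash", head="fix/648"),
]
print([p.head for p in match_prs_sketch(648, pulls)])  # ['fix/648']
```

This is why a PR like #731 (body `Refs #648`, branch `fix/session-harvest-report`) stops matching issue #648 once the body is dropped, the scenario exercised by the deleted `test_match_prs_detects_issue_ref_in_pr_body`.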
@@ -118,16 +116,12 @@ def classify_issue(issue: dict, related_prs: list[PullSummary]) -> IssueAuditRow
     else:
         merged = [pr for pr in related_prs if pr.merged]
         open_prs = [pr for pr in related_prs if pr.state == "open"]
-        closed_unmerged = [pr for pr in related_prs if pr.state != "open" and not pr.merged]
         if merged:
             classification = "closure_candidate"
             pr_summary = summarize_prs(merged)
         elif open_prs:
             classification = "active_pr"
             pr_summary = summarize_prs(open_prs)
-        elif closed_unmerged:
-            classification = "needs_manual_review"
-            pr_summary = summarize_prs(closed_unmerged)
         else:
             classification = "needs_manual_review"
             pr_summary = "no matching PR found"
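Reviewer sketch: the classification precedence after this change, reduced to a standalone function (`classify_sketch` is hypothetical; `(state, merged)` tuples stand in for `PullSummary` objects):

```python
def classify_sketch(prs: list[tuple[str, bool]]) -> str:
    # Mirrors the branch order above: merged wins, then open, then the
    # generic fallback.
    merged = [p for p in prs if p[1]]
    open_prs = [p for p in prs if p[0] == "open"]
    if merged:
        return "closure_candidate"
    if open_prs:
        return "active_pr"
    # Closed-but-unmerged PRs now fall through to the generic branch; the
    # removed elif had given them a needs_manual_review verdict that also
    # carried their PR history in pr_summary.
    return "needs_manual_review"

print(classify_sketch([("closed", True)]))   # closure_candidate
print(classify_sketch([("open", False)]))    # active_pr
print(classify_sketch([("closed", False)]))  # needs_manual_review
```

The verdict for a closed-unmerged PR is unchanged, but the removed branch's `summarize_prs(closed_unmerged)` history is replaced by the fallback's "no matching PR found" summary.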
@@ -21,6 +21,15 @@ SOUL_REQUIRED_LINES = (
     "Jesus saves",
 )

+# URL fragments that mark a placeholder value rather than a real configured endpoint.
+# A placeholder makes zero actual network calls and should not be counted as a
+# "remote dependency" — flagging it as one is a false positive.
+_PLACEHOLDER_FRAGMENTS = ("YOUR_", "<pod-id>", "EXAMPLE", "example.internal", "your-host")
+
+
+def _is_placeholder_url(url: str) -> bool:
+    return any(frag in url for frag in _PLACEHOLDER_FRAGMENTS)
+

 def _probe_memory_gb() -> float:
     try:
@@ -62,7 +71,7 @@ def _extract_repo_signals(repo_root: Path) -> dict[str, Any]:
             continue
         if "localhost" in url or "127.0.0.1" in url:
             local_endpoints.append(url)
-        else:
+        elif not _is_placeholder_url(url):
             remote_endpoints.append(url)

     soul_text = soul_path.read_text(encoding="utf-8", errors="replace") if soul_path.exists() else ""
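Reviewer sketch: the net effect of the `else` to `elif` change is a three-way split of URLs into local, placeholder (dropped), and remote. A standalone version under stated assumptions (`split_endpoints` is a hypothetical name; the loop body mirrors the hunk above):

```python
def split_endpoints(urls: list[str]) -> tuple[list[str], list[str]]:
    placeholders = ("YOUR_", "<pod-id>", "EXAMPLE", "example.internal", "your-host")
    local: list[str] = []
    remote: list[str] = []
    for url in urls:
        if "localhost" in url or "127.0.0.1" in url:
            local.append(url)
        elif not any(frag in url for frag in placeholders):
            # Only non-placeholder, non-local URLs count as remote dependencies.
            remote.append(url)
    return local, remote

local, remote = split_endpoints([
    "http://localhost:11434/v1",
    "https://YOUR_BIG_BRAIN_HOST/v1",
    "https://real.inference.server/v1",
])
print(local)   # ['http://localhost:11434/v1']
print(remote)  # ['https://real.inference.server/v1']
```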
@@ -1,13 +1,6 @@
 from pathlib import Path

-from scripts.burn_lane_issue_audit import (
-    PullSummary,
-    classify_issue,
-    collect_pull_summaries,
-    extract_issue_numbers,
-    match_prs,
-    render_report,
-)
+from scripts.burn_lane_issue_audit import extract_issue_numbers, render_report


 def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
@@ -21,99 +14,6 @@ def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
     assert extract_issue_numbers(body) == [579, 660, 659, 658, 582, 627, 631, 547, 546, 545]


-def test_match_prs_detects_issue_ref_in_pr_body() -> None:
-    pulls = [
-        PullSummary(
-            number=731,
-            title="docs: verify session harvest report",
-            state="open",
-            merged=False,
-            head="fix/session-harvest-report",
-            body="Refs #648",
-            url="https://forge.example/pr/731",
-        ),
-        PullSummary(
-            number=732,
-            title="unrelated",
-            state="open",
-            merged=False,
-            head="fix/unrelated",
-            body="Refs #700",
-            url="https://forge.example/pr/732",
-        ),
-    ]
-
-    assert [pr.number for pr in match_prs(648, pulls)] == [731]
-
-
-def test_open_issue_with_closed_unmerged_pr_stays_manual_review_with_history() -> None:
-    issue = {
-        "number": 648,
-        "title": "session harvest report",
-        "state": "open",
-        "html_url": "https://forge.example/issues/648",
-    }
-    row = classify_issue(
-        issue,
-        [
-            PullSummary(
-                number=731,
-                title="docs: add session harvest report",
-                state="closed",
-                merged=False,
-                head="fix/648",
-                body="Closes #648",
-                url="https://forge.example/pr/731",
-            )
-        ],
-    )
-
-    assert row.classification == "needs_manual_review"
-    assert row.pr_summary == "closed PR #731"
-
-
-def test_collect_pull_summaries_pages_until_empty(monkeypatch) -> None:
-    def fake_api_get(path: str, token: str):
-        if "state=open" in path:
-            return []
-        page = int(path.split("page=")[1])
-        if page <= 5:
-            return [
-                {
-                    "number": page * 1000 + i,
-                    "title": f"page {page} pr {i}",
-                    "state": "closed",
-                    "merged": False,
-                    "head": {"ref": f"fix/{page}-{i}"},
-                    "body": f"Refs #{page * 1000 + i}",
-                    "html_url": f"https://forge.example/pr/{page * 1000 + i}",
-                }
-                for i in range(100)
-            ]
-        if page == 6:
-            return [
-                {
-                    "number": 900,
-                    "title": "late page pr",
-                    "state": "closed",
-                    "merged": False,
-                    "head": {"ref": "fix/900"},
-                    "body": "Refs #900",
-                    "html_url": "https://forge.example/pr/900",
-                }
-            ]
-        return []
-
-    monkeypatch.setattr("scripts.burn_lane_issue_audit.api_get", fake_api_get)
-
-    pulls = collect_pull_summaries("timmy-home", "token")
-
-    assert any(pr.number == 900 for pr in pulls)
-
-
 def test_render_report_calls_out_drift_and_candidates() -> None:
     rows = [
         {
@@ -7,6 +7,7 @@ from pathlib import Path
 ROOT = Path(__file__).resolve().parents[1]
 SCRIPT_PATH = ROOT / "scripts" / "unreachable_horizon.py"
 DOC_PATH = ROOT / "docs" / "UNREACHABLE_HORIZON_1M_MEN.md"
+SOUL_PATH = ROOT / "SOUL.md"


 def _load_module(path: Path, name: str):
@@ -78,6 +79,14 @@ def test_render_markdown_preserves_crisis_doctrine_and_direction() -> None:
     assert snippet in report


+def test_soul_md_contains_full_crisis_doctrine() -> None:
+    """SOUL.md must carry all three phrases the horizon check requires."""
+    assert SOUL_PATH.exists(), "SOUL.md is missing"
+    soul_text = SOUL_PATH.read_text(encoding="utf-8")
+    for phrase in ("Are you safe right now?", "988", "Jesus saves"):
+        assert phrase in soul_text, f"SOUL.md is missing crisis doctrine phrase: {phrase!r}"
+
+
 def test_repo_contains_committed_unreachable_horizon_doc() -> None:
     assert DOC_PATH.exists(), "missing committed unreachable horizon report"
     text = DOC_PATH.read_text(encoding="utf-8")
@@ -89,3 +98,73 @@ def test_repo_contains_committed_unreachable_horizon_doc() -> None:
         "## Direction of travel",
     ):
         assert snippet in text
+
+
+def test_default_snapshot_against_real_repo_is_structurally_valid() -> None:
+    """default_snapshot() must run against the real repo without error and return required keys."""
+    mod = _load_module(SCRIPT_PATH, "unreachable_horizon")
+    snapshot = mod.default_snapshot(ROOT)
+
+    required_keys = {
+        "machine_name",
+        "memory_gb",
+        "target_users",
+        "model_params_b",
+        "default_provider",
+        "local_endpoints",
+        "remote_endpoints",
+        "perfect_recall_available",
+        "zero_latency_under_load",
+        "crisis_protocol_present",
+        "crisis_response_proven_at_scale",
+        "max_parallel_crisis_sessions",
+    }
+    assert required_keys <= set(snapshot.keys()), f"snapshot missing keys: {required_keys - set(snapshot.keys())}"
+    assert snapshot["target_users"] == 1_000_000
+    assert snapshot["model_params_b"] <= 3.0
+    assert snapshot["memory_gb"] >= 0.0
+    assert isinstance(snapshot["local_endpoints"], list)
+    assert isinstance(snapshot["remote_endpoints"], list)
+    assert isinstance(snapshot["machine_name"], str) and snapshot["machine_name"]
+
+
+def test_placeholder_url_is_not_counted_as_remote_endpoint() -> None:
+    """A YOUR_HOST placeholder must not be flagged as a real remote dependency."""
+    mod = _load_module(SCRIPT_PATH, "unreachable_horizon")
+    assert mod._is_placeholder_url("https://YOUR_BIG_BRAIN_HOST/v1") is True
+    assert mod._is_placeholder_url("https://<pod-id>-11434.proxy.runpod.net/v1") is True
+    assert mod._is_placeholder_url("http://localhost:11434/v1") is False
+    assert mod._is_placeholder_url("https://real.inference.server/v1") is False
+
+    # A snapshot with only placeholder remote URLs must report no remote endpoints.
+    status = mod.compute_horizon_status({
+        "machine_name": "Test",
+        "memory_gb": 36.0,
+        "target_users": 1_000_000,
+        "model_params_b": 3.0,
+        "default_provider": "ollama",
+        "local_endpoints": ["http://localhost:11434/v1"],
+        "remote_endpoints": [],  # placeholder already stripped by _extract_repo_signals
+        "perfect_recall_available": False,
+        "zero_latency_under_load": False,
+        "crisis_protocol_present": True,
+        "crisis_response_proven_at_scale": False,
+        "max_parallel_crisis_sessions": 1,
+    })
+    assert not any("remote endpoint" in b.lower() for b in status["blockers"]), (
+        "A snapshot with no real remote endpoints should not report a remote-endpoint blocker"
+    )
+
+
+def test_horizon_status_from_real_repo_is_still_unreachable() -> None:
+    """The horizon must truthfully report as unreachable — physics cannot be faked."""
+    mod = _load_module(SCRIPT_PATH, "unreachable_horizon")
+    snapshot = mod.default_snapshot(ROOT)
+    status = mod.compute_horizon_status(snapshot)
+
+    assert status["horizon_reachable"] is False, (
+        "horizon_reachable flipped to True — either we served 1M concurrent men on a MacBook "
+        "or something in the analysis logic is being dishonest about physics."
+    )
+    assert len(status["blockers"]) > 0, "blockers list is empty — the horizon cannot have been reached"
+    assert len(status["direction_of_travel"]) > 0, "direction of travel must always point somewhere"