Compare commits


1 Commit

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Alexander Whitestone | 6cbb9a98e1 | fix: refresh burn lane audit for #662 | 2026-04-16 23:43:24 -04:00 |

Some checks failed:
- Agent PR Gate / gate (pull_request): failing after 48s
- Self-Healing Smoke / self-healing-smoke (pull_request): failing after 8s
- Smoke Test / smoke (pull_request): failing after 6s
- Agent PR Gate / report (pull_request): cancelled
5 changed files with 151 additions and 213 deletions

View File

@@ -1,58 +0,0 @@
# Issue #582 — Epic Slice Verification
Refs #582
## Status: COMPLETE ON MAIN
The parent-epic orchestration slice for the Know Thy Father multimodal pipeline is already merged and functional on `main`.
## Mainline Evidence
### Core implementation files
- `scripts/know_thy_father/epic_pipeline.py` — orchestrator that chains all phases
- `docs/KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md` — pipeline documentation
- `tests/test_know_thy_father_pipeline.py` — orchestrator tests
### Phase implementation (all on main)
| Phase | Description | Script |
|-------|-------------|--------|
| phase1_media_indexing | Scan archive for media | `scripts/know_thy_father/index_media.py` |
| phase2_multimodal_analysis | Process media entries | `scripts/twitter_archive/analyze_media.py` |
| phase3_holographic_synthesis | Kernel synthesis | `scripts/know_thy_father/synthesize_kernels.py` |
| phase4_cross_reference_audit | Cross-ref audit | `scripts/know_thy_father/crossref_audit.py` |
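The orchestrator itself is not part of this diff. Based on the phase table above and the test suite's requirement that `epic_pipeline.py` define a `PHASES` list of at least four entries, a minimal hypothetical sketch could look like the following; everything except the `PHASES` name and the script paths is an illustrative assumption, not the repo's actual code:

```python
import subprocess
import sys

# (phase name, script path) pairs mirroring the phase table above.
PHASES = [
    ("phase1_media_indexing", "scripts/know_thy_father/index_media.py"),
    ("phase2_multimodal_analysis", "scripts/twitter_archive/analyze_media.py"),
    ("phase3_holographic_synthesis", "scripts/know_thy_father/synthesize_kernels.py"),
    ("phase4_cross_reference_audit", "scripts/know_thy_father/crossref_audit.py"),
]


def run_pipeline() -> None:
    """Hypothetical runner: execute each phase in order, stop on first failure."""
    for name, script in PHASES:
        print(f"[epic_pipeline] running {name}: {script}")
        result = subprocess.run([sys.executable, script])
        if result.returncode != 0:
            raise SystemExit(f"{name} failed with exit code {result.returncode}")
```

A sequential stop-on-failure chain like this is consistent with the orchestrator tests, which only assert that `PHASES` exists and has at least four entries.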
### Merged PRs (chronological)
- PR #630 — Phase 1: Media indexing
- PR #631 — Phase 2: Multimodal analysis
- PR #637 — Phase 3: Holographic synthesis
- PR #639 — Phase 4: Cross-reference audit
- PR #641 — Orchestrator pipeline integration
- PR #738 — Prior parent-epic slice (closed)
### Test coverage
- `tests/test_know_thy_father_pipeline.py` — orchestrator
- `tests/test_know_thy_father_index.py` — phase 1
- `tests/test_know_thy_father_synthesis.py` — phase 3
- `tests/test_know_thy_father_crossref.py` — phase 4
- `tests/twitter_archive/test_ktf_tracker.py` — tracker
- `tests/twitter_archive/test_analyze_media.py` — phase 2
## Why Refs, Not Closes
The repo-side parent-epic operational slice is fully implemented on `main`. However, the epic (#582) itself remains open because:
1. Full archive consumption is not yet complete
2. Downstream memory integration is pending
3. The orchestrator is functional but not yet exercised against the complete archive
This verification document provides the proof trail that the orchestrator slice is done, preventing duplicate reimplementation.
## Verification Commands
```bash
# Verify this issue's test
python3 -m pytest tests/test_issue_582_verification.py -q
# Verify all Know Thy Father tests pass
python3 -m pytest tests/test_know_thy_father_pipeline.py tests/test_know_thy_father_index.py tests/test_know_thy_father_synthesis.py tests/test_know_thy_father_crossref.py tests/twitter_archive/test_ktf_tracker.py tests/twitter_archive/test_analyze_media.py -q
```

View File

@@ -1,6 +1,6 @@
# Burn Lane Empty Audit — timmy-home #662
Generated: 2026-04-16T01:22:37Z
Generated: 2026-04-17T03:42:50Z
Source issue: `[ops] Burn lane empty — all open issues triaged (2026-04-14)`
## Source Snapshot
@@ -11,9 +11,9 @@ Issue #662 is an operational status note, not a normal feature request. Its body
- Referenced issues audited: 42
- Already closed: 30
- Open but likely closure candidates (merged PR found): 0
- Open with active PRs: 12
- Open / needs manual review: 0
- Open but likely closure candidates (merged PR found): 1
- Open with active PRs: 0
- Open / needs manual review: 11
## Issue Body Drift
@@ -21,56 +21,56 @@ The body of #662 is not current truth. It mixes closed issues, open issues, rang
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #579 | closed | already closed | closed PR #644, closed PR #640, closed PR #635, closed PR #620 |
| #648 | open | active pr | open PR #731 |
| #579 | closed | already closed | closed PR #644, closed PR #643, closed PR #640, closed PR #635, closed PR #620 |
| #648 | open | needs manual review | closed PR #731 |
| #647 | closed | already closed | issue already closed |
| #619 | closed | already closed | issue already closed |
| #616 | closed | already closed | issue already closed |
| #614 | closed | already closed | issue already closed |
| #613 | closed | already closed | issue already closed |
| #660 | closed | already closed | issue already closed |
| #659 | closed | already closed | issue already closed |
| #659 | closed | already closed | closed PR #660 |
| #658 | closed | already closed | issue already closed |
| #657 | closed | already closed | issue already closed |
| #656 | closed | already closed | closed PR #658 |
| #655 | closed | already closed | issue already closed |
| #654 | closed | already closed | closed PR #661 |
| #653 | closed | already closed | issue already closed |
| #652 | closed | already closed | merged PR #657 |
| #651 | closed | already closed | issue already closed |
| #650 | closed | already closed | merged PR #654 |
| #649 | closed | already closed | issue already closed |
| #646 | closed | already closed | issue already closed |
| #582 | open | active pr | open PR #738 |
| #653 | closed | already closed | closed PR #660, closed PR #655 |
| #652 | closed | already closed | closed PR #660, merged PR #657, closed PR #655 |
| #651 | closed | already closed | closed PR #655 |
| #650 | closed | already closed | closed PR #661, closed PR #660, merged PR #654, closed PR #651 |
| #649 | closed | already closed | closed PR #660, merged PR #657, closed PR #651 |
| #646 | closed | already closed | closed PR #655, closed PR #651 |
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
| #627 | closed | already closed | issue already closed |
| #631 | closed | already closed | issue already closed |
| #632 | closed | already closed | issue already closed |
| #634 | closed | already closed | issue already closed |
| #639 | closed | already closed | issue already closed |
| #641 | closed | already closed | issue already closed |
| #575 | closed | already closed | merged PR #656 |
| #576 | closed | already closed | closed PR #663, closed PR #660, closed PR #655, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
| #575 | closed | already closed | closed PR #658, merged PR #656 |
| #576 | closed | already closed | merged PR #664, closed PR #663, closed PR #660, closed PR #655, merged PR #654, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
| #578 | closed | already closed | merged PR #638, closed PR #636 |
| #636 | closed | already closed | issue already closed |
| #638 | closed | already closed | issue already closed |
| #547 | open | active pr | open PR #730 |
| #548 | open | active pr | open PR #712 |
| #549 | open | active pr | open PR #729 |
| #550 | open | active pr | open PR #727 |
| #551 | open | active pr | open PR #725 |
| #552 | open | active pr | open PR #724 |
| #553 | open | active pr | open PR #722 |
| #562 | open | active pr | open PR #718 |
| #544 | open | active pr | open PR #732 |
| #545 | open | active pr | open PR #719 |
| #547 | open | needs manual review | closed PR #730 |
| #548 | open | needs manual review | closed PR #712 |
| #549 | open | needs manual review | closed PR #729 |
| #550 | open | needs manual review | closed PR #727 |
| #551 | open | needs manual review | closed PR #725 |
| #552 | open | needs manual review | closed PR #724 |
| #553 | open | needs manual review | closed PR #722 |
| #562 | open | needs manual review | closed PR #718 |
| #544 | open | needs manual review | closed PR #732 |
| #545 | open | needs manual review | closed PR #719 |
## Closure Candidates
These issues are still open but already have merged PR evidence in the forge and should be reviewed for bulk closure.
| None |
|---|
| None |
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
## Still Open / Needs Manual Review
@@ -78,18 +78,17 @@ These issues either have no matching PR signal or still have an active PR / ambi
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #648 | open | active pr | open PR #731 |
| #582 | open | active pr | open PR #738 |
| #547 | open | active pr | open PR #730 |
| #548 | open | active pr | open PR #712 |
| #549 | open | active pr | open PR #729 |
| #550 | open | active pr | open PR #727 |
| #551 | open | active pr | open PR #725 |
| #552 | open | active pr | open PR #724 |
| #553 | open | active pr | open PR #722 |
| #562 | open | active pr | open PR #718 |
| #544 | open | active pr | open PR #732 |
| #545 | open | active pr | open PR #719 |
| #648 | open | needs manual review | closed PR #731 |
| #547 | open | needs manual review | closed PR #730 |
| #548 | open | needs manual review | closed PR #712 |
| #549 | open | needs manual review | closed PR #729 |
| #550 | open | needs manual review | closed PR #727 |
| #551 | open | needs manual review | closed PR #725 |
| #552 | open | needs manual review | closed PR #724 |
| #553 | open | needs manual review | closed PR #722 |
| #562 | open | needs manual review | closed PR #718 |
| #544 | open | needs manual review | closed PR #732 |
| #545 | open | needs manual review | closed PR #719 |
## Recommendation

View File

@@ -23,6 +23,7 @@ class PullSummary:
state: str
merged: bool
head: str
body: str
url: str
@@ -75,7 +76,8 @@ def api_get(path: str, token: str):
def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
pulls: list[PullSummary] = []
for state in ("open", "closed"):
for page in range(1, 6):
page = 1
while True:
batch = api_get(f"/repos/{ORG}/{repo}/pulls?state={state}&limit=100&page={page}", token)
if not batch:
break
@@ -87,18 +89,18 @@ def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
state=pr.get("state") or state,
merged=bool(pr.get("merged")),
head=(pr.get("head") or {}).get("ref") or "",
body=pr.get("body") or "",
url=pr.get("html_url") or pr.get("url") or "",
)
)
if len(batch) < 100:
break
page += 1
return pulls
def match_prs(issue_num: int, pulls: Iterable[PullSummary]) -> list[PullSummary]:
matches: list[PullSummary] = []
for pr in pulls:
text = f"{pr.title} {pr.head}"
text = f"{pr.title} {pr.head} {pr.body}"
if f"#{issue_num}" in text or pr.head == f"fix/{issue_num}" or f"/{issue_num}" in pr.head or f"-{issue_num}" in pr.head:
matches.append(pr)
return matches
@@ -116,12 +118,16 @@ def classify_issue(issue: dict, related_prs: list[PullSummary]) -> IssueAuditRow
else:
merged = [pr for pr in related_prs if pr.merged]
open_prs = [pr for pr in related_prs if pr.state == "open"]
closed_unmerged = [pr for pr in related_prs if pr.state != "open" and not pr.merged]
if merged:
classification = "closure_candidate"
pr_summary = summarize_prs(merged)
elif open_prs:
classification = "active_pr"
pr_summary = summarize_prs(open_prs)
elif closed_unmerged:
classification = "needs_manual_review"
pr_summary = summarize_prs(closed_unmerged)
else:
classification = "needs_manual_review"
pr_summary = "no matching PR found"

View File

@@ -1,6 +1,13 @@
from pathlib import Path
from scripts.burn_lane_issue_audit import extract_issue_numbers, render_report
from scripts.burn_lane_issue_audit import (
PullSummary,
classify_issue,
collect_pull_summaries,
extract_issue_numbers,
match_prs,
render_report,
)
def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
@@ -14,6 +21,99 @@ def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
assert extract_issue_numbers(body) == [579, 660, 659, 658, 582, 627, 631, 547, 546, 545]
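The `extract_issue_numbers` implementation sits outside this diff's hunks. A minimal order-preserving sketch of the range-and-literal extraction the test exercises (an illustrative assumption, not the repo's actual code) might be:

```python
import re


def extract_issue_numbers(body: str) -> list[int]:
    """Hypothetical sketch: collect '#N' literals and '#A-#B' ranges from an
    issue body, expanding ranges in the direction written, de-duplicating,
    and preserving first-seen order."""
    seen: list[int] = []

    def add(n: int) -> None:
        if n not in seen:
            seen.append(n)

    # Matches '#660' alone, or '#660-#658' / '#660–#658' as a range.
    token_pat = re.compile(r"#(\d+)(?:\s*[-–]\s*#(\d+))?")
    for m in token_pat.finditer(body):
        a = int(m.group(1))
        if m.group(2) is None:
            add(a)
        else:
            b = int(m.group(2))
            step = 1 if b >= a else -1
            for n in range(a, b + step, step):
                add(n)
    return seen
```

Expanding descending ranges high-to-low matches the shape of the expected list above, where runs like 660, 659, 658 appear in the order the range was written.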
def test_match_prs_detects_issue_ref_in_pr_body() -> None:
pulls = [
PullSummary(
number=731,
title="docs: verify session harvest report",
state="open",
merged=False,
head="fix/session-harvest-report",
body="Refs #648",
url="https://forge.example/pr/731",
),
PullSummary(
number=732,
title="unrelated",
state="open",
merged=False,
head="fix/unrelated",
body="Refs #700",
url="https://forge.example/pr/732",
),
]
assert [pr.number for pr in match_prs(648, pulls)] == [731]
def test_open_issue_with_closed_unmerged_pr_stays_manual_review_with_history() -> None:
issue = {
"number": 648,
"title": "session harvest report",
"state": "open",
"html_url": "https://forge.example/issues/648",
}
row = classify_issue(
issue,
[
PullSummary(
number=731,
title="docs: add session harvest report",
state="closed",
merged=False,
head="fix/648",
body="Closes #648",
url="https://forge.example/pr/731",
)
],
)
assert row.classification == "needs_manual_review"
assert row.pr_summary == "closed PR #731"
def test_collect_pull_summaries_pages_until_empty(monkeypatch) -> None:
def fake_api_get(path: str, token: str):
if "state=open" in path:
return []
page = int(path.split("page=")[1])
if page <= 5:
return [
{
"number": page * 1000 + i,
"title": f"page {page} pr {i}",
"state": "closed",
"merged": False,
"head": {"ref": f"fix/{page}-{i}"},
"body": f"Refs #{page * 1000 + i}",
"html_url": f"https://forge.example/pr/{page * 1000 + i}",
}
for i in range(100)
]
if page == 6:
return [
{
"number": 900,
"title": "late page pr",
"state": "closed",
"merged": False,
"head": {"ref": "fix/900"},
"body": "Refs #900",
"html_url": "https://forge.example/pr/900",
}
]
return []
monkeypatch.setattr("scripts.burn_lane_issue_audit.api_get", fake_api_get)
pulls = collect_pull_summaries("timmy-home", "token")
assert any(pr.number == 900 for pr in pulls)
def test_render_report_calls_out_drift_and_candidates() -> None:
rows = [
{

View File

@@ -1,109 +0,0 @@
"""Verification that the Issue #582 epic orchestration slice exists on main.
Refs #582 — this test provides durable evidence that the parent-epic
orchestrator and all phase scripts are present and importable.
"""
import importlib.util
from pathlib import Path
import unittest
ROOT = Path(__file__).resolve().parent.parent
class TestIssue582Verification(unittest.TestCase):
"""Confirm the Know Thy Father epic slice is on main."""
def _load_module(self, path: Path, name: str):
self.assertTrue(path.exists(), f"missing {path.relative_to(ROOT)}")
spec = importlib.util.spec_from_file_location(name, path)
self.assertIsNotNone(spec)
self.assertIsNotNone(spec.loader)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
return module
# --- core orchestrator ---
def test_epic_pipeline_exists(self):
path = ROOT / "scripts" / "know_thy_father" / "epic_pipeline.py"
self.assertTrue(path.exists(), "epic_pipeline.py must exist on main")
def test_epic_pipeline_has_phases(self):
mod = self._load_module(
ROOT / "scripts" / "know_thy_father" / "epic_pipeline.py",
"ktf_epic_pipeline",
)
self.assertTrue(hasattr(mod, "PHASES"), "PHASES list must be defined")
self.assertGreaterEqual(len(mod.PHASES), 4, "at least 4 phases required")
# --- phase scripts ---
def test_phase1_media_indexing_exists(self):
self.assertTrue(
(ROOT / "scripts" / "know_thy_father" / "index_media.py").exists(),
"phase1 script index_media.py must exist",
)
def test_phase2_analyze_media_exists(self):
self.assertTrue(
(ROOT / "scripts" / "twitter_archive" / "analyze_media.py").exists(),
"phase2 script analyze_media.py must exist",
)
def test_phase3_synthesize_kernels_exists(self):
self.assertTrue(
(ROOT / "scripts" / "know_thy_father" / "synthesize_kernels.py").exists(),
"phase3 script synthesize_kernels.py must exist",
)
def test_phase4_crossref_audit_exists(self):
self.assertTrue(
(ROOT / "scripts" / "know_thy_father" / "crossref_audit.py").exists(),
"phase4 script crossref_audit.py must exist",
)
# --- documentation ---
def test_pipeline_docs_exist(self):
self.assertTrue(
(ROOT / "docs" / "KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md").exists(),
"pipeline documentation must exist",
)
def test_verification_doc_exists(self):
self.assertTrue(
(ROOT / "docs" / "issue-582-verification.md").exists(),
"issue-582-verification.md must exist",
)
# --- test coverage ---
def test_pipeline_tests_exist(self):
tests = [
"test_know_thy_father_pipeline.py",
"test_know_thy_father_index.py",
"test_know_thy_father_synthesis.py",
"test_know_thy_father_crossref.py",
]
for name in tests:
self.assertTrue(
(ROOT / "tests" / name).exists(),
f"test file {name} must exist",
)
def test_archive_tests_exist(self):
archive_dir = ROOT / "tests" / "twitter_archive"
self.assertTrue(
(archive_dir / "test_ktf_tracker.py").exists(),
"test_ktf_tracker.py must exist",
)
self.assertTrue(
(archive_dir / "test_analyze_media.py").exists(),
"test_analyze_media.py must exist",
)
if __name__ == "__main__":
unittest.main()