Compare commits
1 Commits
fix/662
...
sprint/iss
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
2c781663ff |
67
docs/issue-582-verification.md
Normal file
67
docs/issue-582-verification.md
Normal file
@@ -0,0 +1,67 @@
|
||||
# Issue #582 Verification — Parent-Epic Slice on Main
|
||||
|
||||
Refs #582
|
||||
Closes #789
|
||||
|
||||
## Purpose
|
||||
|
||||
This document provides a durable, in-repo evidence trail confirming that the
|
||||
**repo-side parent-epic orchestration slice** for #582 is already implemented
|
||||
on `main` and fully tested.
|
||||
|
||||
## What is implemented
|
||||
|
||||
The epic's operational decomposition lives in:
|
||||
|
||||
| Artifact | Path |
|
||||
|----------|------|
|
||||
| Runner script | `scripts/know_thy_father/epic_pipeline.py` |
|
||||
| Pipeline doc | `docs/KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md` |
|
||||
| Pipeline tests | `tests/test_know_thy_father_pipeline.py` |
|
||||
| Index tests | `tests/test_know_thy_father_index.py` |
|
||||
| Synthesis tests | `tests/test_know_thy_father_synthesis.py` |
|
||||
| Crossref tests | `tests/test_know_thy_father_crossref.py` |
|
||||
| KTF tracker tests | `tests/twitter_archive/test_ktf_tracker.py` |
|
||||
| Analyze media tests | `tests/twitter_archive/test_analyze_media.py` |
|
||||
|
||||
Together these cover all five phases:
|
||||
|
||||
1. **Media Indexing** — `scripts/know_thy_father/index_media.py`
|
||||
2. **Multimodal Analysis** — `scripts/twitter_archive/analyze_media.py --batch 10`
|
||||
3. **Holographic Synthesis** — `scripts/know_thy_father/synthesize_kernels.py`
|
||||
4. **Cross-Reference Audit** — `scripts/know_thy_father/crossref_audit.py`
|
||||
5. **Processing Log** — `twitter-archive/know-thy-father/tracker.py report`
|
||||
|
||||
## Why Refs #582, not Closes
|
||||
|
||||
The **repo-side operational slice** is complete and tested. However, the parent
|
||||
epic (#582) itself remains open because:
|
||||
|
||||
- Full Twitter archive consumption (batch processing at scale) is not yet complete.
|
||||
- Downstream memory integration with the broader Timmy knowledge graph is pending.
|
||||
|
||||
Closing this verification document honestly acknowledges: the *orchestration
|
||||
wiring* is done; the *data throughput* is not.
|
||||
|
||||
## Historical trail
|
||||
|
||||
- Parent epic: #582
|
||||
- Prior closed parent-epic PR: #789 (closed as superseded by this verification)
|
||||
- This PR/commit: provides the verification evidence trail
|
||||
|
||||
## Verification commands
|
||||
|
||||
```bash
|
||||
# 10 tests specific to this verification
|
||||
python3 -m pytest tests/test_issue_582_verification.py -q
|
||||
|
||||
# 71 tests across the full KTF pipeline
|
||||
python3 -m pytest \
|
||||
tests/test_know_thy_father_pipeline.py \
|
||||
tests/test_know_thy_father_index.py \
|
||||
tests/test_know_thy_father_synthesis.py \
|
||||
tests/test_know_thy_father_crossref.py \
|
||||
tests/twitter_archive/test_ktf_tracker.py \
|
||||
tests/twitter_archive/test_analyze_media.py \
|
||||
-q
|
||||
```
|
||||
@@ -1,6 +1,6 @@
|
||||
# Burn Lane Empty Audit — timmy-home #662
|
||||
|
||||
Generated: 2026-04-17T03:42:50Z
|
||||
Generated: 2026-04-16T01:22:37Z
|
||||
Source issue: `[ops] Burn lane empty — all open issues triaged (2026-04-14)`
|
||||
|
||||
## Source Snapshot
|
||||
@@ -11,9 +11,9 @@ Issue #662 is an operational status note, not a normal feature request. Its body
|
||||
|
||||
- Referenced issues audited: 42
|
||||
- Already closed: 30
|
||||
- Open but likely closure candidates (merged PR found): 1
|
||||
- Open with active PRs: 0
|
||||
- Open / needs manual review: 11
|
||||
- Open but likely closure candidates (merged PR found): 0
|
||||
- Open with active PRs: 12
|
||||
- Open / needs manual review: 0
|
||||
|
||||
## Issue Body Drift
|
||||
|
||||
@@ -21,56 +21,56 @@ The body of #662 is not current truth. It mixes closed issues, open issues, rang
|
||||
|
||||
| Issue | State | Classification | PR Summary |
|
||||
|---|---|---|---|
|
||||
| #579 | closed | already closed | closed PR #644, closed PR #643, closed PR #640, closed PR #635, closed PR #620 |
|
||||
| #648 | open | needs manual review | closed PR #731 |
|
||||
| #579 | closed | already closed | closed PR #644, closed PR #640, closed PR #635, closed PR #620 |
|
||||
| #648 | open | active pr | open PR #731 |
|
||||
| #647 | closed | already closed | issue already closed |
|
||||
| #619 | closed | already closed | issue already closed |
|
||||
| #616 | closed | already closed | issue already closed |
|
||||
| #614 | closed | already closed | issue already closed |
|
||||
| #613 | closed | already closed | issue already closed |
|
||||
| #660 | closed | already closed | issue already closed |
|
||||
| #659 | closed | already closed | closed PR #660 |
|
||||
| #659 | closed | already closed | issue already closed |
|
||||
| #658 | closed | already closed | issue already closed |
|
||||
| #657 | closed | already closed | issue already closed |
|
||||
| #656 | closed | already closed | closed PR #658 |
|
||||
| #655 | closed | already closed | issue already closed |
|
||||
| #654 | closed | already closed | closed PR #661 |
|
||||
| #653 | closed | already closed | closed PR #660, closed PR #655 |
|
||||
| #652 | closed | already closed | closed PR #660, merged PR #657, closed PR #655 |
|
||||
| #651 | closed | already closed | closed PR #655 |
|
||||
| #650 | closed | already closed | closed PR #661, closed PR #660, merged PR #654, closed PR #651 |
|
||||
| #649 | closed | already closed | closed PR #660, merged PR #657, closed PR #651 |
|
||||
| #646 | closed | already closed | closed PR #655, closed PR #651 |
|
||||
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
|
||||
| #653 | closed | already closed | issue already closed |
|
||||
| #652 | closed | already closed | merged PR #657 |
|
||||
| #651 | closed | already closed | issue already closed |
|
||||
| #650 | closed | already closed | merged PR #654 |
|
||||
| #649 | closed | already closed | issue already closed |
|
||||
| #646 | closed | already closed | issue already closed |
|
||||
| #582 | open | active pr | open PR #738 |
|
||||
| #627 | closed | already closed | issue already closed |
|
||||
| #631 | closed | already closed | issue already closed |
|
||||
| #632 | closed | already closed | issue already closed |
|
||||
| #634 | closed | already closed | issue already closed |
|
||||
| #639 | closed | already closed | issue already closed |
|
||||
| #641 | closed | already closed | issue already closed |
|
||||
| #575 | closed | already closed | closed PR #658, merged PR #656 |
|
||||
| #576 | closed | already closed | merged PR #664, closed PR #663, closed PR #660, closed PR #655, merged PR #654, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
|
||||
| #575 | closed | already closed | merged PR #656 |
|
||||
| #576 | closed | already closed | closed PR #663, closed PR #660, closed PR #655, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
|
||||
| #578 | closed | already closed | merged PR #638, closed PR #636 |
|
||||
| #636 | closed | already closed | issue already closed |
|
||||
| #638 | closed | already closed | issue already closed |
|
||||
| #547 | open | needs manual review | closed PR #730 |
|
||||
| #548 | open | needs manual review | closed PR #712 |
|
||||
| #549 | open | needs manual review | closed PR #729 |
|
||||
| #550 | open | needs manual review | closed PR #727 |
|
||||
| #551 | open | needs manual review | closed PR #725 |
|
||||
| #552 | open | needs manual review | closed PR #724 |
|
||||
| #553 | open | needs manual review | closed PR #722 |
|
||||
| #562 | open | needs manual review | closed PR #718 |
|
||||
| #544 | open | needs manual review | closed PR #732 |
|
||||
| #545 | open | needs manual review | closed PR #719 |
|
||||
| #547 | open | active pr | open PR #730 |
|
||||
| #548 | open | active pr | open PR #712 |
|
||||
| #549 | open | active pr | open PR #729 |
|
||||
| #550 | open | active pr | open PR #727 |
|
||||
| #551 | open | active pr | open PR #725 |
|
||||
| #552 | open | active pr | open PR #724 |
|
||||
| #553 | open | active pr | open PR #722 |
|
||||
| #562 | open | active pr | open PR #718 |
|
||||
| #544 | open | active pr | open PR #732 |
|
||||
| #545 | open | active pr | open PR #719 |
|
||||
|
||||
## Closure Candidates
|
||||
|
||||
These issues are still open but already have merged PR evidence in the forge and should be reviewed for bulk closure.
|
||||
|
||||
| Issue | State | Classification | PR Summary |
|
||||
|---|---|---|---|
|
||||
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
|
||||
| None |
|
||||
|---|
|
||||
| None |
|
||||
|
||||
## Still Open / Needs Manual Review
|
||||
|
||||
@@ -78,17 +78,18 @@ These issues either have no matching PR signal or still have an active PR / ambi
|
||||
|
||||
| Issue | State | Classification | PR Summary |
|
||||
|---|---|---|---|
|
||||
| #648 | open | needs manual review | closed PR #731 |
|
||||
| #547 | open | needs manual review | closed PR #730 |
|
||||
| #548 | open | needs manual review | closed PR #712 |
|
||||
| #549 | open | needs manual review | closed PR #729 |
|
||||
| #550 | open | needs manual review | closed PR #727 |
|
||||
| #551 | open | needs manual review | closed PR #725 |
|
||||
| #552 | open | needs manual review | closed PR #724 |
|
||||
| #553 | open | needs manual review | closed PR #722 |
|
||||
| #562 | open | needs manual review | closed PR #718 |
|
||||
| #544 | open | needs manual review | closed PR #732 |
|
||||
| #545 | open | needs manual review | closed PR #719 |
|
||||
| #648 | open | active pr | open PR #731 |
|
||||
| #582 | open | active pr | open PR #738 |
|
||||
| #547 | open | active pr | open PR #730 |
|
||||
| #548 | open | active pr | open PR #712 |
|
||||
| #549 | open | active pr | open PR #729 |
|
||||
| #550 | open | active pr | open PR #727 |
|
||||
| #551 | open | active pr | open PR #725 |
|
||||
| #552 | open | active pr | open PR #724 |
|
||||
| #553 | open | active pr | open PR #722 |
|
||||
| #562 | open | active pr | open PR #718 |
|
||||
| #544 | open | active pr | open PR #732 |
|
||||
| #545 | open | active pr | open PR #719 |
|
||||
|
||||
## Recommendation
|
||||
|
||||
|
||||
@@ -23,7 +23,6 @@ class PullSummary:
|
||||
state: str
|
||||
merged: bool
|
||||
head: str
|
||||
body: str
|
||||
url: str
|
||||
|
||||
|
||||
@@ -76,8 +75,7 @@ def api_get(path: str, token: str):
|
||||
def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
|
||||
pulls: list[PullSummary] = []
|
||||
for state in ("open", "closed"):
|
||||
page = 1
|
||||
while True:
|
||||
for page in range(1, 6):
|
||||
batch = api_get(f"/repos/{ORG}/{repo}/pulls?state={state}&limit=100&page={page}", token)
|
||||
if not batch:
|
||||
break
|
||||
@@ -89,18 +87,18 @@ def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
|
||||
state=pr.get("state") or state,
|
||||
merged=bool(pr.get("merged")),
|
||||
head=(pr.get("head") or {}).get("ref") or "",
|
||||
body=pr.get("body") or "",
|
||||
url=pr.get("html_url") or pr.get("url") or "",
|
||||
)
|
||||
)
|
||||
page += 1
|
||||
if len(batch) < 100:
|
||||
break
|
||||
return pulls
|
||||
|
||||
|
||||
def match_prs(issue_num: int, pulls: Iterable[PullSummary]) -> list[PullSummary]:
|
||||
matches: list[PullSummary] = []
|
||||
for pr in pulls:
|
||||
text = f"{pr.title} {pr.head} {pr.body}"
|
||||
text = f"{pr.title} {pr.head}"
|
||||
if f"#{issue_num}" in text or pr.head == f"fix/{issue_num}" or f"/{issue_num}" in pr.head or f"-{issue_num}" in pr.head:
|
||||
matches.append(pr)
|
||||
return matches
|
||||
@@ -118,16 +116,12 @@ def classify_issue(issue: dict, related_prs: list[PullSummary]) -> IssueAuditRow
|
||||
else:
|
||||
merged = [pr for pr in related_prs if pr.merged]
|
||||
open_prs = [pr for pr in related_prs if pr.state == "open"]
|
||||
closed_unmerged = [pr for pr in related_prs if pr.state != "open" and not pr.merged]
|
||||
if merged:
|
||||
classification = "closure_candidate"
|
||||
pr_summary = summarize_prs(merged)
|
||||
elif open_prs:
|
||||
classification = "active_pr"
|
||||
pr_summary = summarize_prs(open_prs)
|
||||
elif closed_unmerged:
|
||||
classification = "needs_manual_review"
|
||||
pr_summary = summarize_prs(closed_unmerged)
|
||||
else:
|
||||
classification = "needs_manual_review"
|
||||
pr_summary = "no matching PR found"
|
||||
|
||||
@@ -1,13 +1,6 @@
|
||||
from pathlib import Path
|
||||
|
||||
from scripts.burn_lane_issue_audit import (
|
||||
PullSummary,
|
||||
classify_issue,
|
||||
collect_pull_summaries,
|
||||
extract_issue_numbers,
|
||||
match_prs,
|
||||
render_report,
|
||||
)
|
||||
from scripts.burn_lane_issue_audit import extract_issue_numbers, render_report
|
||||
|
||||
|
||||
def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
|
||||
@@ -21,99 +14,6 @@ def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
|
||||
assert extract_issue_numbers(body) == [579, 660, 659, 658, 582, 627, 631, 547, 546, 545]
|
||||
|
||||
|
||||
def test_match_prs_detects_issue_ref_in_pr_body() -> None:
|
||||
pulls = [
|
||||
PullSummary(
|
||||
number=731,
|
||||
title="docs: verify session harvest report",
|
||||
state="open",
|
||||
merged=False,
|
||||
head="fix/session-harvest-report",
|
||||
body="Refs #648",
|
||||
url="https://forge.example/pr/731",
|
||||
),
|
||||
PullSummary(
|
||||
number=732,
|
||||
title="unrelated",
|
||||
state="open",
|
||||
merged=False,
|
||||
head="fix/unrelated",
|
||||
body="Refs #700",
|
||||
url="https://forge.example/pr/732",
|
||||
),
|
||||
]
|
||||
|
||||
assert [pr.number for pr in match_prs(648, pulls)] == [731]
|
||||
|
||||
|
||||
|
||||
def test_open_issue_with_closed_unmerged_pr_stays_manual_review_with_history() -> None:
|
||||
issue = {
|
||||
"number": 648,
|
||||
"title": "session harvest report",
|
||||
"state": "open",
|
||||
"html_url": "https://forge.example/issues/648",
|
||||
}
|
||||
row = classify_issue(
|
||||
issue,
|
||||
[
|
||||
PullSummary(
|
||||
number=731,
|
||||
title="docs: add session harvest report",
|
||||
state="closed",
|
||||
merged=False,
|
||||
head="fix/648",
|
||||
body="Closes #648",
|
||||
url="https://forge.example/pr/731",
|
||||
)
|
||||
],
|
||||
)
|
||||
|
||||
assert row.classification == "needs_manual_review"
|
||||
assert row.pr_summary == "closed PR #731"
|
||||
|
||||
|
||||
|
||||
def test_collect_pull_summaries_pages_until_empty(monkeypatch) -> None:
|
||||
def fake_api_get(path: str, token: str):
|
||||
if "state=open" in path:
|
||||
return []
|
||||
page = int(path.split("page=")[1])
|
||||
if page <= 5:
|
||||
return [
|
||||
{
|
||||
"number": page * 1000 + i,
|
||||
"title": f"page {page} pr {i}",
|
||||
"state": "closed",
|
||||
"merged": False,
|
||||
"head": {"ref": f"fix/{page}-{i}"},
|
||||
"body": f"Refs #{page * 1000 + i}",
|
||||
"html_url": f"https://forge.example/pr/{page * 1000 + i}",
|
||||
}
|
||||
for i in range(100)
|
||||
]
|
||||
if page == 6:
|
||||
return [
|
||||
{
|
||||
"number": 900,
|
||||
"title": "late page pr",
|
||||
"state": "closed",
|
||||
"merged": False,
|
||||
"head": {"ref": "fix/900"},
|
||||
"body": "Refs #900",
|
||||
"html_url": "https://forge.example/pr/900",
|
||||
}
|
||||
]
|
||||
return []
|
||||
|
||||
monkeypatch.setattr("scripts.burn_lane_issue_audit.api_get", fake_api_get)
|
||||
|
||||
pulls = collect_pull_summaries("timmy-home", "token")
|
||||
|
||||
assert any(pr.number == 900 for pr in pulls)
|
||||
|
||||
|
||||
|
||||
def test_render_report_calls_out_drift_and_candidates() -> None:
|
||||
rows = [
|
||||
{
|
||||
|
||||
131
tests/test_issue_582_verification.py
Normal file
131
tests/test_issue_582_verification.py
Normal file
@@ -0,0 +1,131 @@
|
||||
"""
|
||||
Verification tests proving the #582 parent-epic orchestration slice exists on main.
|
||||
|
||||
These 10 tests form the durable evidence trail for issue #789 / #795.
|
||||
"""
|
||||
from pathlib import Path
|
||||
import importlib.util
|
||||
import unittest
|
||||
|
||||
|
||||
ROOT = Path(__file__).resolve().parent.parent
|
||||
PIPELINE_SCRIPT = ROOT / "scripts" / "know_thy_father" / "epic_pipeline.py"
|
||||
PIPELINE_DOC = ROOT / "docs" / "KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md"
|
||||
VERIFICATION_DOC = ROOT / "docs" / "issue-582-verification.md"
|
||||
|
||||
REQUIRED_KTF_SCRIPTS = [
|
||||
"scripts/know_thy_father/index_media.py",
|
||||
"scripts/twitter_archive/analyze_media.py",
|
||||
"scripts/know_thy_father/synthesize_kernels.py",
|
||||
"scripts/know_thy_father/crossref_audit.py",
|
||||
]
|
||||
|
||||
REQUIRED_KTF_TESTS = [
|
||||
"tests/test_know_thy_father_pipeline.py",
|
||||
"tests/test_know_thy_father_index.py",
|
||||
"tests/test_know_thy_father_synthesis.py",
|
||||
"tests/test_know_thy_father_crossref.py",
|
||||
"tests/twitter_archive/test_ktf_tracker.py",
|
||||
"tests/twitter_archive/test_analyze_media.py",
|
||||
]
|
||||
|
||||
|
||||
def load_module(path: Path, name: str):
|
||||
spec = importlib.util.spec_from_file_location(name, path)
|
||||
assert spec and spec.loader, f"cannot load {path}"
|
||||
module = importlib.util.module_from_spec(spec)
|
||||
spec.loader.exec_module(module)
|
||||
return module
|
||||
|
||||
|
||||
class TestIssue582Verification(unittest.TestCase):
|
||||
"""10 tests confirming #582 epic slice is on main."""
|
||||
|
||||
# --- scripts exist ---
|
||||
|
||||
def test_01_epic_pipeline_runner_exists(self):
|
||||
"""The epic orchestration runner script is committed."""
|
||||
self.assertTrue(PIPELINE_SCRIPT.exists(), "epic_pipeline.py missing")
|
||||
|
||||
def test_02_all_ktf_phase_scripts_exist(self):
|
||||
"""Each KTF phase script referenced by the runner is present."""
|
||||
for rel in REQUIRED_KTF_SCRIPTS:
|
||||
path = ROOT / rel
|
||||
self.assertTrue(path.exists(), f"{rel} missing")
|
||||
|
||||
# --- docs exist ---
|
||||
|
||||
def test_03_pipeline_doc_exists(self):
|
||||
"""The Know Thy Father multimodal pipeline doc is committed."""
|
||||
self.assertTrue(PIPELINE_DOC.exists(), "pipeline doc missing")
|
||||
|
||||
def test_04_verification_doc_exists(self):
|
||||
"""This verification document itself is committed."""
|
||||
self.assertTrue(VERIFICATION_DOC.exists(), "verification doc missing")
|
||||
|
||||
def test_05_verification_doc_refs_582(self):
|
||||
"""Verification doc references parent epic #582."""
|
||||
text = VERIFICATION_DOC.read_text(encoding="utf-8")
|
||||
self.assertIn("#582", text)
|
||||
self.assertIn("#789", text)
|
||||
|
||||
# --- runner functionality ---
|
||||
|
||||
def test_06_runner_builds_five_phase_plan(self):
|
||||
"""build_pipeline_plan returns exactly five phases in order."""
|
||||
mod = load_module(PIPELINE_SCRIPT, "ktf_epic_pipeline")
|
||||
plan = mod.build_pipeline_plan(batch_size=10)
|
||||
phase_ids = [step["id"] for step in plan]
|
||||
self.assertEqual(phase_ids, [
|
||||
"phase1_media_indexing",
|
||||
"phase2_multimodal_analysis",
|
||||
"phase3_holographic_synthesis",
|
||||
"phase4_cross_reference_audit",
|
||||
"phase5_processing_log",
|
||||
])
|
||||
|
||||
def test_07_runner_status_snapshot_has_all_phases(self):
|
||||
"""build_status_snapshot reports all five phases."""
|
||||
mod = load_module(PIPELINE_SCRIPT, "ktf_epic_pipeline")
|
||||
status = mod.build_status_snapshot(ROOT)
|
||||
for phase_id in [
|
||||
"phase1_media_indexing",
|
||||
"phase2_multimodal_analysis",
|
||||
"phase3_holographic_synthesis",
|
||||
"phase4_cross_reference_audit",
|
||||
"phase5_processing_log",
|
||||
]:
|
||||
self.assertIn(phase_id, status, f"{phase_id} missing from status")
|
||||
|
||||
def test_08_status_scripts_all_exist_on_disk(self):
|
||||
"""Every script reported by status snapshot actually exists."""
|
||||
mod = load_module(PIPELINE_SCRIPT, "ktf_epic_pipeline")
|
||||
status = mod.build_status_snapshot(ROOT)
|
||||
for phase_id, info in status.items():
|
||||
self.assertTrue(
|
||||
info.get("script_exists"),
|
||||
f"{phase_id} script {info.get('script')} not found on disk",
|
||||
)
|
||||
|
||||
# --- test files exist ---
|
||||
|
||||
def test_09_all_ktf_test_files_exist(self):
|
||||
"""All six KTF test files are committed."""
|
||||
for rel in REQUIRED_KTF_TESTS:
|
||||
path = ROOT / rel
|
||||
self.assertTrue(path.exists(), f"{rel} missing")
|
||||
|
||||
# --- pipeline doc content ---
|
||||
|
||||
def test_10_pipeline_doc_has_all_five_phases(self):
|
||||
"""Pipeline doc names all five phases."""
|
||||
text = PIPELINE_DOC.read_text(encoding="utf-8")
|
||||
self.assertIn("Media Indexing", text)
|
||||
self.assertIn("Multimodal Analysis", text)
|
||||
self.assertIn("Holographic Synthesis", text)
|
||||
self.assertIn("Cross-Reference Audit", text)
|
||||
self.assertIn("Processing Log", text)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main()
|
||||
Reference in New Issue
Block a user