Compare commits: step35/683 ... sprint/iss (1 commit, 52c869ae43)
docs/issue-582-verification.md (new file, +73 lines)
# Issue #582 Verification — Parent-Epic Orchestration Slice

**Date:** 2026-04-20

**Status:** Slice already present on `main`; epic remains open for full archive consumption.

## What #582 asked for

A single orchestration script that stitches the five Know Thy Father phases together
into one reviewable plan — not a replacement for individual scripts, but a spine
that future passes can run, resume, and verify.

## What exists on `main`

| Artifact | Path | Present |
|----------|------|---------|
| Epic pipeline runner | `scripts/know_thy_father/epic_pipeline.py` | ✅ |
| Pipeline documentation | `docs/KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md` | ✅ |
| Phase 1 — Media Indexing | `scripts/know_thy_father/index_media.py` | ✅ |
| Phase 2 — Multimodal Analysis | `scripts/twitter_archive/analyze_media.py` | ✅ |
| Phase 3 — Holographic Synthesis | `scripts/know_thy_father/synthesize_kernels.py` | ✅ |
| Phase 4 — Cross-Reference Audit | `scripts/know_thy_father/crossref_audit.py` | ✅ |
| Phase 5 — Processing Log | `twitter-archive/know-thy-father/tracker.py` | ✅ |
## Runner capabilities (all implemented)

```bash
# Print the orchestrated plan
python3 scripts/know_thy_father/epic_pipeline.py

# JSON status snapshot of scripts + known artifact paths
python3 scripts/know_thy_father/epic_pipeline.py --status --json

# Execute one concrete step
python3 scripts/know_thy_father/epic_pipeline.py --run-step phase2_multimodal_analysis --batch-size 10
```
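The verification tests added by this PR read `step["id"]` and `step["command"]` off each plan entry, so each step appears to be a plain dict. A minimal sketch of consuming such a plan; the dict shape is inferred from those tests, and the literal values are illustrative, not the runner's actual output:

```python
# Hypothetical plan shape, inferred from the test suite's use of
# step["id"] and step["command"]. Values are illustrative only.
plan = [
    {
        "id": "phase1_media_indexing",
        "command": "python3 scripts/know_thy_father/index_media.py",
    },
    {
        "id": "phase2_multimodal_analysis",
        "command": "python3 scripts/twitter_archive/analyze_media.py --batch-size 10",
    },
]

# A resumable runner only needs the ordered ids plus one shell command per step.
for step in plan:
    print(f"{step['id']} -> {step['command']}")
```

Because every step is plain data (no callables), the whole plan stays JSON-serialisable, which is exactly what tests 08 and 09 below check.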

## Test coverage

The following test suites confirm the orchestration slice is intact:

- `tests/test_know_thy_father_pipeline.py` — pipeline plan structure, status snapshot, doc presence
- `tests/test_know_thy_father_index.py` — Phase 1 media indexing logic
- `tests/twitter_archive/test_analyze_media.py` — Phase 2 multimodal analysis
- `tests/test_know_thy_father_synthesis.py` — Phase 3 kernel synthesis
- `tests/test_know_thy_father_crossref.py` — Phase 4 cross-reference audit
- `tests/twitter_archive/test_ktf_tracker.py` — Phase 5 processing tracker

Run all with:

```bash
python3 -m pytest tests/test_know_thy_father_pipeline.py tests/test_know_thy_father_index.py tests/test_know_thy_father_synthesis.py tests/test_know_thy_father_crossref.py tests/twitter_archive/test_ktf_tracker.py tests/twitter_archive/test_analyze_media.py -q
```
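The `--status --json` snapshot that the new verification suite asserts against appears, from its checks on `script_exists` and `outputs[i]["path"]`, to be a nested dict keyed by phase id. A hedged sketch with illustrative values only, not real output of `epic_pipeline.py`:

```python
import json

# Hypothetical snapshot shape, inferred from the suite's assertions on
# status[phase]["script_exists"] and status[phase]["outputs"][i]["path"].
# Values are illustrative only.
status = {
    "phase1_media_indexing": {
        "script_exists": True,
        "outputs": [
            {"path": "twitter-archive/know-thy-father/media_manifest.jsonl"},
        ],
    },
}

# Plain dicts/lists/strings/bools keep the snapshot JSON-serialisable.
print(json.dumps(status, indent=2))
```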

## Why Refs #582, not Closes #582

The **repo-side orchestration slice** is fully implemented on `main`. However, the
parent epic itself remains open because:

1. The local Twitter archive has not been fully consumed through all five phases.
2. Downstream memory/fact-store integration is not yet wired end-to-end.
3. The processing log (`PROCESSING_LOG.md`) reflects halted progress that has not resumed.

This PR adds durable verification evidence without overstating closure.

## Historical trail

- Parent-epic PR that landed the orchestration slice: [closed on main]
- This verification document: added by #789, superseded by this PR #790.

## Linked issues

- Refs #582 (parent epic — remains open)
- Closes #789 (verification task — closed by this PR)
tests/test_issue_582_verification.py (new file, +146 lines)
"""Durable verification that the Issue #582 parent-epic orchestration slice exists on main.

These tests confirm:
1. The epic pipeline runner script is present and importable.
2. The pipeline documentation is committed.
3. All five phase scripts exist at their expected paths.
4. The pipeline plan exposes the correct five phases in order.
5. Each plan step references the correct underlying script.
6. The status snapshot reports script_exists=True for all phases.
7. The status snapshot includes expected artifact output paths.
8. The runner can produce a JSON-serialisable plan.
9. The runner can produce a JSON-serialisable status snapshot.
10. The verification document itself is present.

Refs #582. Closes #789.
"""

import importlib.util
import json
import unittest
from pathlib import Path


ROOT = Path(__file__).resolve().parent.parent
EPIC_PIPELINE = ROOT / "scripts" / "know_thy_father" / "epic_pipeline.py"
PIPELINE_DOC = ROOT / "docs" / "KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md"
VERIFICATION_DOC = ROOT / "docs" / "issue-582-verification.md"

EXPECTED_PHASES = [
    "phase1_media_indexing",
    "phase2_multimodal_analysis",
    "phase3_holographic_synthesis",
    "phase4_cross_reference_audit",
    "phase5_processing_log",
]

EXPECTED_SCRIPTS = {
    "phase1_media_indexing": "scripts/know_thy_father/index_media.py",
    "phase2_multimodal_analysis": "scripts/twitter_archive/analyze_media.py",
    "phase3_holographic_synthesis": "scripts/know_thy_father/synthesize_kernels.py",
    "phase4_cross_reference_audit": "scripts/know_thy_father/crossref_audit.py",
    "phase5_processing_log": "twitter-archive/know-thy-father/tracker.py",
}
EXPECTED_OUTPUTS = {
    "phase1_media_indexing": ["twitter-archive/know-thy-father/media_manifest.jsonl"],
    "phase3_holographic_synthesis": ["twitter-archive/knowledge/fathers_ledger.jsonl"],
    "phase5_processing_log": ["twitter-archive/know-thy-father/REPORT.md"],
}


def _load_epic_module():
    spec = importlib.util.spec_from_file_location("ktf_epic_pipeline", EPIC_PIPELINE)
    assert spec and spec.loader, "Cannot load epic_pipeline module spec"
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod


class TestIssue582Verification(unittest.TestCase):
    """10-test suite proving the #582 orchestration slice is on main."""

    # -- existence checks --------------------------------------------------

    def test_01_epic_pipeline_script_exists(self):
        """The orchestration runner is committed."""
        self.assertTrue(EPIC_PIPELINE.exists(), f"missing {EPIC_PIPELINE.relative_to(ROOT)}")

    def test_02_pipeline_documentation_exists(self):
        """The multimodal pipeline doc is committed."""
        self.assertTrue(PIPELINE_DOC.exists(), "missing KNOW_THY_FATHER_MULTIMODAL_PIPELINE.md")

    def test_03_all_phase_scripts_exist_on_disk(self):
        """Every script referenced by the pipeline exists in the repo."""
        for phase_id, script_rel in EXPECTED_SCRIPTS.items():
            path = ROOT / script_rel
            self.assertTrue(path.exists(), f"{phase_id}: missing {script_rel}")

    # -- plan structure ----------------------------------------------------

    def test_04_pipeline_plan_has_five_phases_in_order(self):
        mod = _load_epic_module()
        plan = mod.build_pipeline_plan(batch_size=10)
        ids = [step["id"] for step in plan]
        self.assertEqual(ids, EXPECTED_PHASES)

    def test_05_plan_commands_reference_correct_scripts(self):
        mod = _load_epic_module()
        plan = mod.build_pipeline_plan(batch_size=10)
        for step in plan:
            expected_script = EXPECTED_SCRIPTS[step["id"]]
            self.assertIn(
                expected_script,
                step["command"],
                f"{step['id']} command missing {expected_script}",
            )
    # -- status snapshot ---------------------------------------------------

    def test_06_status_snapshot_all_scripts_exist(self):
        mod = _load_epic_module()
        status = mod.build_status_snapshot(ROOT)
        for phase_id in EXPECTED_PHASES:
            self.assertIn(phase_id, status)
            self.assertTrue(
                status[phase_id]["script_exists"],
                f"{phase_id} script_exists should be True",
            )

    def test_07_status_snapshot_reports_expected_outputs(self):
        mod = _load_epic_module()
        status = mod.build_status_snapshot(ROOT)
        for phase_id, expected_paths in EXPECTED_OUTPUTS.items():
            actual_paths = [o["path"] for o in status[phase_id]["outputs"]]
            for p in expected_paths:
                self.assertIn(p, actual_paths, f"{phase_id} missing output path {p}")

    # -- JSON serialisation ------------------------------------------------

    def test_08_plan_is_json_serialisable(self):
        mod = _load_epic_module()
        plan = mod.build_pipeline_plan(batch_size=10)
        dumped = json.dumps(plan)
        restored = json.loads(dumped)
        self.assertEqual(len(restored), 5)

    def test_09_status_snapshot_is_json_serialisable(self):
        mod = _load_epic_module()
        status = mod.build_status_snapshot(ROOT)
        dumped = json.dumps(status)
        restored = json.loads(dumped)
        for phase_id in EXPECTED_PHASES:
            self.assertIn(phase_id, restored)

    # -- verification doc --------------------------------------------------

    def test_10_verification_document_exists(self):
        """This verification trail is committed."""
        self.assertTrue(
            VERIFICATION_DOC.exists(),
            "missing docs/issue-582-verification.md",
        )


if __name__ == "__main__":
    unittest.main()