Compare commits


1 commit

Author SHA1 Message Date
Alexander Whitestone
6cbb9a98e1 fix: refresh burn lane audit for #662
Some checks failed
Agent PR Gate / gate (pull_request) Failing after 48s
Self-Healing Smoke / self-healing-smoke (pull_request) Failing after 8s
Smoke Test / smoke (pull_request) Failing after 6s
Agent PR Gate / report (pull_request) Has been cancelled
2026-04-16 23:43:24 -04:00
6 changed files with 151 additions and 474 deletions

View File

@@ -1,14 +0,0 @@
fleet_name: timmy-phase-3-config
targets:
  - host: ezra
    config_root: /root/wizards/ezra/home/.hermes
    files:
      - config.yaml
      - dispatch/rules.json
  - host: bezalel
    config_root: /root/wizards/bezalel/home/.hermes
    files:
      - config.yaml
      - dispatch/rules.json

View File

@@ -1,6 +1,6 @@
# Burn Lane Empty Audit — timmy-home #662
Generated: 2026-04-16T01:22:37Z
Generated: 2026-04-17T03:42:50Z
Source issue: `[ops] Burn lane empty — all open issues triaged (2026-04-14)`
## Source Snapshot
@@ -11,9 +11,9 @@ Issue #662 is an operational status note, not a normal feature request. Its body
- Referenced issues audited: 42
- Already closed: 30
- Open but likely closure candidates (merged PR found): 0
- Open with active PRs: 12
- Open / needs manual review: 0
- Open but likely closure candidates (merged PR found): 1
- Open with active PRs: 0
- Open / needs manual review: 11
## Issue Body Drift
@@ -21,56 +21,56 @@ The body of #662 is not current truth. It mixes closed issues, open issues, rang
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #579 | closed | already closed | closed PR #644, closed PR #640, closed PR #635, closed PR #620 |
| #648 | open | active pr | open PR #731 |
| #579 | closed | already closed | closed PR #644, closed PR #643, closed PR #640, closed PR #635, closed PR #620 |
| #648 | open | needs manual review | closed PR #731 |
| #647 | closed | already closed | issue already closed |
| #619 | closed | already closed | issue already closed |
| #616 | closed | already closed | issue already closed |
| #614 | closed | already closed | issue already closed |
| #613 | closed | already closed | issue already closed |
| #660 | closed | already closed | issue already closed |
| #659 | closed | already closed | issue already closed |
| #659 | closed | already closed | closed PR #660 |
| #658 | closed | already closed | issue already closed |
| #657 | closed | already closed | issue already closed |
| #656 | closed | already closed | closed PR #658 |
| #655 | closed | already closed | issue already closed |
| #654 | closed | already closed | closed PR #661 |
| #653 | closed | already closed | issue already closed |
| #652 | closed | already closed | merged PR #657 |
| #651 | closed | already closed | issue already closed |
| #650 | closed | already closed | merged PR #654 |
| #649 | closed | already closed | issue already closed |
| #646 | closed | already closed | issue already closed |
| #582 | open | active pr | open PR #738 |
| #653 | closed | already closed | closed PR #660, closed PR #655 |
| #652 | closed | already closed | closed PR #660, merged PR #657, closed PR #655 |
| #651 | closed | already closed | closed PR #655 |
| #650 | closed | already closed | closed PR #661, closed PR #660, merged PR #654, closed PR #651 |
| #649 | closed | already closed | closed PR #660, merged PR #657, closed PR #651 |
| #646 | closed | already closed | closed PR #655, closed PR #651 |
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
| #627 | closed | already closed | issue already closed |
| #631 | closed | already closed | issue already closed |
| #632 | closed | already closed | issue already closed |
| #634 | closed | already closed | issue already closed |
| #639 | closed | already closed | issue already closed |
| #641 | closed | already closed | issue already closed |
| #575 | closed | already closed | merged PR #656 |
| #576 | closed | already closed | closed PR #663, closed PR #660, closed PR #655, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
| #575 | closed | already closed | closed PR #658, merged PR #656 |
| #576 | closed | already closed | merged PR #664, closed PR #663, closed PR #660, closed PR #655, merged PR #654, closed PR #651, closed PR #646, closed PR #642, closed PR #633 |
| #578 | closed | already closed | merged PR #638, closed PR #636 |
| #636 | closed | already closed | issue already closed |
| #638 | closed | already closed | issue already closed |
| #547 | open | active pr | open PR #730 |
| #548 | open | active pr | open PR #712 |
| #549 | open | active pr | open PR #729 |
| #550 | open | active pr | open PR #727 |
| #551 | open | active pr | open PR #725 |
| #552 | open | active pr | open PR #724 |
| #553 | open | active pr | open PR #722 |
| #562 | open | active pr | open PR #718 |
| #544 | open | active pr | open PR #732 |
| #545 | open | active pr | open PR #719 |
| #547 | open | needs manual review | closed PR #730 |
| #548 | open | needs manual review | closed PR #712 |
| #549 | open | needs manual review | closed PR #729 |
| #550 | open | needs manual review | closed PR #727 |
| #551 | open | needs manual review | closed PR #725 |
| #552 | open | needs manual review | closed PR #724 |
| #553 | open | needs manual review | closed PR #722 |
| #562 | open | needs manual review | closed PR #718 |
| #544 | open | needs manual review | closed PR #732 |
| #545 | open | needs manual review | closed PR #719 |
## Closure Candidates
These issues are still open but already have merged PR evidence in the forge and should be reviewed for bulk closure.
| None |
|---|
| None |
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #582 | open | closure candidate | merged PR #641, merged PR #639, merged PR #637, merged PR #631, merged PR #630 |
## Still Open / Needs Manual Review
@@ -78,18 +78,17 @@ These issues either have no matching PR signal or still have an active PR / ambi
| Issue | State | Classification | PR Summary |
|---|---|---|---|
| #648 | open | active pr | open PR #731 |
| #582 | open | active pr | open PR #738 |
| #547 | open | active pr | open PR #730 |
| #548 | open | active pr | open PR #712 |
| #549 | open | active pr | open PR #729 |
| #550 | open | active pr | open PR #727 |
| #551 | open | active pr | open PR #725 |
| #552 | open | active pr | open PR #724 |
| #553 | open | active pr | open PR #722 |
| #562 | open | active pr | open PR #718 |
| #544 | open | active pr | open PR #732 |
| #545 | open | active pr | open PR #719 |
| #648 | open | needs manual review | closed PR #731 |
| #547 | open | needs manual review | closed PR #730 |
| #548 | open | needs manual review | closed PR #712 |
| #549 | open | needs manual review | closed PR #729 |
| #550 | open | needs manual review | closed PR #727 |
| #551 | open | needs manual review | closed PR #725 |
| #552 | open | needs manual review | closed PR #724 |
| #553 | open | needs manual review | closed PR #722 |
| #562 | open | needs manual review | closed PR #718 |
| #544 | open | needs manual review | closed PR #732 |
| #545 | open | needs manual review | closed PR #719 |
## Recommendation

View File

@@ -23,6 +23,7 @@ class PullSummary:
    state: str
    merged: bool
    head: str
    body: str
    url: str
@@ -75,7 +76,8 @@ def api_get(path: str, token: str):
def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
    pulls: list[PullSummary] = []
    for state in ("open", "closed"):
        for page in range(1, 6):
        page = 1
        while True:
            batch = api_get(f"/repos/{ORG}/{repo}/pulls?state={state}&limit=100&page={page}", token)
            if not batch:
                break
@@ -87,18 +89,18 @@ def collect_pull_summaries(repo: str, token: str) -> list[PullSummary]:
                        state=pr.get("state") or state,
                        merged=bool(pr.get("merged")),
                        head=(pr.get("head") or {}).get("ref") or "",
                        body=pr.get("body") or "",
                        url=pr.get("html_url") or pr.get("url") or "",
                    )
                )
            if len(batch) < 100:
                break
            page += 1
    return pulls
def match_prs(issue_num: int, pulls: Iterable[PullSummary]) -> list[PullSummary]:
    matches: list[PullSummary] = []
    for pr in pulls:
        text = f"{pr.title} {pr.head}"
        text = f"{pr.title} {pr.head} {pr.body}"
        if f"#{issue_num}" in text or pr.head == f"fix/{issue_num}" or f"/{issue_num}" in pr.head or f"-{issue_num}" in pr.head:
            matches.append(pr)
    return matches
@@ -116,12 +118,16 @@ def classify_issue(issue: dict, related_prs: list[PullSummary]) -> IssueAuditRow
    else:
        merged = [pr for pr in related_prs if pr.merged]
        open_prs = [pr for pr in related_prs if pr.state == "open"]
        closed_unmerged = [pr for pr in related_prs if pr.state != "open" and not pr.merged]
        if merged:
            classification = "closure_candidate"
            pr_summary = summarize_prs(merged)
        elif open_prs:
            classification = "active_pr"
            pr_summary = summarize_prs(open_prs)
        elif closed_unmerged:
            classification = "needs_manual_review"
            pr_summary = summarize_prs(closed_unmerged)
        else:
            classification = "needs_manual_review"
            pr_summary = "no matching PR found"
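A minimal standalone sketch of the flow these hunks describe, not the repository's module itself: page through PR listings until the forge returns an empty batch, match issue references against title, head branch, and body, and classify merged, open, and closed-unmerged PR evidence the way the refreshed audit does. The PR dataclass, fetch_page stub, and sample data below are illustrative assumptions, not part of the diff.

# Illustrative sketch only; mirrors the paging and classification rules above
# with a local dataclass and an in-memory fake instead of the real forge API.
from dataclasses import dataclass


@dataclass
class PR:
    number: int
    state: str   # "open" or "closed"
    merged: bool
    text: str    # title + head branch + body, the fields match_prs scans above


def fetch_page(state: str, page: int) -> list[PR]:
    # Hypothetical stand-in for api_get(); the real call pages /repos/{org}/{repo}/pulls.
    sample = {("closed", 1): [PR(731, "closed", False, "docs: session harvest report fix/648 Closes #648")]}
    return sample.get((state, page), [])


def collect(state: str) -> list[PR]:
    # Page until the forge returns an empty batch, instead of stopping at a fixed five pages.
    pulls: list[PR] = []
    page = 1
    while True:
        batch = fetch_page(state, page)
        if not batch:
            break
        pulls.extend(batch)
        if len(batch) < 100:
            break
        page += 1
    return pulls


def classify(related: list[PR]) -> str:
    # Merged evidence makes a closure candidate, an open PR keeps the issue active,
    # and a closed unmerged PR (or no PR at all) now lands in manual review.
    if any(pr.merged for pr in related):
        return "closure_candidate"
    if any(pr.state == "open" for pr in related):
        return "active_pr"
    return "needs_manual_review"


if __name__ == "__main__":
    related = [pr for pr in collect("closed") if "#648" in pr.text]
    print(classify(related))  # needs_manual_review, as the refreshed report shows for #648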

View File

@@ -1,292 +0,0 @@
#!/usr/bin/env python3
"""Plan atomic fleet config sync releases.

Refs: timmy-home #550

Phase-3 orchestration slice:
- define a shared config-sync manifest for fleet hosts
- fingerprint the exact config payload into one release id
- generate per-host staging paths and atomic symlink-swap promotion metadata
- stay dry-run by default so rollout planning is safe to verify locally
"""

from __future__ import annotations

import argparse
import hashlib
import json
from pathlib import Path
from typing import Any

import yaml

DEFAULT_INVENTORY_FILE = Path(__file__).resolve().parents[1] / "ansible" / "inventory" / "hosts.ini"


def load_inventory_hosts(path: str | Path) -> dict[str, dict[str, str]]:
    hosts: dict[str, dict[str, str]] = {}
    section = None
    for raw_line in Path(path).read_text(encoding="utf-8").splitlines():
        line = raw_line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1].strip().lower()
            continue
        if section != "fleet":
            continue
        parts = line.split()
        host = parts[0]
        metadata = {"host": host}
        for token in parts[1:]:
            if "=" not in token:
                continue
            key, value = token.split("=", 1)
            metadata[key] = value
        hosts[host] = metadata
    if not hosts:
        raise ValueError("inventory defines no [fleet] hosts")
    return hosts


def load_manifest(path: str | Path) -> dict[str, Any]:
    data = yaml.safe_load(Path(path).read_text(encoding="utf-8")) or {}
    if not isinstance(data, dict):
        raise ValueError("manifest must contain a YAML object")
    data.setdefault("fleet_name", "timmy-fleet-config")
    data.setdefault("targets", [])
    if not isinstance(data["targets"], list):
        raise ValueError("targets must be a list")
    return data


def _normalize_relative_path(value: str) -> str:
    path = Path(value)
    if path.is_absolute():
        raise ValueError(f"sync file path must be relative: {value}")
    if any(part == ".." for part in path.parts):
        raise ValueError(f"sync file path may not escape source root: {value}")
    normalized = path.as_posix()
    if normalized in {"", "."}:
        raise ValueError("sync file path may not be empty")
    return normalized


def validate_manifest(manifest: dict[str, Any], inventory_hosts: dict[str, dict[str, str]]) -> None:
    targets = manifest.get("targets", [])
    if not targets:
        raise ValueError("manifest must define at least one sync target")
    seen_hosts: set[str] = set()
    for target in targets:
        if not isinstance(target, dict):
            raise ValueError("each target must be a mapping")
        host = str(target.get("host", "")).strip()
        if not host:
            raise ValueError("each target must declare a host")
        if host in seen_hosts:
            raise ValueError(f"duplicate target host: {host}")
        if host not in inventory_hosts:
            raise ValueError(f"unknown inventory host: {host}")
        seen_hosts.add(host)
        config_root = str(target.get("config_root", "")).strip()
        if not config_root:
            raise ValueError(f"target {host} missing config_root")
        files = target.get("files")
        if not isinstance(files, list) or not files:
            raise ValueError(f"target {host} must declare at least one file")
        normalized: list[str] = []
        for entry in files:
            normalized.append(_normalize_relative_path(str(entry)))
        if len(set(normalized)) != len(normalized):
            raise ValueError(f"target {host} declares duplicate file paths")


def _hash_file(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()


def _collect_target_files(source_root: Path, rel_paths: list[str]) -> list[dict[str, Any]]:
    items: list[dict[str, Any]] = []
    for rel_path in sorted(_normalize_relative_path(path) for path in rel_paths):
        source_path = source_root / rel_path
        if not source_path.exists():
            raise FileNotFoundError(f"missing source file: {rel_path}")
        if not source_path.is_file():
            raise ValueError(f"sync source must be a file: {rel_path}")
        items.append(
            {
                "relative_path": rel_path,
                "source": str(source_path),
                "sha256": _hash_file(source_path),
                "size": source_path.stat().st_size,
            }
        )
    return items


def compute_release_id(target_payloads: list[dict[str, Any]]) -> str:
    digest = hashlib.sha256()
    for target in sorted(target_payloads, key=lambda item: item["host"]):
        digest.update(target["host"].encode("utf-8"))
        digest.update(b"\0")
        for file_item in sorted(target["files"], key=lambda item: item["relative_path"]):
            digest.update(file_item["relative_path"].encode("utf-8"))
            digest.update(b"\0")
            digest.update(file_item["sha256"].encode("utf-8"))
            digest.update(b"\0")
            digest.update(str(file_item["size"]).encode("utf-8"))
            digest.update(b"\0")
    return digest.hexdigest()[:12]


def build_rollout_plan(
    manifest: dict[str, Any],
    inventory_hosts: dict[str, dict[str, str]],
    *,
    source_root: str | Path,
) -> dict[str, Any]:
    validate_manifest(manifest, inventory_hosts)
    source_root = Path(source_root)
    if not source_root.exists():
        raise FileNotFoundError(f"source root not found: {source_root}")
    if not source_root.is_dir():
        raise ValueError(f"source root must be a directory: {source_root}")
    staged_targets: list[dict[str, Any]] = []
    for target in sorted(manifest["targets"], key=lambda item: item["host"]):
        host = target["host"]
        files = _collect_target_files(source_root, target["files"])
        staged_targets.append(
            {
                "host": host,
                "inventory": inventory_hosts[host],
                "config_root": str(target["config_root"]),
                "files": files,
            }
        )
    release_id = compute_release_id(staged_targets)
    total_bytes = 0
    file_count = 0
    rendered_targets: list[dict[str, Any]] = []
    for target in staged_targets:
        config_root = target["config_root"].rstrip("/")
        stage_root = f"{config_root}/.releases/{release_id}"
        live_symlink = f"{config_root}/current"
        previous_symlink = f"{config_root}/previous"
        file_count += len(target["files"])
        total_bytes += sum(item["size"] for item in target["files"])
        rendered_targets.append(
            {
                "host": target["host"],
                "ansible_host": target["inventory"].get("ansible_host", ""),
                "ansible_user": target["inventory"].get("ansible_user", ""),
                "config_root": config_root,
                "stage_root": stage_root,
                "live_symlink": live_symlink,
                "previous_symlink": previous_symlink,
                "files": target["files"],
                "promote": {
                    "mode": "symlink_swap",
                    "release_id": release_id,
                    "from": stage_root,
                    "to": live_symlink,
                    "backup_link": previous_symlink,
                },
            }
        )
    return {
        "fleet_name": manifest.get("fleet_name", "timmy-fleet-config"),
        "source_root": str(source_root),
        "release_id": release_id,
        "target_count": len(rendered_targets),
        "file_count": file_count,
        "total_bytes": total_bytes,
        "targets": rendered_targets,
    }


def render_markdown(plan: dict[str, Any]) -> str:
    lines = [
        "# Fleet Config Sync Plan",
        "",
        f"Fleet: {plan['fleet_name']}",
        f"Release ID: `{plan['release_id']}`",
        f"Source root: `{plan['source_root']}`",
        f"Target count: {plan['target_count']}",
        f"File count: {plan['file_count']}",
        f"Total bytes: {plan['total_bytes']}",
        "",
        "Atomic promote via symlink swap keeps every host on one named release boundary.",
        "",
        "| Host | Address | Stage root | Live symlink | Files |",
        "|---|---|---|---|---:|",
    ]
    for target in plan["targets"]:
        lines.append(
            f"| {target['host']} | {target['ansible_host'] or 'n/a'} | `{target['stage_root']}` | `{target['live_symlink']}` | {len(target['files'])} |"
        )
    lines.extend(["", "## Target file manifests", ""])
    for target in plan["targets"]:
        lines.extend(
            [
                f"### {target['host']}",
                "",
                f"- Promote: `{target['promote']['from']}` -> `{target['promote']['to']}`",
                f"- Backup link: `{target['promote']['backup_link']}`",
                "",
                "| Relative path | Bytes | SHA256 |",
                "|---|---:|---|",
            ]
        )
        for file_item in target["files"]:
            lines.append(
                f"| `{file_item['relative_path']}` | {file_item['size']} | `{file_item['sha256'][:16]}…` |"
            )
        lines.append("")
    return "\n".join(lines).rstrip() + "\n"


def main() -> int:
    parser = argparse.ArgumentParser(description="Plan a dry-run atomic config sync release across fleet hosts")
    parser.add_argument("manifest", help="Path to fleet config sync manifest YAML")
    parser.add_argument("--inventory", default=str(DEFAULT_INVENTORY_FILE), help="Path to Ansible fleet inventory")
    parser.add_argument("--source-root", default=".", help="Local source root containing files listed in the manifest")
    parser.add_argument("--markdown", action="store_true", help="Render markdown instead of JSON")
    args = parser.parse_args()
    inventory = load_inventory_hosts(args.inventory)
    manifest = load_manifest(args.manifest)
    plan = build_rollout_plan(manifest, inventory, source_root=args.source_root)
    if args.markdown:
        print(render_markdown(plan))
    else:
        print(json.dumps(plan, indent=2))
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
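For reference, the deleted planner above was driven roughly as sketched below. This is an illustrative usage sketch only, assuming the script still lived at scripts/fleet_config_sync.py, that scripts/ is importable the same way the burn-lane audit tests import their module, and that the docs/fleet-config-sync.example.yaml manifest and ansible/inventory/hosts.ini inventory referenced by the deleted tests further down are present; the equivalent CLI call was python scripts/fleet_config_sync.py docs/fleet-config-sync.example.yaml --markdown.

# Illustrative only: drives the removed planner end to end as a dry run.
from pathlib import Path

from scripts.fleet_config_sync import (  # assumes scripts/ is importable as a package
    build_rollout_plan,
    load_inventory_hosts,
    load_manifest,
    render_markdown,
)

inventory = load_inventory_hosts(Path("ansible/inventory/hosts.ini"))
manifest = load_manifest(Path("docs/fleet-config-sync.example.yaml"))

# Fingerprints the listed files into one release id and renders the per-host
# symlink-swap promotion plan without touching any host.
plan = build_rollout_plan(manifest, inventory, source_root=".")
print(render_markdown(plan))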

View File

@@ -1,6 +1,13 @@
from pathlib import Path
from scripts.burn_lane_issue_audit import extract_issue_numbers, render_report
from scripts.burn_lane_issue_audit import (
    PullSummary,
    classify_issue,
    collect_pull_summaries,
    extract_issue_numbers,
    match_prs,
    render_report,
)
def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
@@ -14,6 +21,99 @@ def test_extract_issue_numbers_handles_ranges_and_literals() -> None:
    assert extract_issue_numbers(body) == [579, 660, 659, 658, 582, 627, 631, 547, 546, 545]
def test_match_prs_detects_issue_ref_in_pr_body() -> None:
    pulls = [
        PullSummary(
            number=731,
            title="docs: verify session harvest report",
            state="open",
            merged=False,
            head="fix/session-harvest-report",
            body="Refs #648",
            url="https://forge.example/pr/731",
        ),
        PullSummary(
            number=732,
            title="unrelated",
            state="open",
            merged=False,
            head="fix/unrelated",
            body="Refs #700",
            url="https://forge.example/pr/732",
        ),
    ]
    assert [pr.number for pr in match_prs(648, pulls)] == [731]


def test_open_issue_with_closed_unmerged_pr_stays_manual_review_with_history() -> None:
    issue = {
        "number": 648,
        "title": "session harvest report",
        "state": "open",
        "html_url": "https://forge.example/issues/648",
    }
    row = classify_issue(
        issue,
        [
            PullSummary(
                number=731,
                title="docs: add session harvest report",
                state="closed",
                merged=False,
                head="fix/648",
                body="Closes #648",
                url="https://forge.example/pr/731",
            )
        ],
    )
    assert row.classification == "needs_manual_review"
    assert row.pr_summary == "closed PR #731"


def test_collect_pull_summaries_pages_until_empty(monkeypatch) -> None:
    def fake_api_get(path: str, token: str):
        if "state=open" in path:
            return []
        page = int(path.split("page=")[1])
        if page <= 5:
            return [
                {
                    "number": page * 1000 + i,
                    "title": f"page {page} pr {i}",
                    "state": "closed",
                    "merged": False,
                    "head": {"ref": f"fix/{page}-{i}"},
                    "body": f"Refs #{page * 1000 + i}",
                    "html_url": f"https://forge.example/pr/{page * 1000 + i}",
                }
                for i in range(100)
            ]
        if page == 6:
            return [
                {
                    "number": 900,
                    "title": "late page pr",
                    "state": "closed",
                    "merged": False,
                    "head": {"ref": "fix/900"},
                    "body": "Refs #900",
                    "html_url": "https://forge.example/pr/900",
                }
            ]
        return []

    monkeypatch.setattr("scripts.burn_lane_issue_audit.api_get", fake_api_get)
    pulls = collect_pull_summaries("timmy-home", "token")
    assert any(pr.number == 900 for pr in pulls)


def test_render_report_calls_out_drift_and_candidates() -> None:
    rows = [
        {

View File

@@ -1,122 +0,0 @@
from __future__ import annotations

import importlib.util
from pathlib import Path

ROOT = Path(__file__).resolve().parents[1]
SCRIPT_PATH = ROOT / "scripts" / "fleet_config_sync.py"
HOSTS_FILE = ROOT / "ansible" / "inventory" / "hosts.ini"
EXAMPLE_MANIFEST = ROOT / "docs" / "fleet-config-sync.example.yaml"


def _load_module(path: Path, name: str):
    assert path.exists(), f"missing {path.relative_to(ROOT)}"
    spec = importlib.util.spec_from_file_location(name, path)
    assert spec and spec.loader
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


def _write_source_tree(tmp_path: Path) -> Path:
    source_root = tmp_path / "source"
    (source_root / "dispatch").mkdir(parents=True)
    (source_root / "config.yaml").write_text("model: local\nroute: hybrid\n", encoding="utf-8")
    (source_root / "dispatch" / "rules.json").write_text('{"lane":"allegro"}\n', encoding="utf-8")
    return source_root


def test_example_manifest_targets_known_fleet_hosts() -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    assert EXAMPLE_MANIFEST.exists(), "missing docs/fleet-config-sync.example.yaml"
    inventory = mod.load_inventory_hosts(HOSTS_FILE)
    manifest = mod.load_manifest(EXAMPLE_MANIFEST)
    mod.validate_manifest(manifest, inventory)
    assert [target["host"] for target in manifest["targets"]] == ["ezra", "bezalel"]


def test_build_rollout_plan_stages_one_release_for_all_hosts(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    inventory = {
        "ezra": {"host": "ezra", "ansible_host": "143.198.27.163"},
        "bezalel": {"host": "bezalel", "ansible_host": "67.205.155.108"},
    }
    manifest = {
        "fleet_name": "phase-3-config-sync",
        "targets": [
            {
                "host": "ezra",
                "config_root": "/root/wizards/ezra/home/.hermes",
                "files": ["config.yaml", "dispatch/rules.json"],
            },
            {
                "host": "bezalel",
                "config_root": "/root/wizards/bezalel/home/.hermes",
                "files": ["config.yaml", "dispatch/rules.json"],
            },
        ],
    }
    plan = mod.build_rollout_plan(manifest, inventory, source_root=source_root)
    assert plan["fleet_name"] == "phase-3-config-sync"
    assert len(plan["release_id"]) == 12
    assert plan["target_count"] == 2
    assert plan["file_count"] == 4
    assert plan["total_bytes"] > 0
    for target in plan["targets"]:
        assert target["stage_root"].endswith(f"/.releases/{plan['release_id']}")
        assert target["live_symlink"].endswith("/current")
        assert target["promote"]["release_id"] == plan["release_id"]
        assert {item["relative_path"] for item in target["files"]} == {"config.yaml", "dispatch/rules.json"}


def test_validate_manifest_rejects_unknown_inventory_host(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    manifest = {
        "targets": [
            {
                "host": "unknown-wizard",
                "config_root": "/srv/wizard/config",
                "files": ["config.yaml"],
            }
        ]
    }
    try:
        mod.build_rollout_plan(manifest, {"ezra": {"host": "ezra"}}, source_root=source_root)
    except ValueError as exc:
        assert "unknown inventory host" in str(exc)
        assert "unknown-wizard" in str(exc)
    else:
        raise AssertionError("build_rollout_plan should reject unknown inventory hosts")


def test_render_markdown_mentions_atomic_promote_and_targets(tmp_path: Path) -> None:
    mod = _load_module(SCRIPT_PATH, "fleet_config_sync")
    source_root = _write_source_tree(tmp_path)
    manifest = {
        "fleet_name": "phase-3-config-sync",
        "targets": [
            {
                "host": "ezra",
                "config_root": "/root/wizards/ezra/home/.hermes",
                "files": ["config.yaml"],
            }
        ],
    }
    inventory = {"ezra": {"host": "ezra", "ansible_host": "143.198.27.163"}}
    plan = mod.build_rollout_plan(manifest, inventory, source_root=source_root)
    report = mod.render_markdown(plan)
    assert plan["release_id"] in report
    assert "Atomic promote via symlink swap" in report
    assert "ezra" in report
    assert "/root/wizards/ezra/home/.hermes/current" in report