Compare commits
4 Commits
| Author | SHA1 | Date |
|---|---|---|
| | 212a5befba | |
| | 1eaf6df7d0 | |
| | 735fafa235 | |
| | e53d54f873 | |

configs/burn_velocity_repos.json (18 lines, new file)
@@ -0,0 +1,18 @@
{
  "owner": "Timmy_Foundation",
  "repos": [
    "timmy-home",
    "timmy-config",
    "fleet-ops",
    "the-beacon",
    "the-door",
    "the-nexus"
  ],
  "lookback_days": 14,
  "alert": {
    "recent_days": 7,
    "baseline_days": 7,
    "minimum_baseline_closed": 4,
    "drop_ratio": 0.6
  }
}
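
To make the thresholds concrete, here is a minimal sketch (editorial, not part of the change) of how the alert block above turns into a flag; the issue counts are invented for illustration, the thresholds come from the JSON:

```python
# Minimal sketch of the alert rule carried by this config; the counts below
# are hypothetical, the thresholds are read from the JSON above.
import json
from pathlib import Path

alert = json.loads(Path("configs/burn_velocity_repos.json").read_text())["alert"]

baseline_closed = 10  # hypothetical closes in the prior 7-day window
recent_closed = 5     # hypothetical closes in the last 7 days

eligible = baseline_closed >= alert["minimum_baseline_closed"]   # 10 >= 4
dropped = recent_closed < baseline_closed * alert["drop_ratio"]  # 5 < 6.0
print(eligible and dropped)  # True -> this repo would be flagged velocity_drop
```
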
docs/BURN_VELOCITY_TRACKING.md (70 lines, new file)
@@ -0,0 +1,70 @@
# Burn-down Velocity Tracking

Refs #519.

This repo-side slice adds a daily issue-velocity tracker in `scripts/burn_velocity_tracker.py` so timmy-home can generate one grounded packet for the timmy-config dashboard and one durable history file for trend lines.

## What it emits

Daily run outputs:
- `~/.timmy/burn-velocity/latest.json` — machine-readable payload for the timmy-config dashboard
- `~/.timmy/burn-velocity/latest.md` — operator-facing markdown summary
- `~/.timmy/burn-velocity/history.json` — per-day history for trend charts and alert review

Tracked repos live in `configs/burn_velocity_repos.json`.

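For orientation, an abridged sketch of the `latest.json` shape; the field names follow the tracker's report builder, the values are illustrative:

```json
{
  "owner": "Timmy_Foundation",
  "generated_day": "2026-04-22",
  "lookback_days": 14,
  "repos": [
    {
      "repo": "timmy-home",
      "open_now": 2,
      "opened_last_7d": 5,
      "closed_last_7d": 3,
      "baseline_closed": 7,
      "weekly_net": 2,
      "alert": {"status": "drop", "kind": "velocity_drop", "reason": "velocity_drop: ..."}
    }
  ],
  "summary": {"total_open_now": 3, "repos_with_alerts": ["timmy-home"]}
}
```
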
## Cron command

```bash
cd ~/timmy-home && \
python3 scripts/burn_velocity_tracker.py \
  --config configs/burn_velocity_repos.json \
  --output-json ~/.timmy/burn-velocity/latest.json \
  --output-md ~/.timmy/burn-velocity/latest.md \
  --history-file ~/.timmy/burn-velocity/history.json \
  --write-history
```

Example crontab entry:

```cron
0 6 * * * cd ~/timmy-home && python3 scripts/burn_velocity_tracker.py --config configs/burn_velocity_repos.json --output-json ~/.timmy/burn-velocity/latest.json --output-md ~/.timmy/burn-velocity/latest.md --history-file ~/.timmy/burn-velocity/history.json --write-history
```

## Dashboard handoff

The timmy-config dashboard should read `~/.timmy/burn-velocity/latest.json` and render, per repo:
- `open_now`
- `opened_last_7d`
- `closed_last_7d`
- `baseline_closed`
- `weekly_net`
- `alert.status`
- `alert.kind`
- `alert.reason`

Alert rows should highlight `velocity_drop` so operators can see when the recent 7-day close count drops under the configured baseline threshold. A minimal consumer sketch follows.

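The sketch below assumes only the contract above; the dashboard's real rendering is out of scope here, and the reader shown is hypothetical:

```python
# Hypothetical timmy-config reader; path and field names follow the contract.
import json
from pathlib import Path

payload = json.loads((Path.home() / ".timmy" / "burn-velocity" / "latest.json").read_text())
for repo in payload["repos"]:
    alert = repo["alert"]
    flag = alert["kind"] if alert["status"] != "ok" else "ok"
    print(
        f"{repo['repo']}: open={repo['open_now']} opened_7d={repo['opened_last_7d']} "
        f"closed_7d={repo['closed_last_7d']} baseline={repo['baseline_closed']} "
        f"net={repo['weekly_net']} alert={flag}"
    )
```
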
## Alert policy

Alert settings are carried in `configs/burn_velocity_repos.json`:
- `recent_days`
- `baseline_days`
- `minimum_baseline_closed`
- `drop_ratio`

Current default: flag `velocity_drop` when closes in the last 7 days fall below 60% of closes in the prior 7 days, provided the baseline window had at least 4 closed issues. Worked through below.

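A worked example under those defaults, with hypothetical counts:

```python
# Hypothetical fortnight: 8 closes in the baseline week, 4 in the recent week.
baseline_closed = 8
recent_closed = 4

eligible = baseline_closed >= 4                  # baseline is meaningful
dropped = recent_closed < baseline_closed * 0.6  # 4 < 4.8
assert eligible and dropped                      # flagged velocity_drop
```
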
## Gitea API contract

The tracker intentionally queries the Gitea issues API with `type=issues` so pull requests do not contaminate repo burn-down counts.

Live collection shape:
- open backlog uses `/repos/{owner}/{repo}/issues?state=open&type=issues`
- recent event scan uses `/repos/{owner}/{repo}/issues?state=all&type=issues&since=...`

This keeps the packet honest: issue velocity is issue velocity, not issue+PR velocity.

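A sketch of the two collection URLs, assuming the forge base URL hardcoded in the tracker; the `since` value is illustrative:

```python
# Builds the two query URLs described above; the since timestamp is made up.
from urllib import parse

BASE = "https://forge.alexanderwhitestone.com/api/v1"

def issues_url(owner: str, repo: str, **query: str) -> str:
    # type=issues keeps pull requests out of the counts.
    return f"{BASE}/repos/{owner}/{repo}/issues?" + parse.urlencode({"type": "issues", **query})

open_backlog = issues_url("Timmy_Foundation", "timmy-home", state="open")
recent_scan = issues_url("Timmy_Foundation", "timmy-home", state="all", since="2026-04-09T00:00:00Z")
```
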
## Honest scope boundary

This timmy-home slice does not implement the actual timmy-config dashboard UI. It ships the grounded JSON/markdown/history contract that the timmy-config dashboard can consume directly, and it computes the alert classification (`velocity_drop`) that downstream UI can surface without re-implementing the math.

scripts/README_bezalel_gemma4_vps.md (deleted)
@@ -1,51 +0,0 @@
# Bezalel Gemma 4 VPS Wiring

Issue: timmy-home #544

This helper is the repo-side operator bundle for wiring a live Gemma 4 endpoint into Bezalel's VPS config without hardcoding one dead pod forever.

What `scripts/bezalel_gemma4_vps.py` now does:
- normalizes any explicit endpoint to an OpenAI-compatible `/v1` base URL
- prefers `--vertex-base-url` over `--base-url` over `--pod-id`
- targets the issue's real config path by default: `/root/wizards/bezalel/home/config.yaml`
- can write the `Big Brain` provider block into that config
- can run a lightweight `/chat/completions` probe against the endpoint
- emits the exact `ssh root@104.131.15.18 ... curl ...` command needed to prove the endpoint is reachable from the Bezalel VPS

Example dry-run:

```bash
python3 scripts/bezalel_gemma4_vps.py \
  --base-url https://<pod-id>-11434.proxy.runpod.net \
  --json
```

Example live wiring once a real endpoint exists:

```bash
python3 scripts/bezalel_gemma4_vps.py \
  --base-url https://<pod-id>-11434.proxy.runpod.net \
  --config-path /root/wizards/bezalel/home/config.yaml \
  --write-config \
  --verify-chat
```

If Vertex AI is fronted by an OpenAI-compatible bridge, prefer that explicit URL:

```bash
python3 scripts/bezalel_gemma4_vps.py \
  --vertex-base-url https://<bridge-host>/v1 \
  --json
```

What this repo change proves:
- Bezalel's config target is explicit and correct for the VPS lane
- the helper no longer silently writes to the local operator's home directory
- endpoint normalization is deterministic
- the remote proof command is generated from the same normalized URL the config writer uses

What still requires live infrastructure outside the repo:
- a valid paid RunPod or Vertex credential
- a real GPU endpoint serving Gemma 4
- successful execution of the emitted SSH proof command on `104.131.15.18`
- successful Bezalel Hermes chat against that live endpoint

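The precedence bullet above is easiest to read as expected resolutions; a minimal sketch against the helper's `resolve_base_url` as described here, with placeholder hosts:

```python
# Placeholder hosts; resolve_base_url returns (normalized_url, source_label).
from scripts.bezalel_gemma4_vps import resolve_base_url

# --vertex-base-url wins over --base-url and --pod-id, and is normalized to /v1.
assert resolve_base_url(
    vertex_base_url="https://bridge.example.com/openai",
    base_url="https://plain.example.com",
    pod_id="abc123",
) == ("https://bridge.example.com/openai/v1", "vertex_base_url")

# A bare --pod-id expands to the RunPod proxy URL.
assert resolve_base_url(pod_id="abc123") == (
    "https://abc123-11434.proxy.runpod.net/v1",
    "pod_id",
)
```
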
scripts/bezalel_gemma4_vps.py (modified)
@@ -8,14 +8,12 @@ Safe by default:
 - can call the RunPod GraphQL API if a key is provided and --apply-runpod is used
 - can update a Hermes config file in-place when --write-config is used
 - can verify an OpenAI-compatible endpoint with a lightweight chat probe
-- emits the exact Bezalel VPS curl proof command for remote verification
 """

 from __future__ import annotations

 import argparse
 import json
-import shlex
 from pathlib import Path
 from typing import Any
 from urllib import request
@@ -29,9 +27,7 @@ DEFAULT_IMAGE = "ollama/ollama:latest"
 DEFAULT_MODEL = "gemma4:latest"
 DEFAULT_PROVIDER_NAME = "Big Brain"
 DEFAULT_TOKEN_FILE = Path.home() / ".config" / "runpod" / "access_key"
-DEFAULT_CONFIG_PATH = Path("/root/wizards/bezalel/home/config.yaml")
-DEFAULT_BEZALEL_VPS_HOST = "104.131.15.18"
-DEFAULT_VERIFY_PROMPT = "Say READY"
+DEFAULT_CONFIG_PATH = Path.home() / "wizards" / "bezalel" / "home" / "config.yaml"


 def build_deploy_mutation(
@@ -67,31 +63,8 @@ mutation {{
 '''.strip()


-def normalize_openai_base_url(base_url: str) -> str:
-    normalized = (base_url or "").strip().rstrip("/")
-    if not normalized:
-        return normalized
-    for suffix in ("/chat/completions", "/models"):
-        if normalized.endswith(suffix):
-            normalized = normalized[: -len(suffix)]
-            break
-    if not normalized.endswith("/v1"):
-        normalized = f"{normalized}/v1"
-    return normalized
-
-
 def build_runpod_endpoint(pod_id: str, port: int = 11434) -> str:
-    return normalize_openai_base_url(f"https://{pod_id}-{port}.proxy.runpod.net")
-
-
-def resolve_base_url(*, vertex_base_url: str | None = None, base_url: str | None = None, pod_id: str | None = None) -> tuple[str | None, str | None]:
-    if vertex_base_url:
-        return normalize_openai_base_url(vertex_base_url), "vertex_base_url"
-    if base_url:
-        return normalize_openai_base_url(base_url), "base_url"
-    if pod_id:
-        return build_runpod_endpoint(pod_id), "pod_id"
-    return None, None
+    return f"https://{pod_id}-{port}.proxy.runpod.net/v1"


 def parse_deploy_response(payload: dict[str, Any]) -> dict[str, str]:
@@ -129,7 +102,7 @@ def update_config_text(config_text: str, *, base_url: str, model: str = DEFAULT_

     replacement = {
         "name": provider_name,
-        "base_url": normalize_openai_base_url(base_url),
+        "base_url": base_url,
         "api_key": "",
         "model": model,
     }
@@ -156,8 +129,7 @@ def write_config_file(config_path: Path, *, base_url: str, model: str = DEFAULT_
     return updated


-def verify_openai_chat(base_url: str, *, model: str = DEFAULT_MODEL, prompt: str = DEFAULT_VERIFY_PROMPT) -> str:
-    base_url = normalize_openai_base_url(base_url)
+def verify_openai_chat(base_url: str, *, model: str = DEFAULT_MODEL, prompt: str = "Say READY") -> str:
     payload = json.dumps(
         {
             "model": model,
@@ -167,7 +139,7 @@ def verify_openai_chat(base_url: str, *, model: str = DEFAULT_MODEL, prompt: str
         }
     ).encode()
     req = request.Request(
-        f"{base_url}/chat/completions",
+        f"{base_url.rstrip('/')}/chat/completions",
         data=payload,
         headers={"Content-Type": "application/json"},
         method="POST",
@@ -177,30 +149,6 @@
     return data["choices"][0]["message"]["content"]


-def build_vps_verify_command(
-    *,
-    base_url: str,
-    model: str = DEFAULT_MODEL,
-    prompt: str = DEFAULT_VERIFY_PROMPT,
-    vps_host: str = DEFAULT_BEZALEL_VPS_HOST,
-) -> str:
-    payload = json.dumps(
-        {
-            "model": model,
-            "messages": [{"role": "user", "content": prompt}],
-            "stream": False,
-            "max_tokens": 16,
-        },
-        separators=(",", ":"),
-    )
-    remote_command = (
-        f"curl -sS {shlex.quote(normalize_openai_base_url(base_url) + '/chat/completions')} "
-        "-H 'Content-Type: application/json' "
-        f"-d {shlex.quote(payload)}"
-    )
-    return f"ssh root@{vps_host} {shlex.quote(remote_command)}"
-
-
 def parse_args() -> argparse.Namespace:
     parser = argparse.ArgumentParser(description="Provision a RunPod Gemma 4 endpoint and wire a Hermes config for Bezalel.")
     parser.add_argument("--pod-name", default="bezalel-gemma4")
@@ -212,8 +160,6 @@ def parse_args() -> argparse.Namespace:
     parser.add_argument("--config-path", type=Path, default=DEFAULT_CONFIG_PATH)
     parser.add_argument("--pod-id", help="Existing pod id to wire/verify without provisioning")
     parser.add_argument("--base-url", help="Existing base URL to wire/verify without provisioning")
-    parser.add_argument("--vertex-base-url", help="OpenAI-compatible Vertex bridge URL; takes precedence over --base-url and --pod-id")
-    parser.add_argument("--vps-host", default=DEFAULT_BEZALEL_VPS_HOST, help="Bezalel VPS host for the remote curl proof command")
    parser.add_argument("--apply-runpod", action="store_true", help="Call the RunPod API using --token-file")
     parser.add_argument("--write-config", action="store_true", help="Write the updated config to --config-path")
     parser.add_argument("--verify-chat", action="store_true", help="Call the OpenAI-compatible chat endpoint")
@@ -229,18 +175,13 @@ def main() -> None:
         "cloud_type": args.cloud_type,
         "model": args.model,
         "provider_name": args.provider_name,
         "config_path": str(args.config_path),
-        "vps_host": args.vps_host,
         "actions": [],
     }

-    base_url, base_url_source = resolve_base_url(
-        vertex_base_url=args.vertex_base_url,
-        base_url=args.base_url,
-        pod_id=args.pod_id,
-    )
-    if base_url_source:
-        summary["actions"].append(f"resolved_base_url_from_{base_url_source}")
+    base_url = args.base_url
+    if not base_url and args.pod_id:
+        base_url = build_runpod_endpoint(args.pod_id)
+        summary["actions"].append("computed_base_url_from_pod_id")

     if args.apply_runpod:
         if not args.token_file.exists():
@@ -255,17 +196,12 @@ def main() -> None:
         base_url = build_runpod_endpoint("<pod-id>")
         summary["actions"].append("using_placeholder_base_url")

-    summary["base_url"] = normalize_openai_base_url(base_url)
+    summary["base_url"] = base_url
     summary["config_preview"] = update_config_text("", base_url=base_url, model=args.model, provider_name=args.provider_name)
-    summary["vps_verify_command"] = build_vps_verify_command(
-        base_url=base_url,
-        model=args.model,
-        prompt=DEFAULT_VERIFY_PROMPT,
-        vps_host=args.vps_host,
-    )

     if args.write_config:
         write_config_file(args.config_path, base_url=base_url, model=args.model, provider_name=args.provider_name)
         summary["config_path"] = str(args.config_path)
         summary["actions"].append("wrote_config")

     if args.verify_chat:
@@ -278,10 +214,8 @@ def main() -> None:

     print("--- Bezalel Gemma4 RunPod Wiring ---")
     print(f"Pod name: {args.pod_name}")
-    print(f"Base URL: {summary['base_url']}")
+    print(f"Base URL: {base_url}")
     print(f"Model: {args.model}")
     print(f"Config target: {args.config_path}")
-    print(f"Bezalel VPS proof: {summary['vps_verify_command']}")
     if args.write_config:
         print(f"Config written: {args.config_path}")
     if "verify_response" in summary:
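
After these hunks the base URL is taken literally rather than normalized; a minimal sketch of the resulting behavior, with a placeholder pod id:

```python
# Placeholder pod id; mirrors the simplified build_runpod_endpoint above.
def build_runpod_endpoint(pod_id: str, port: int = 11434) -> str:
    return f"https://{pod_id}-{port}.proxy.runpod.net/v1"

base_url = build_runpod_endpoint("abc123")
assert base_url == "https://abc123-11434.proxy.runpod.net/v1"

# verify_openai_chat now joins the path itself, so a trailing slash is
# harmless, but a missing /v1 is no longer repaired for the caller.
assert f"{base_url.rstrip('/')}/chat/completions" == (
    "https://abc123-11434.proxy.runpod.net/v1/chat/completions"
)
```
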
scripts/burn_velocity_tracker.py (406 lines, new file)
@@ -0,0 +1,406 @@
#!/usr/bin/env python3
"""Burn-down velocity tracker for Timmy Foundation issue throughput.

Refs: timmy-home #519
"""

from __future__ import annotations

import argparse
import json
from base64 import b64encode
from datetime import date, datetime, time, timedelta, timezone
from pathlib import Path
from typing import Any
from urllib import parse, request

DEFAULT_BASE_URL = "https://forge.alexanderwhitestone.com/api/v1"
DEFAULT_OWNER = "Timmy_Foundation"
DEFAULT_TOKEN_FILE = Path.home() / ".config" / "gitea" / "token"
DEFAULT_CONFIG_FILE = Path(__file__).resolve().parent.parent / "configs" / "burn_velocity_repos.json"
DEFAULT_OUTPUT_DIR = Path.home() / ".timmy" / "burn-velocity"
DEFAULT_OUTPUT_JSON = DEFAULT_OUTPUT_DIR / "latest.json"
DEFAULT_OUTPUT_MD = DEFAULT_OUTPUT_DIR / "latest.md"
DEFAULT_HISTORY_FILE = DEFAULT_OUTPUT_DIR / "history.json"
DEFAULT_CONFIG = {
    "owner": DEFAULT_OWNER,
    "repos": ["timmy-home", "timmy-config", "fleet-ops", "the-beacon", "the-door", "the-nexus"],
    "lookback_days": 14,
    "alert": {
        "recent_days": 7,
        "baseline_days": 7,
        "minimum_baseline_closed": 4,
        "drop_ratio": 0.6,
    },
}


def parse_iso8601(value: str | None) -> datetime | None:
    if not value:
        return None
    normalized = value.replace("Z", "+00:00")
    parsed = datetime.fromisoformat(normalized)
    if parsed.tzinfo is None:
        return parsed.replace(tzinfo=timezone.utc)
    return parsed.astimezone(timezone.utc)


def normalize_today(value: str | date | None = None) -> date:
    if value is None:
        return datetime.now(timezone.utc).date()
    if isinstance(value, date):
        return value
    return date.fromisoformat(value)


def build_day_window(today: date, lookback_days: int) -> list[date]:
    start = today - timedelta(days=lookback_days - 1)
    return [start + timedelta(days=offset) for offset in range(lookback_days)]


def filter_issue_items(items: list[dict[str, Any]]) -> list[dict[str, Any]]:
    return [item for item in items if not item.get("pull_request")]


def build_daily_series(items: list[dict[str, Any]], today: date, lookback_days: int) -> list[dict[str, int | str]]:
    days = build_day_window(today, lookback_days)
    counts = {day.isoformat(): {"opened": 0, "closed": 0} for day in days}
    start_day = days[0]

    for item in filter_issue_items(items):
        created_at = parse_iso8601(item.get("created_at"))
        if created_at is not None:
            created_day = created_at.date()
            if start_day <= created_day <= today:
                counts[created_day.isoformat()]["opened"] += 1

        closed_at = parse_iso8601(item.get("closed_at"))
        if closed_at is not None:
            closed_day = closed_at.date()
            if start_day <= closed_day <= today:
                counts[closed_day.isoformat()]["closed"] += 1

    return [
        {
            "date": day.isoformat(),
            "opened": counts[day.isoformat()]["opened"],
            "closed": counts[day.isoformat()]["closed"],
        }
        for day in days
    ]


def summarize_velocity_alert(
    *, recent_closed: int, baseline_closed: int, open_now: int, config: dict[str, Any]
) -> dict[str, Any]:
    minimum_baseline = int(config.get("minimum_baseline_closed", 4))
    drop_ratio = float(config.get("drop_ratio", 0.6))

    if baseline_closed >= minimum_baseline and recent_closed < baseline_closed * drop_ratio:
        return {
            "status": "drop",
            "kind": "velocity_drop",
            "recent_closed": recent_closed,
            "baseline_closed": baseline_closed,
            "reason": (
                f"velocity_drop: closed {recent_closed} in the last {config.get('recent_days', 7)}d "
                f"vs {baseline_closed} in the prior {config.get('baseline_days', 7)}d"
            ),
        }

    if open_now > 0 and baseline_closed >= minimum_baseline and recent_closed == 0:
        return {
            "status": "drop",
            "kind": "velocity_drop",
            "recent_closed": recent_closed,
            "baseline_closed": baseline_closed,
            "reason": "velocity_drop: no issues closed in the recent window while backlog is still open",
        }

    return {
        "status": "ok",
        "kind": "none",
        "recent_closed": recent_closed,
        "baseline_closed": baseline_closed,
        "reason": "velocity stable",
    }


def _sum_window(daily: list[dict[str, int | str]], field: str, days: int) -> int:
    if days <= 0:
        return 0
    return sum(int(item[field]) for item in daily[-days:])


def _sum_baseline_window(daily: list[dict[str, int | str]], recent_days: int, baseline_days: int) -> int:
    if baseline_days <= 0:
        return 0
    if recent_days <= 0:
        return sum(int(item["closed"]) for item in daily[-baseline_days:])
    baseline_slice = daily[-(recent_days + baseline_days) : -recent_days]
    return sum(int(item["closed"]) for item in baseline_slice)
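
# Editorial note: with recent_days=7 and baseline_days=7 over a 14-entry daily
# series, daily[-(7 + 7):-7] is the baseline week and daily[-7:] is the recent
# week, so the two windows sit back to back without overlapping.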


def build_velocity_report(config: dict[str, Any], snapshot: dict[str, Any], today: str | date | None = None) -> dict[str, Any]:
    report_day = normalize_today(today)
    generated_at = snapshot.get("generated_at") or datetime.now(timezone.utc).isoformat().replace("+00:00", "Z")
    owner = config.get("owner", DEFAULT_OWNER)
    repos = list(config.get("repos") or sorted((snapshot.get("repos") or {}).keys()))
    lookback_days = int(config.get("lookback_days", 14))
    alert_config = dict(DEFAULT_CONFIG["alert"])
    alert_config.update(config.get("alert") or {})
    recent_days = int(alert_config.get("recent_days", 7))
    baseline_days = int(alert_config.get("baseline_days", 7))

    repo_reports: list[dict[str, Any]] = []
    total_open_now = 0
    total_closed_last_7d = 0
    repos_with_alerts: list[str] = []

    for repo_name in repos:
        repo_snapshot = (snapshot.get("repos") or {}).get(repo_name, {})
        open_issues = filter_issue_items(list(repo_snapshot.get("open_issues") or []))
        recent_issues = filter_issue_items(list(repo_snapshot.get("recent_issues") or []))
        daily = build_daily_series(recent_issues, report_day, lookback_days)

        open_now = len(open_issues)
        opened_last_7d = _sum_window(daily, "opened", recent_days)
        closed_last_7d = _sum_window(daily, "closed", recent_days)
        baseline_closed = _sum_baseline_window(daily, recent_days, baseline_days)
        weekly_net = opened_last_7d - closed_last_7d
        alert = summarize_velocity_alert(
            recent_closed=closed_last_7d,
            baseline_closed=baseline_closed,
            open_now=open_now,
            config=alert_config,
        )

        repo_report = {
            "repo": repo_name,
            "open_now": open_now,
            "opened_last_7d": opened_last_7d,
            "closed_last_7d": closed_last_7d,
            "baseline_closed": baseline_closed,
            "weekly_net": weekly_net,
            "daily": daily,
            "alert": alert,
        }
        repo_reports.append(repo_report)

        total_open_now += open_now
        total_closed_last_7d += closed_last_7d
        if alert["status"] != "ok":
            repos_with_alerts.append(repo_name)

    return {
        "owner": owner,
        "generated_at": generated_at,
        "generated_day": report_day.isoformat(),
        "lookback_days": lookback_days,
        "dashboard_contract_version": 1,
        "repos": repo_reports,
        "summary": {
            "total_open_now": total_open_now,
            "total_closed_last_7d": total_closed_last_7d,
            "repos_with_alerts": repos_with_alerts,
        },
    }


def render_markdown(report: dict[str, Any]) -> str:
    lines = [
        "# Burn-down Velocity Tracking",
        "",
        f"Generated: {report['generated_at']}",
        f"Owner: {report['owner']}",
        f"Lookback days: {report['lookback_days']}",
        "",
        "## Per-repo velocity",
        "",
        "| Repo | Open now | Opened 7d | Closed 7d | Previous 7d | Alert |",
        "| --- | ---: | ---: | ---: | ---: | --- |",
    ]

    for repo in report["repos"]:
        alert_label = repo["alert"]["kind"] if repo["alert"]["status"] != "ok" else "ok"
        lines.append(
            f"| {repo['repo']} | {repo['open_now']} | {repo['opened_last_7d']} | {repo['closed_last_7d']} | {repo['baseline_closed']} | {alert_label} |"
        )

    lines.extend(
        [
            "",
            "## Dashboard handoff for timmy-config",
            "",
            "The timmy-config dashboard should consume `~/.timmy/burn-velocity/latest.json` and render, for each repo:",
            "- `open_now`",
            "- `opened_last_7d`",
            "- `closed_last_7d`",
            "- `baseline_closed`",
            "- `alert.status` / `alert.kind` / `alert.reason`",
            "",
            "Cron should also persist `~/.timmy/burn-velocity/history.json` so timmy-config can plot the daily trend line instead of only the latest snapshot.",
            "",
            "## Alerts",
            "",
        ]
    )

    alerts = [repo for repo in report["repos"] if repo["alert"]["status"] != "ok"]
    if not alerts:
        lines.append("- none")
    else:
        for repo in alerts:
            lines.append(f"- {repo['repo']}: {repo['alert']['reason']}")

    return "\n".join(lines) + "\n"


def update_history(history_path: Path, report: dict[str, Any]) -> dict[str, Any]:
    if history_path.exists():
        history = json.loads(history_path.read_text(encoding="utf-8"))
    else:
        history = {"days": []}

    entry = {
        "date": report["generated_day"],
        "generated_at": report["generated_at"],
        "summary": report["summary"],
        "repos": report["repos"],
    }

    retained = [item for item in history.get("days", []) if item.get("date") != report["generated_day"]]
    retained.append(entry)
    retained.sort(key=lambda item: item["date"])
    history["days"] = retained

    history_path.parent.mkdir(parents=True, exist_ok=True)
    history_path.write_text(json.dumps(history, indent=2), encoding="utf-8")
    return history


class GiteaClient:
    def __init__(self, token: str, owner: str = DEFAULT_OWNER, base_url: str = DEFAULT_BASE_URL):
        self.token = token
        self.owner = owner
        self.base_url = base_url.rstrip("/")

    def _headers(self) -> list[dict[str, str]]:
        return [
            {"Authorization": f"token {self.token}", "Accept": "application/json"},
            {
                "Authorization": "Basic " + b64encode(f"{self.token}:".encode()).decode(),
                "Accept": "application/json",
            },
        ]

    def _request_json(self, url: str) -> list[dict[str, Any]]:
        last_error: Exception | None = None
        for headers in self._headers():
            try:
                req = request.Request(url, headers=headers)
                with request.urlopen(req, timeout=30) as response:
                    return json.loads(response.read().decode())
            except Exception as exc:  # pragma: no cover - exercised only on live API failure
                last_error = exc
        if last_error is None:  # pragma: no cover - defensive
            raise RuntimeError("request failed without an exception")
        raise last_error

    def list_issues(self, repo: str, *, state: str, since: str | None = None) -> list[dict[str, Any]]:
        issues: list[dict[str, Any]] = []
        page = 1
        while True:
            query = {"state": state, "type": "issues", "limit": 100, "page": page}
            if since:
                query["since"] = since
            url = f"{self.base_url}/repos/{self.owner}/{repo}/issues?{parse.urlencode(query)}"
            batch = self._request_json(url)
            if not batch:
                break
            issues.extend(filter_issue_items(batch))
            page += 1
        return issues


def load_json(path: Path, default: Any) -> Any:
    if not path.exists():
        return default
    return json.loads(path.read_text(encoding="utf-8"))


def load_config(path: Path) -> dict[str, Any]:
    config = dict(DEFAULT_CONFIG)
    alert = dict(DEFAULT_CONFIG["alert"])
    raw = load_json(path, {})
    config.update(raw)
    alert.update(raw.get("alert") or {})
    config["alert"] = alert
    return config


def collect_live_snapshot(
    config: dict[str, Any], *, today: str | date | None = None, token_file: Path = DEFAULT_TOKEN_FILE, base_url: str = DEFAULT_BASE_URL
) -> dict[str, Any]:
    token = token_file.read_text(encoding="utf-8").strip()
    report_day = normalize_today(today)
    since_day = report_day - timedelta(days=int(config.get("lookback_days", 14)) - 1)
    since_timestamp = datetime.combine(since_day, time.min, tzinfo=timezone.utc).isoformat().replace("+00:00", "Z")
    client = GiteaClient(token=token, owner=config.get("owner", DEFAULT_OWNER), base_url=base_url)

    repos = list(config.get("repos") or [])
    repo_payload = {}
    for repo in repos:
        repo_payload[repo] = {
            "open_issues": client.list_issues(repo, state="open"),
            "recent_issues": client.list_issues(repo, state="all", since=since_timestamp),
        }

    return {
        "generated_at": datetime.now(timezone.utc).isoformat().replace("+00:00", "Z"),
        "repos": repo_payload,
    }


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Track per-repo issue burn-down velocity and emit timmy-config dashboard payloads.")
    parser.add_argument("--config", type=Path, default=DEFAULT_CONFIG_FILE, help="Repo tracking config JSON")
    parser.add_argument("--snapshot-file", type=Path, help="Use a pre-fetched snapshot JSON instead of calling Gitea")
    parser.add_argument("--token-file", type=Path, default=DEFAULT_TOKEN_FILE, help="Gitea token file for live collection")
    parser.add_argument("--base-url", default=DEFAULT_BASE_URL, help="Gitea API base URL")
    parser.add_argument("--today", help="Override report date (YYYY-MM-DD)")
    parser.add_argument("--output-json", type=Path, default=DEFAULT_OUTPUT_JSON, help="Path for latest JSON payload")
    parser.add_argument("--output-md", type=Path, default=DEFAULT_OUTPUT_MD, help="Path for latest markdown summary")
    parser.add_argument("--history-file", type=Path, default=DEFAULT_HISTORY_FILE, help="Path for persisted daily history JSON")
    parser.add_argument("--write-history", action="store_true", help="Update the daily history file after generating the report")
    parser.add_argument("--json", action="store_true", help="Print JSON instead of markdown to stdout")
    return parser.parse_args()


def main() -> None:
    args = parse_args()
    config = load_config(args.config)

    if args.snapshot_file:
        snapshot = load_json(args.snapshot_file, {"repos": {}})
    else:
        snapshot = collect_live_snapshot(config, today=args.today, token_file=args.token_file, base_url=args.base_url)

    report = build_velocity_report(config, snapshot, today=args.today)

    args.output_json.parent.mkdir(parents=True, exist_ok=True)
    args.output_md.parent.mkdir(parents=True, exist_ok=True)
    args.output_json.write_text(json.dumps(report, indent=2), encoding="utf-8")
    args.output_md.write_text(render_markdown(report), encoding="utf-8")

    if args.write_history:
        update_history(args.history_file, report)

    if args.json:
        print(json.dumps(report, indent=2))
    else:
        print(render_markdown(report))


if __name__ == "__main__":
    main()
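
The report math can be exercised offline by handing the builder a synthetic snapshot; a minimal sketch (dates and issue numbers are arbitrary):

```python
# Offline exercise of the report builder; no Gitea token or network needed.
from datetime import date

from scripts.burn_velocity_tracker import build_velocity_report, render_markdown

snapshot = {
    "generated_at": "2026-04-22T12:00:00Z",
    "repos": {
        "timmy-home": {
            "open_issues": [{"number": 1, "state": "open", "created_at": "2026-04-20T09:00:00Z"}],
            "recent_issues": [
                {"number": 2, "state": "closed", "created_at": "2026-04-19T09:00:00Z", "closed_at": "2026-04-20T05:00:00Z"},
            ],
        }
    },
}
config = {"owner": "Timmy_Foundation", "repos": ["timmy-home"], "lookback_days": 14}

report = build_velocity_report(config, snapshot, today=date(2026, 4, 22))
print(render_markdown(report))  # markdown table plus the alerts section
```
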
@@ -1,20 +1,14 @@
 from __future__ import annotations

 import json
-from pathlib import Path
 from unittest.mock import patch

 import yaml

 from scripts.bezalel_gemma4_vps import (
-    DEFAULT_CONFIG_PATH,
-    DEFAULT_BEZALEL_VPS_HOST,
     build_deploy_mutation,
     build_runpod_endpoint,
-    build_vps_verify_command,
-    normalize_openai_base_url,
     parse_deploy_response,
-    resolve_base_url,
     update_config_text,
     verify_openai_chat,
 )
@@ -34,10 +28,6 @@ class _FakeResponse:
         return False


-def test_default_config_path_targets_bezalel_vps_root_config() -> None:
-    assert DEFAULT_CONFIG_PATH == Path("/root/wizards/bezalel/home/config.yaml")
-
-
 def test_build_deploy_mutation_uses_ollama_image_and_openai_port() -> None:
     query = build_deploy_mutation(name="bezalel-gemma4", gpu_type="NVIDIA L40S", model_tag="gemma4:latest")
@@ -47,30 +37,6 @@
     assert 'volumeMountPath: "/root/.ollama"' in query


-def test_normalize_openai_base_url_adds_v1_suffix() -> None:
-    assert normalize_openai_base_url("https://pod-11434.proxy.runpod.net") == "https://pod-11434.proxy.runpod.net/v1"
-
-
-def test_normalize_openai_base_url_trims_chat_completions_suffix() -> None:
-    assert normalize_openai_base_url("https://pod-11434.proxy.runpod.net/v1/chat/completions") == "https://pod-11434.proxy.runpod.net/v1"
-
-
-def test_resolve_base_url_prefers_vertex_over_base_and_pod_id() -> None:
-    base_url, source = resolve_base_url(
-        vertex_base_url="https://vertex.example.com/openai",
-        base_url="https://plain.example.com",
-        pod_id="abc123",
-    )
-    assert source == "vertex_base_url"
-    assert base_url == "https://vertex.example.com/openai/v1"
-
-
-def test_resolve_base_url_falls_back_to_base_url_before_pod_id() -> None:
-    base_url, source = resolve_base_url(base_url="https://plain.example.com", pod_id="abc123")
-    assert source == "base_url"
-    assert base_url == "https://plain.example.com/v1"
-
-
 def test_build_runpod_endpoint_appends_v1_suffix() -> None:
     assert build_runpod_endpoint("abc123") == "https://abc123-11434.proxy.runpod.net/v1"
@@ -94,7 +60,7 @@ def test_parse_deploy_response_extracts_pod_id_and_endpoint() -> None:
     }


-def test_update_config_text_upserts_big_brain_provider_and_normalizes_base_url() -> None:
+def test_update_config_text_upserts_big_brain_provider() -> None:
     original = """
 model:
   default: kimi-k2.5
@@ -106,7 +72,7 @@ custom_providers:
     model: gemma3:27b
 """

-    updated = update_config_text(original, base_url="https://new-pod-11434.proxy.runpod.net", model="gemma4:latest")
+    updated = update_config_text(original, base_url="https://new-pod-11434.proxy.runpod.net/v1", model="gemma4:latest")
     parsed = yaml.safe_load(updated)

     assert parsed["model"] == {"default": "kimi-k2.5", "provider": "kimi-coding"}
@@ -120,14 +86,7 @@ custom_providers:
     ]


-def test_build_vps_verify_command_targets_bezalel_host_and_chat_completions() -> None:
-    command = build_vps_verify_command(base_url="https://pod-11434.proxy.runpod.net", model="gemma4:latest")
-    assert command.startswith(f"ssh root@{DEFAULT_BEZALEL_VPS_HOST} ")
-    assert "/v1/chat/completions" in command
-    assert "gemma4:latest" in command
-
-
-def test_verify_openai_chat_calls_chat_completions_with_normalized_base_url() -> None:
+def test_verify_openai_chat_calls_chat_completions() -> None:
     response_payload = {
         "choices": [
             {
@@ -142,7 +101,7 @@ def test_verify_openai_chat_calls_chat_completions() -> None:
         "scripts.bezalel_gemma4_vps.request.urlopen",
         return_value=_FakeResponse(response_payload),
     ) as mocked:
-        result = verify_openai_chat("https://pod-11434.proxy.runpod.net", model="gemma4:latest", prompt="say READY")
+        result = verify_openai_chat("https://pod-11434.proxy.runpod.net/v1", model="gemma4:latest", prompt="say READY")

     assert result == "READY"
     req = mocked.call_args.args[0]
@@ -150,10 +109,3 @@
     payload = json.loads(req.data.decode())
     assert payload["model"] == "gemma4:latest"
     assert payload["messages"][0]["content"] == "say READY"
-
-
-def test_readme_documents_root_config_path_and_vps_proof_command() -> None:
-    readme = Path("scripts/README_bezalel_gemma4_vps.md").read_text()
-    assert "/root/wizards/bezalel/home/config.yaml" in readme
-    assert "ssh root@104.131.15.18" in readme
-    assert "--vertex-base-url" in readme

tests/test_burn_velocity_tracker.py (176 lines, new file)
@@ -0,0 +1,176 @@
from __future__ import annotations

import json
import subprocess
import sys
from datetime import date
from pathlib import Path

from scripts.burn_velocity_tracker import build_velocity_report, render_markdown, update_history


ROOT = Path(__file__).resolve().parent.parent
DOC_PATH = ROOT / "docs" / "BURN_VELOCITY_TRACKING.md"


SNAPSHOT = {
    "generated_at": "2026-04-22T12:00:00Z",
    "repos": {
        "timmy-home": {
            "open_issues": [
                {"number": 501, "state": "open", "created_at": "2026-04-20T09:00:00Z"},
                {"number": 502, "state": "open", "created_at": "2026-04-22T07:00:00Z"},
            ],
            "recent_issues": [
                {"number": 401, "state": "closed", "created_at": "2026-04-21T09:00:00Z", "closed_at": "2026-04-22T05:30:00Z"},
                {"number": 402, "state": "closed", "created_at": "2026-04-20T09:00:00Z", "closed_at": "2026-04-21T05:30:00Z"},
                {"number": 403, "state": "closed", "created_at": "2026-04-19T09:00:00Z", "closed_at": "2026-04-20T05:30:00Z"},
                {"number": 404, "state": "closed", "created_at": "2026-04-14T09:00:00Z", "closed_at": "2026-04-15T05:30:00Z"},
                {"number": 405, "state": "closed", "created_at": "2026-04-13T09:00:00Z", "closed_at": "2026-04-14T05:30:00Z"},
                {"number": 406, "state": "closed", "created_at": "2026-04-12T09:00:00Z", "closed_at": "2026-04-13T05:30:00Z"},
                {"number": 407, "state": "closed", "created_at": "2026-04-11T09:00:00Z", "closed_at": "2026-04-12T05:30:00Z"},
                {"number": 408, "state": "closed", "created_at": "2026-04-10T09:00:00Z", "closed_at": "2026-04-11T05:30:00Z"},
                {"number": 409, "state": "closed", "created_at": "2026-04-09T09:00:00Z", "closed_at": "2026-04-10T05:30:00Z"},
                {"number": 410, "state": "closed", "created_at": "2026-04-08T09:00:00Z", "closed_at": "2026-04-09T05:30:00Z"},
                {"number": 411, "state": "closed", "created_at": "2026-04-07T09:00:00Z", "closed_at": "2026-04-08T05:30:00Z"},
                {"number": 412, "state": "closed", "created_at": "2026-04-06T09:00:00Z", "closed_at": "2026-04-07T05:30:00Z"},
                {"number": 413, "state": "closed", "created_at": "2026-04-05T09:00:00Z", "closed_at": "2026-04-06T05:30:00Z"},
                {"number": 414, "state": "open", "created_at": "2026-04-22T08:45:00Z", "closed_at": None},
                {"number": 415, "state": "open", "created_at": "2026-04-17T08:45:00Z", "closed_at": None},
            ],
        },
        "timmy-config": {
            "open_issues": [
                {"number": 601, "state": "open", "created_at": "2026-04-18T09:00:00Z"},
            ],
            "recent_issues": [
                {"number": 602, "state": "closed", "created_at": "2026-04-20T09:00:00Z", "closed_at": "2026-04-21T06:00:00Z"},
                {"number": 603, "state": "open", "created_at": "2026-04-22T06:00:00Z", "closed_at": None},
            ],
        },
    },
}


CONFIG = {
    "owner": "Timmy_Foundation",
    "repos": ["timmy-home", "timmy-config"],
    "lookback_days": 14,
    "alert": {
        "recent_days": 7,
        "baseline_days": 7,
        "minimum_baseline_closed": 4,
        "drop_ratio": 0.6,
    },
}


def test_build_velocity_report_counts_opened_closed_and_flags_drop_alert() -> None:
    report = build_velocity_report(CONFIG, SNAPSHOT, today=date(2026, 4, 22))

    assert report["generated_day"] == "2026-04-22"
    assert report["summary"]["repos_with_alerts"] == ["timmy-home"]
    assert report["summary"]["total_open_now"] == 3

    home = report["repos"][0]
    assert home["repo"] == "timmy-home"
    assert home["open_now"] == 2
    assert home["opened_last_7d"] == 5
    assert home["closed_last_7d"] == 3
    assert home["baseline_closed"] == 7
    assert home["weekly_net"] == 2
    assert home["alert"]["status"] == "drop"
    assert home["alert"]["recent_closed"] == 3
    assert home["daily"][-1] == {"date": "2026-04-22", "opened": 1, "closed": 1}

    timmy_config = report["repos"][1]
    assert timmy_config["repo"] == "timmy-config"
    assert timmy_config["open_now"] == 1
    assert timmy_config["closed_last_7d"] == 1
    assert timmy_config["alert"]["status"] == "ok"


def test_render_markdown_includes_dashboard_handoff_and_alerts() -> None:
    report = build_velocity_report(CONFIG, SNAPSHOT, today=date(2026, 4, 22))
    rendered = render_markdown(report)

    for snippet in (
        "# Burn-down Velocity Tracking",
        "## Per-repo velocity",
        "timmy-home",
        "timmy-config",
        "## Dashboard handoff for timmy-config",
        "velocity_drop",
        "## Alerts",
    ):
        assert snippet in rendered


def test_update_history_replaces_same_day_snapshot(tmp_path: Path) -> None:
    history_path = tmp_path / "burn-velocity-history.json"
    report = build_velocity_report(CONFIG, SNAPSHOT, today=date(2026, 4, 22))
    update_history(history_path, report)

    updated = json.loads(json.dumps(report))
    updated["repos"][0]["open_now"] = 9
    updated["summary"]["total_open_now"] = 10
    update_history(history_path, updated)

    history = json.loads(history_path.read_text(encoding="utf-8"))
    assert [item["date"] for item in history["days"]] == ["2026-04-22"]
    assert history["days"][0]["summary"]["total_open_now"] == 10
    assert history["days"][0]["repos"][0]["open_now"] == 9


def test_cli_writes_json_markdown_and_history_from_snapshot(tmp_path: Path) -> None:
    snapshot_path = tmp_path / "snapshot.json"
    output_json = tmp_path / "latest.json"
    output_md = tmp_path / "latest.md"
    history_path = tmp_path / "history.json"
    snapshot_path.write_text(json.dumps(SNAPSHOT), encoding="utf-8")

    result = subprocess.run(
        [
            sys.executable,
            "-m",
            "scripts.burn_velocity_tracker",
            "--snapshot-file",
            str(snapshot_path),
            "--today",
            "2026-04-22",
            "--output-json",
            str(output_json),
            "--output-md",
            str(output_md),
            "--history-file",
            str(history_path),
            "--write-history",
            "--json",
        ],
        check=True,
        cwd=ROOT,
        capture_output=True,
        text=True,
    )

    payload = json.loads(result.stdout)
    assert payload["summary"]["repos_with_alerts"] == ["timmy-home"]
    assert output_json.exists()
    assert output_md.exists()
    assert history_path.exists()
    assert "timmy-config" in output_md.read_text(encoding="utf-8")


def test_repo_contains_burn_velocity_tracking_doc() -> None:
    text = DOC_PATH.read_text(encoding="utf-8")
    required = [
        "# Burn-down Velocity Tracking",
        "python3 scripts/burn_velocity_tracker.py",
        "configs/burn_velocity_repos.json",
        "~/.timmy/burn-velocity/latest.json",
        "timmy-config dashboard",
        "type=issues",
        "velocity_drop",
    ]
    for snippet in required:
        assert snippet in text