Compare commits (1 commit): sprint/iss... → step35/467

Commit: `671ed86c5f`

SKILL-sov-bundle.md (new file, 142 lines)

@@ -0,0 +1,142 @@

---
name: sov-bundle-export-import
category: data-export
description: |
  Sovereign Bundle (.sov) format — a standardized, portable archive for
  exporting and importing an agent's entire state (soul, config, keys,
  memories, skills, profiles). Enables backup, migration, and sovereignty.
---

# Sovereign Bundle Format (.sov)

**timmy-home #467** — FRONTIER: Develop "Sovereign Bundle" Export/Import Logic

The `.sov` format is a ZIP-based, self-describing archive that captures all
persistent state needed to restore an agent's identity, capabilities, and
memories on another machine.

## Format

```
sov/
├── META.json           # Format identifier + environment metadata
├── manifest.json       # Bundle contents & component sizes (canonical index)
├── soul/
│   └── SOUL.md         # Identity document, values, oath
├── config/
│   └── config.yaml     # Agent configuration, providers, toolsets
├── keys/
│   └── keymaxxing.json # Credential registry (encrypted separately)
├── memories/
│   ├── reflections/    # Daily learning summaries
│   ├── mempalace/      # Memory palace files (~500KB)
│   └── timmy/          # Agent world identity
├── skills/             # Custom skill scripts
├── profiles/           # Hermes profile configs (YAML)
└── timmy/              # Evennia/World state
```

*Manifest version:* `1.0`
*Filename suffix:* `.sov` (Sovereign Bundle)

## Usage

### Export (create bundle)

```bash
# Basic — includes soul, config, keys, reflections, skills, profiles
python timmy-local/scripts/create_sov_bundle.py export -o my-agent.sov

# Include full session transcripts (large — 10GB+ typically)
python timmy-local/scripts/create_sov_bundle.py export \
    --include-sessions -o full-backup.sov

# From a specific HERMES_HOME
HERMES_HOME=/path/to/.hermes python timmy-local/scripts/create_sov_bundle.py export
```

### Import (restore bundle)

```bash
# Dry-run (preview where files would go)
python timmy-local/scripts/restore_sov_bundle.py my-agent.sov --dry-run

# Restore to target directory
python timmy-local/scripts/restore_sov_bundle.py my-agent.sov \
    --target /path/to/hermes

# Restore to default HERMES_HOME
python timmy-local/scripts/restore_sov_bundle.py my-agent.sov --yes
```

### Verify / list

```bash
# Verify hash + manifest integrity only
python timmy-local/scripts/restore_sov_bundle.py verify my-agent.sov

# List contents without extracting
python timmy-local/scripts/restore_sov_bundle.py --list my-agent.sov
```

## Design Principles

**Sovereign** — The bundle is a portable, self-contained snapshot. No
third-party service is required to read or write it.

**Complete by default** — Includes everything needed to recreate the agent:

- Identity (SOUL.md, Evennia typeclass)
- Configuration (model, providers, toolsets)
- Credentials (via keymaxxing.json — can be separately encrypted)
- Memories (reflections, mempalace, timmy world state)
- Skills (custom user-authored scripts)
- Profiles (CLI profile configs)

**Safe exclusions** — Large runtime state is excluded by default:

- `sessions/` (10+ GB of transcripts) — opt in via `--include-sessions`
- `cache/` (derived; reproducible)
- `checkpoints/` (recovery state, log files)

**Verifiable** — A SHA-256 hash of the archive contents is computed and stored
in the manifest, so integrity can be checked without extracting.
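The verification step can be sketched in a few lines of plain Python. This is an illustrative sketch, not the actual `restore_sov_bundle.py` code, and it assumes the manifest records the digest as a hex string:

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_bundle(path: str, expected_hex: str) -> bool:
    """Compare a bundle's digest against the one recorded at export time."""
    return sha256_of_file(path) == expected_hex
```

Streaming in chunks keeps memory flat even for multi-gigabyte bundles that include session transcripts.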

**Extensible** — New components can be added in future versions without
breaking old importers (unknown entries are skipped gracefully).

## Implementation Notes

- Core code: `timmy-local/scripts/create_sov_bundle.py`, `restore_sov_bundle.py`
- The format is ZIP-native — readable by any standard unzip tool
- The manifest (`sov/manifest.json`) tracks component-level sizes for quick diffing
- `sov/META.json` provides an environment snapshot for debugging (host, platform)
- `__pycache__`, `.venv`, `.git`, and build artifacts are excluded automatically
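Because the format is plain ZIP, the manifest can be read with nothing but the standard library. A minimal sketch, assuming the `sov/manifest.json` entry name shown in the layout above:

```python
import json
import zipfile


def read_manifest(bundle_path: str) -> dict:
    """Load the canonical index from a .sov bundle without extracting it."""
    with zipfile.ZipFile(bundle_path) as zf:
        with zf.open("sov/manifest.json") as f:
            return json.load(f)
```

Any ZIP tool works the same way, e.g. `unzip -p my-agent.sov sov/manifest.json`.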

## Safety & Sovereignty

- Do NOT include the `--include-sessions` flag in automated backups unless
  you have encrypted storage — transcripts may contain sensitive user data
- The `keys/keymaxxing.json` file contains the credential registry — consider
  encrypting the whole bundle or storing keys separately (the existing
  `backup_pipeline.sh` supports GPG)
- Restoring to a foreign `HERMES_HOME` updates that machine's identity;
  verify bundle provenance before import

## Next Steps

- [ ] Optional encryption layer (AES-256 or GPG, mirroring backup_pipeline.sh)
- [ ] Selective component restore (only soul, only keys)
- [ ] Diff & patch bundles (receive incremental updates)
- [ ] Registry of known bundles (chain of custody)
- [ ] Integration with the `hermes` CLI: `hermes sov export|import|verify`

## References

- **Backup exists**: `scripts/backup_pipeline.sh` — encrypted tarball of `~/.hermes`
- This format complements, but does not replace, the backup pipeline — it is a
  structured, portable, versioned alternative for migration & inspection

@@ -1,48 +0,0 @@
# LUNA-1: Pink Unicorn Game — Project Scaffolding

Starter project for Mackenzie's Pink Unicorn Game built with **p5.js 1.9.0**.

## Quick Start

```bash
cd luna
python3 -m http.server 8080
# Visit http://localhost:8080
```

Or simply open `luna/index.html` directly in a browser.

## Controls

| Input | Action |
|-------|--------|
| Tap / Click | Move unicorn toward tap point |
| `r` key | Reset unicorn to center |

## Features

- Mobile-first touch handling (`touchStarted`)
- Easing movement via `lerp`
- Particle burst feedback on tap
- Pink/unicorn color palette
- Responsive canvas (adapts to window resize)

## Project Structure

```
luna/
├── index.html   # p5.js CDN import + canvas container
├── sketch.js    # Main game logic and rendering
├── style.css    # Pink/unicorn theme, responsive layout
└── README.md    # This file
```

## Verification

Open in a browser → the canvas renders a white unicorn with a pink mane. Tap anywhere: the unicorn glides toward the tap position with easing, and pink/magic-colored particles burst from the tap point.

## Technical Notes

- p5.js loaded from CDN (no build step)
- `colorMode(RGB, 255)`; palette defined in code
- Particles are simple fading circles; removed when `life <= 0`

@@ -1,18 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0" />
  <title>LUNA-3: Simple World — Floating Islands</title>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.9.0/p5.min.js"></script>
  <link rel="stylesheet" href="style.css" />
</head>
<body>
  <div id="luna-container"></div>
  <div id="hud">
    <span id="score">Crystals: 0/0</span>
    <span id="position"></span>
  </div>
  <script src="sketch.js"></script>
</body>
</html>

luna/sketch.js (deleted file, 289 lines)

@@ -1,289 +0,0 @@
/**
 * LUNA-3: Simple World — Floating Islands & Collectible Crystals
 * Builds on LUNA-1 scaffold (unicorn tap-follow) + LUNA-2 actions
 *
 * NEW: Floating platforms + collectible crystals with particle bursts
 */

let particles = [];
let unicornX, unicornY;
let targetX, targetY;

// Platforms: floating islands at various heights with horizontal ranges
const islands = [
  { x: 100, y: 350, w: 150, h: 20, color: [100, 200, 150] }, // left island
  { x: 350, y: 280, w: 120, h: 20, color: [120, 180, 200] }, // middle-high island
  { x: 550, y: 320, w: 140, h: 20, color: [200, 180, 100] }, // right island
  { x: 200, y: 180, w: 180, h: 20, color: [180, 140, 200] }, // top-left island
  { x: 500, y: 120, w: 100, h: 20, color: [140, 220, 180] }, // top-right island
];

// Collectible crystals on islands — seeded in setup(), because p5's
// random()/floor() globals are not yet defined when this file is evaluated.
const crystals = [];
let collectedCount = 0;
let TOTAL_CRYSTALS = 0; // set in setup() once crystals are seeded

function seedCrystals() {
  islands.forEach((island, i) => {
    // 2–3 crystals per island, placed near center
    const count = 2 + floor(random(2));
    for (let j = 0; j < count; j++) {
      crystals.push({
        x: island.x + 30 + random(island.w - 60),
        y: island.y - 30 - random(20),
        size: 8 + random(6),
        hue: random(280, 340), // pink/purple range
        collected: false,
        islandIndex: i
      });
    }
  });
  TOTAL_CRYSTALS = crystals.length;
}

// Pink/unicorn palette
const PALETTE = {
  background: [255, 210, 230], // light pink (overridden by gradient in draw)
  unicorn: [255, 182, 193],    // pale pink/white
  horn: [255, 215, 0],         // gold
  mane: [255, 105, 180],       // hot pink
  eye: [255, 20, 147],         // deep pink
  sparkle: [255, 105, 180],
  island: [100, 200, 150],
};

function setup() {
  const canvas = createCanvas(600, 500);
  canvas.parent('luna-container');
  seedCrystals();
  unicornX = width / 2;
  unicornY = height - 60; // start on ground (bottom platform equivalent)
  targetX = unicornX;
  targetY = unicornY;
  noStroke();
  addTapHint();
}

function draw() {
  // Gradient sky background
  for (let y = 0; y < height; y++) {
    const t = y / height;
    const r = lerp(26, 15, t); // #1a1a2e → #0f3460
    const g = lerp(26, 52, t);
    const b = lerp(46, 96, t);
    stroke(r, g, b);
    line(0, y, width, y);
  }

  // Draw islands (floating platforms with subtle shadow)
  islands.forEach(island => {
    push();
    // Shadow
    fill(0, 0, 0, 40);
    ellipse(island.x + island.w / 2 + 5, island.y + 5, island.w + 10, island.h + 6);
    // Island body
    fill(island.color[0], island.color[1], island.color[2]);
    ellipse(island.x + island.w / 2, island.y, island.w, island.h);
    // Top highlight
    fill(255, 255, 255, 60);
    ellipse(island.x + island.w / 2, island.y - island.h / 3, island.w * 0.6, island.h * 0.3);
    pop();
  });

  // Draw crystals (glowing collectibles)
  crystals.forEach(c => {
    if (c.collected) return;
    push();
    translate(c.x, c.y);
    // Glow aura
    const glow = color(`hsla(${c.hue}, 80%, 70%, 0.4)`);
    noStroke();
    fill(glow);
    ellipse(0, 0, c.size * 2.2, c.size * 2.2);
    // Crystal body (diamond shape)
    const ccol = color(`hsl(${c.hue}, 90%, 75%)`);
    fill(ccol);
    beginShape();
    vertex(0, -c.size);
    vertex(c.size * 0.6, 0);
    vertex(0, c.size);
    vertex(-c.size * 0.6, 0);
    endShape(CLOSE);
    // Inner sparkle
    fill(255, 255, 255, 180);
    ellipse(0, 0, c.size * 0.5, c.size * 0.5);
    pop();
  });

  // Unicorn smooth movement towards target
  unicornX = lerp(unicornX, targetX, 0.08);
  unicornY = lerp(unicornY, targetY, 0.08);

  // Constrain unicorn to screen bounds
  unicornX = constrain(unicornX, 40, width - 40);
  unicornY = constrain(unicornY, 40, height - 40);

  // Draw sparkles
  drawSparkles();

  // Draw the unicorn
  drawUnicorn(unicornX, unicornY);

  // Collection detection
  for (let c of crystals) {
    if (c.collected) continue;
    const d = dist(unicornX, unicornY, c.x, c.y);
    if (d < 35) {
      c.collected = true;
      collectedCount++;
      createCollectionBurst(c.x, c.y, c.hue);
    }
  }

  // Update particles
  updateParticles();

  // Update HUD
  document.getElementById('score').textContent = `Crystals: ${collectedCount}/${TOTAL_CRYSTALS}`;
  document.getElementById('position').textContent = `(${floor(unicornX)}, ${floor(unicornY)})`;
}

function drawUnicorn(x, y) {
  push();
  translate(x, y);

  // Body
  noStroke();
  fill(PALETTE.unicorn);
  ellipse(0, 0, 60, 40);

  // Head
  ellipse(30, -20, 30, 25);

  // Mane (flowing)
  fill(PALETTE.mane);
  for (let i = 0; i < 5; i++) {
    ellipse(-10 + i * 12, -50, 12, 25);
  }

  // Horn
  push();
  translate(30, -35);
  rotate(-PI / 6);
  fill(PALETTE.horn);
  triangle(0, 0, -8, -35, 8, -35);
  pop();

  // Eye
  fill(PALETTE.eye);
  ellipse(38, -22, 8, 8);

  // Legs
  stroke(PALETTE.unicorn[0] - 40);
  strokeWeight(6);
  line(-20, 20, -20, 45);
  line(20, 20, 20, 45);

  pop();
}

function drawSparkles() {
  // Random sparkles around the unicorn when moving
  if (abs(targetX - unicornX) > 1 || abs(targetY - unicornY) > 1) {
    for (let i = 0; i < 3; i++) {
      let angle = random(TWO_PI);
      let r = random(20, 50);
      let sx = unicornX + cos(angle) * r;
      let sy = unicornY + sin(angle) * r;
      stroke(PALETTE.sparkle[0], PALETTE.sparkle[1], PALETTE.sparkle[2], 150);
      strokeWeight(2);
      point(sx, sy);
    }
  }
}

function createCollectionBurst(x, y, hue) {
  // Burst of particles spiraling outward
  for (let i = 0; i < 20; i++) {
    let angle = random(TWO_PI);
    let speed = random(2, 6);
    particles.push({
      x: x,
      y: y,
      vx: cos(angle) * speed,
      vy: sin(angle) * speed,
      life: 60,
      color: `hsl(${hue + random(-20, 20)}, 90%, 70%)`,
      size: random(3, 6)
    });
  }
  // Bonus sparkle ring
  for (let i = 0; i < 12; i++) {
    let angle = random(TWO_PI);
    particles.push({
      x: x,
      y: y,
      vx: cos(angle) * 4,
      vy: sin(angle) * 4,
      life: 40,
      color: 'rgba(255, 215, 0, 0.9)',
      size: 4
    });
  }
}

function updateParticles() {
  for (let i = particles.length - 1; i >= 0; i--) {
    let p = particles[i];
    p.x += p.vx;
    p.y += p.vy;
    p.vy += 0.1; // gravity
    p.life--;
    p.vx *= 0.95;
    p.vy *= 0.95;
    if (p.life <= 0) {
      particles.splice(i, 1);
      continue;
    }
    push();
    stroke(p.color);
    strokeWeight(p.size);
    point(p.x, p.y);
    pop();
  }
}

// Tap/click handler
function mousePressed() {
  targetX = mouseX;
  targetY = mouseY;
  addPulseAt(targetX, targetY);
}

function addTapHint() {
  // Pre-spawn some floating hint particles
  for (let i = 0; i < 5; i++) {
    particles.push({
      x: random(width),
      y: random(height),
      vx: random(-0.5, 0.5),
      vy: random(-0.5, 0.5),
      life: 200,
      color: 'rgba(233, 69, 96, 0.5)',
      size: 3
    });
  }
}

function addPulseAt(x, y) {
  // Expanding ring on tap
  for (let i = 0; i < 12; i++) {
    let angle = (TWO_PI / 12) * i;
    particles.push({
      x: x,
      y: y,
      vx: cos(angle) * 3,
      vy: sin(angle) * 3,
      life: 30,
      color: 'rgba(233, 69, 96, 0.7)',
      size: 3
    });
  }
}

@@ -1,32 +0,0 @@
body {
  margin: 0;
  overflow: hidden;
  background: linear-gradient(to bottom, #1a1a2e, #16213e, #0f3460);
  font-family: 'Courier New', monospace;
  color: #e94560;
}

#luna-container {
  position: fixed;
  top: 0;
  left: 0;
  width: 100vw;
  height: 100vh;
  display: flex;
  align-items: center;
  justify-content: center;
}

#hud {
  position: fixed;
  top: 10px;
  left: 10px;
  background: rgba(0, 0, 0, 0.6);
  padding: 8px 12px;
  border-radius: 4px;
  font-size: 14px;
  z-index: 100;
  border: 1px solid #e94560;
}

#score { font-weight: bold; }

scripts/sov (new executable file, 52 lines)

@@ -0,0 +1,52 @@
#!/usr/bin/env python3
"""
Sovereign Bundle (.sov) command-line wrapper.

Usage:
    sov export [-o OUTPUT] [--include-sessions]
    sov import BUNDLE [--target DIR] [--dry-run]
    sov verify BUNDLE
    sov list BUNDLE
"""

import subprocess
import sys
from pathlib import Path

SCRIPT_DIR = Path(__file__).parent.parent / "timmy-local" / "scripts"
CREATE_SCRIPT = SCRIPT_DIR / "create_sov_bundle.py"
RESTORE_SCRIPT = SCRIPT_DIR / "restore_sov_bundle.py"


def main():
    if len(sys.argv) < 2:
        print(__doc__)
        sys.exit(1)

    cmd = sys.argv[1]

    # Every command except "export" requires a BUNDLE argument
    if cmd in ("import", "restore", "verify", "list", "ls") and len(sys.argv) < 3:
        print(f"Missing BUNDLE argument for '{cmd}'", file=sys.stderr)
        print(__doc__)
        sys.exit(1)

    if cmd == "export":
        # Delegate to create_sov_bundle.py
        args = [sys.executable, str(CREATE_SCRIPT), "export"] + sys.argv[2:]
        sys.exit(subprocess.run(args).returncode)

    elif cmd in ("import", "restore"):
        args = [sys.executable, str(RESTORE_SCRIPT)] + sys.argv[2:]
        sys.exit(subprocess.run(args).returncode)

    elif cmd == "verify":
        args = [sys.executable, str(RESTORE_SCRIPT), "verify", sys.argv[2]]
        sys.exit(subprocess.run(args).returncode)

    elif cmd in ("list", "ls"):
        args = [sys.executable, str(RESTORE_SCRIPT), "--list", sys.argv[2]]
        sys.exit(subprocess.run(args).returncode)

    else:
        print(f"Unknown command: {cmd}", file=sys.stderr)
        print(__doc__)
        sys.exit(1)


if __name__ == "__main__":
    main()

@@ -1,73 +0,0 @@
# Fleet Operator Incentives Program

## 1. Overview

The Fleet Operator Incentives Program is designed to recruit, certify, and retain high-quality fleet operators who will maintain and operate vehicle fleets with >99.5% uptime. Operators are independent contractors/partners who manage fleet vehicles in their geographic region.

## 2. Operator Tiers & Compensation

### Tier 1: Certified Operator (Entry)
- **Requirements:** Complete operator training, pass certification exam, maintain 99%+ uptime for 30 days
- **Compensation:** Base rate $X/vehicle/month + $Y per completed trip
- **Benefits:** Access to fleet management tools, priority support, basic insurance options

### Tier 2: Senior Operator
- **Requirements:** 6+ months as Certified Operator, 99.5%+ uptime, mentor 1+ new operator
- **Compensation:** Base rate +15% + $Z per completed trip + quarterly bonus
- **Benefits:** Higher trip priority, advanced analytics dashboard, referral bonuses

### Tier 3: Master Operator
- **Requirements:** 12+ months, 99.8%+ uptime, mentor 3+ operators, zero critical incidents
- **Compensation:** Base rate +30% + highest per-trip rate + annual profit-sharing
- **Benefits:** Fleet expansion privileges, dedicated account manager, revenue share on referred partners

## 3. Performance Metrics & Incentives

| Metric | Target | Incentive |
|--------|--------|-----------|
| Uptime | >99.5% | $200/month bonus per 0.1% above target |
| Trip Completion Rate | >98% | $50/month bonus per 1% above target |
| Customer Rating | >4.8/5.0 | Tier multiplier (1.0x-1.25x) |
| Safety Incidents | 0 | $500/month safety bonus |
| Referral Conversions | 3+/quarter | $250 per converted referral |
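As a worked example of the uptime row: an operator at 99.8% uptime is three full 0.1% increments above the 99.5% target, earning a $600 monthly bonus. A sketch of that arithmetic (illustrative only; actual payout rules may differ):

```python
def uptime_bonus(uptime_pct: float) -> int:
    """$200/month for each full 0.1% of uptime above the 99.5% target."""
    target = 99.5
    if uptime_pct <= target:
        return 0
    # Count whole 0.1% increments; the small epsilon guards against
    # floating-point representation of values like 99.8 - 99.5.
    increments = int((uptime_pct - target) * 10 + 1e-9)
    return 200 * increments
```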

## 4. Certification Process

1. **Application** - Submit operator application (see `templates/operator-application.md`)
2. **Training** - Complete 40-hour online + 20-hour on-ground training
3. **Exam** - Pass written (80%+) and practical assessments
4. **Probation** - 30-day supervised operation period
5. **Certification** - Full operator status with tier assignment

## 5. Retention & Churn Reduction

### Success Criteria
- Operator churn <10% annually
- Net Promoter Score (NPS) >50 among operators
- 90%+ operator renewal rate

### Retention Strategies
- Monthly operator roundtables and feedback sessions
- Quarterly operator appreciation events
- Tier-based recognition and public accolades
- Progressive compensation increases tied to tenure
- Operator advisory council influence on policy

## 6. Fleet Uptime Guarantees

- **Target:** >99.5% fleet uptime
- **SLA Credits:** Operators earn credits toward tier status for maintaining uptime
- **Support:** 24/7 dispatch and maintenance coordination
- **Preventive Maintenance:** Scheduled maintenance windows with ride credits

## 7. Quality Assurance

- Random trip audits (5% minimum)
- Quarterly recertification for Tier 2+
- Customer feedback monitoring
- Incident review board for safety events

---

*Last Updated: 2026-Q1*
*Owner: Fleet Operations Committee*

@@ -1,113 +0,0 @@
# Fleet Operations Runbook

## 1. Daily Operator Checklist

### Pre-Shift
- [ ] Vehicle inspection (tires, fluids, lights, brakes)
- [ ] Cleanliness check (interior/exterior)
- [ ] Battery charge level >80%
- [ ] Tire pressure verification
- [ ] Update availability status in fleet app

### During Shift
- [ ] Accept trips within assigned zone
- [ ] Complete pre-trip safety check in app
- [ ] Maintain communication with dispatch
- [ ] Log all incidents immediately
- [ ] Monitor real-time uptime metrics

### Post-Shift
- [ ] Vehicle cleaning and sanitization
- [ ] End-of-day vehicle inspection
- [ ] Submit shift report (any issues)
- [ ] Charge vehicle to >90%
- [ ] Secure vehicle per protocol

## 2. Maintenance Procedures

### Routine Maintenance Schedule
- **Daily:** Visual inspection, tire pressure, fluid checks
- **Weekly:** Brake inspection, battery health check, software updates
- **Monthly:** Full service (oil, filters, brakes, alignment, battery test)
- **Quarterly:** Comprehensive inspection, certification renewal prep

### Emergency Maintenance
1. Pull over safely and activate hazards
2. Contact dispatch via priority line
3. Use fleet app to report issue with photos
4. Dispatch arranges tow/replacement vehicle
5. Complete incident report within 2 hours

## 3. Incident Response

### Accident Protocol
1. Ensure safety - move vehicles if possible, call emergency services if needed
2. Document scene (photos, witness info, police report if applicable)
3. Notify dispatch immediately (Priority 1)
4. Complete incident report in fleet app within 1 hour
5. Cooperate with insurance and safety review

### Customer Complaint
1. Listen actively, de-escalate if needed
2. Document complaint details immediately
3. Escalate to dispatch supervisor within 15 minutes
4. Follow resolution process in fleet app
5. Follow up with customer within 24 hours (if appropriate)

## 4. Dispatch & Communication

### Contact Channels
- **Primary:** Fleet mobile app (push notifications, in-app messaging)
- **Urgent:** Dispatch hotline (24/7)
- **Routine:** Email/Slack channel
- **Emergency:** SMS to +1-xxx-xxx-xxxx

### Availability Requirements
- Certified Operators: Minimum 20 hours/week
- Senior Operators: Minimum 25 hours/week
- Master Operators: Minimum 30 hours/week + on-call rotations

## 5. Escalation Matrix

| Issue Type | First Response | Escalation To | SLA Resolution |
|------------|----------------|---------------|----------------|
| Vehicle breakdown | 15 minutes | Dispatch Lead | 2 hours |
| Accident/incident | Immediate | Safety Manager | 24 hours |
| Customer complaint | 30 minutes | Customer Success | 4 hours |
| Technical issue | 1 hour | Tech Support | 8 hours |
| Payment discrepancy | 2 hours | Finance | 24 hours |

## 6. Key Performance Indicators (KPIs)

### Operator KPIs
- **Uptime:** Vehicle available >99.5%
- **Completion Rate:** Trips completed vs. accepted >98%
- **Customer Rating:** Average >4.8/5.0
- **Safety:** Zero preventable incidents
- **Utilization:** Active hours >80% of scheduled

### Fleet KPIs
- **Vehicle Health:** Maintenance compliance 100%
- **Response Time:** Average dispatch acceptance <30 seconds
- **Coverage:** Zone availability >95%

## 7. Troubleshooting Common Issues

| Issue | Self-Help Steps | Support Ticket |
|-------|-----------------|----------------|
| App not connecting | 1. Restart app 2. Check data/WiFi 3. Re-login | If unresolved after 5 min |
| Vehicle won't start | 1. Check charge 2. Try reset 3. Call dispatch | Always |
| Navigation error | 1. Refresh GPS 2. Re-enter destination 3. Use backup map | If trip impacted |
| Payment question | 1. Check earnings tab 2. Review last 7 days | Non-urgent |

## 8. Compliance & Regulations

- All operators must maintain a valid driver's license and clean driving record
- Commercial insurance requirements (provided by program)
- Local transportation regulations compliance
- Background check and drug screening (initial + random)
- Safety training certification renewal annually

---

*Version: 1.0* | *Effective: 2026-Q1* | *Next Review: 2026-Q2*

@@ -1,134 +0,0 @@
---
application_id: OP-{{YYYYMMDD}}-{{SEQ}}
submission_date: {{DATE}}
status: pending_review
---

# Fleet Operator Application

## 1. Personal Information

| Field | Value |
|-------|-------|
| Full Legal Name | |
| Date of Birth | |
| Email Address | |
| Phone Number | |
| Address (Primary) | |
| Address (Vehicle Storage) | |
| Driver's License Number | |
| License State | |
| License Expiration | |
| Years Licensed | |

## 2. Driving & Fleet Experience

### Driving History
- **Total Years Driving:** __
- **Years Commercial/Professional:** __
- **Accidents (past 3 years):** __
- **Traffic Violations (past 3 years):** __
- **Safety Courses Completed:** (list)

### Fleet/Transport Experience
- **Previous Fleet Operator Role(s):** (company, dates, fleet size)
- **Vehicle Types Operated:** (sedan, SUV, van, truck, EV, etc.)
- **Telematics/Fleet App Experience:** (yes/no, systems used)
- **Maintenance Experience:** (basic, intermediate, advanced)

## 3. Availability & Commitment

### Weekly Availability
| Day | Morning (6a-12p) | Afternoon (12p-6p) | Evening (6p-12a) |
|-----|-------------------|---------------------|-------------------|
| Monday | ☐ | ☐ | ☐ |
| Tuesday | ☐ | ☐ | ☐ |
| Wednesday | ☐ | ☐ | ☐ |
| Thursday | ☐ | ☐ | ☐ |
| Friday | ☐ | ☐ | ☐ |
| Saturday | ☐ | ☐ | ☐ |
| Sunday | ☐ | ☐ | ☐ |

**Minimum Commitment:** ___ hours per week

### Geographic Coverage
- **Home Base/Preferred Zone:** _______________
- **Willing to operate in:** (list additional zones)
- **Willing to travel/relocate:** (yes/no, distance limit)

## 4. Equipment & Resources

- [ ] **Vehicle Eligible:** Own or lease qualifying vehicle (min. 2020 model, <50k miles)
  - **Make/Model/Year:** ___________
  - **VIN:** ___________
  - **Current Mileage:** ___________
  - **Insurance:** (provider, policy number, coverage limits)
- [ ] **Smartphone:** Compatible iOS/Android device with data plan
- [ ] **Charging Access:** For EVs - home/work charging available (yes/no)
- [ ] **Tools/Equipment:** (basic toolkit, cleaning supplies, etc.)

## 5. Financial & Background

- **Bank Account for Direct Deposit:** (routing & account)
- **Tax Information:** (SSN or EIN, expected business structure)
- **Background Check Authorization:** ☐ I authorize criminal and driving record check
- **Credit Check Authorization:** ☐ I authorize credit check (if required for vehicle financing)

## 6. Motivation & References

### Why do you want to become a Fleet Operator?
_(min. 100 words)_

### What makes you a good candidate for fleet operations?
_(reliability, customer service, mechanical aptitude, etc.)_

### Professional References
1. **Name:** _________ **Relationship:** _________ **Phone/Email:** _________
2. **Name:** _________ **Relationship:** _________ **Phone/Email:** _________

## 7. Certifications & Training

- [ ] Defensive Driving Course (if completed)
- [ ] Commercial Driver's License (CDL) - if required for vehicle class
- [ ] First Aid/CPR Certification
- [ ] Other: _______________

## 8. Agreement & Signature

By submitting this application, I certify that:
- All information provided is accurate and complete
- I consent to background and driving record checks
- I will maintain required insurance coverage
- I will comply with all fleet policies, safety protocols, and regulations
- I understand this is an independent contractor role, not employment
- I have read and agree to the Fleet Operator Agreement

**Signature:** _________________________

**Date:** _________________

---

### Internal Use Only

| Review Step | Completed By | Date | Notes |
|-------------|--------------|------|-------|
| Application Received | | | |
| Background Check Initiated | | | |
| Driving Record Review | | | |
| Vehicle Inspection (if applicable) | | | |
| Interview Scheduled | | | |
| Certification Training Assigned | | | |
| Final Approval | | | |

**Application Status:** ☐ Approved ☐ Denied ☐ Additional Info Needed

**Tier Assignment:** ☐ Tier 1 (Certified) ☐ Tier 2 (Senior) ☐ Tier 3 (Master)

**Assigned Zone/Area:** ___________________

**Mentor (if Tier 1):** ___________________

**Follow-up Required:** ___________________

@@ -1,224 +0,0 @@
# Partner Channel Report

**Reporting Period:** {{START_DATE}} – {{END_DATE}}
**Partner ID:** {{PARTNER_ID}}
**Partner Name:** {{PARTNER_NAME}}
**Report Generated:** {{GENERATION_DATE}}

---

## 1. Executive Summary

| Metric | Period Value | Target | Variance | Trend |
|--------|--------------|--------|----------|-------|
| Total Leads Generated | {{LEADS_TOTAL}} | — | — | {{LEADS_TREND}} |
| Qualified Leads | {{LEADS_QUALIFIED}} | — | — | {{QUALIFIED_TREND}} |
| Conversion Rate | {{CONVERSION_RATE}}% | >30% | {{CONVERSION_VARIANCE}}% | {{CONVERSION_TREND}} |
| Active Certified Operators | {{ACTIVE_OPERATORS}} | 3-5 | {{OPERATOR_VARIANCE}} | {{OPERATOR_TREND}} |
| Operator Churn (annualized) | {{CHURN_RATE}}% | <10% | {{CHURN_VARIANCE}}% | {{CHURN_TREND}} |
| Fleet Uptime | {{UPTIME_PCT}}% | >99.5% | {{UPTIME_VARIANCE}}% | {{UPTIME_TREND}} |
| Partner Revenue Share | ${{REVENUE_SHARE}} | — | — | {{REVENUE_TREND}} |

**Key Wins This Period:**
-
-
-

**Primary Challenges:**
-
-
-

---

## 2. Lead Generation & Conversion Funnel

### Lead Sources

| Source | Leads | Qualified % | Conversion % |
|--------|-------|-------------|--------------|
| Referral (existing operators) | {{LEADS_REFERRAL}} | {{QUAL_REFERRAL}}% | {{CONV_REFERRAL}}% |
| Marketing Campaigns | {{LEADS_MARKETING}} | {{QUAL_MARKETING}}% | {{CONV_MARKETING}}% |
| Direct Inquiry | {{LEADS_DIRECT}} | {{QUAL_DIRECT}}% | {{CONV_DIRECT}}% |
| Events/Networking | {{LEADS_EVENTS}} | {{QUAL_EVENTS}}% | {{CONV_EVENTS}}% |
| Other | {{LEADS_OTHER}} | {{QUAL_OTHER}}% | {{CONV_OTHER}}% |

### Conversion Funnel

```
Leads Generated: {{LEADS_TOTAL}} → Qualified: {{LEADS_QUALIFIED}} → Applications: {{APPS_SUBMITTED}} → Certified: {{OPERATORS_CERTIFIED}}
```

**Average Time to Certification:** ___ days (Target: ≤45 days)

---

## 3. Operator Performance Dashboard

### Active Operators by Tier

| Tier | Count | Change vs. Last Period | Average Uptime | Avg Customer Rating |
|------|-------|------------------------|----------------|---------------------|
| Tier 1 - Certified | {{OP_TIER1}} | {{OP_TIER1_CHG}} | {{UPTIME_TIER1}}% | {{RATING_TIER1}} |
| Tier 2 - Senior | {{OP_TIER2}} | {{OP_TIER2_CHG}} | {{UPTIME_TIER2}}% | {{RATING_TIER2}} |
| Tier 3 - Master | {{OP_TIER3}} | {{OP_TIER3_CHG}} | {{UPTIME_TIER3}}% | {{RATING_TIER3}} |
| **Total** | **{{OP_TOTAL}}** | — | — | — |

### Top 3 Operators (by performance score)

1. **{{OP1_NAME}}** (Tier ___) - Uptime: ___%, Rating: ___/5.0, Trips: ___
2. **{{OP2_NAME}}** (Tier ___) - Uptime: ___%, Rating: ___/5.0, Trips: ___
3. **{{OP3_NAME}}** (Tier ___) - Uptime: ___%, Rating: ___/5.0, Trips: ___

### At-Risk Operators (requiring intervention)

| Operator | Issue | Action Plan | Owner |
|----------|-------|-------------|-------|
| | | | |
| | | | |

---

## 4. Fleet Uptime & Reliability

### Fleet Overview

- **Total Vehicles Managed:** {{VEHICLES_TOTAL}}
- **Available Vehicles:** {{VEHICLES_AVAILABLE}} ({{VEHICLES_AVAILABLE_PCT}}%)
- **In Maintenance:** {{VEHICLES_MAINT}} ({{VEHICLES_MAINT_PCT}}%)
- **Out of Service:** {{VEHICLES_OOS}} ({{VEHICLES_OOS_PCT}}%)

### Uptime By Vehicle

| Vehicle ID | Uptime % | Maintenance Events | Downtime Hours |
|------------|----------|--------------------|----------------|
| | | | |
| | | | |
| | | | |

**Average Fleet Uptime:** {{UPTIME_PCT}}% (Target: >99.5%)

**Top Downtime Causes:**
1. ________________
2. ________________
3. ________________

---

## 5. Financial Summary

### Partner Compensation

| Component | Amount | Notes |
|-----------|--------|-------|
| Base Referral Fees | ${{BASE_FEES}} | ___ referrals × $___ each |
| Operator Performance Bonus | ${{PERF_BONUS}} | Tier-based multipliers |
| Fleet Uptime Incentive | ${{UPTIME_INC}} | Uptime >99.5% target |
| Other: ______________ | ${{OTHER_INC}} | |
| **Total Payout** | **${{TOTAL_PAYOUT}}** | |

### Cost of Partner Activities

- Marketing/Events: ${{COST_MARKETING}}
- Training Resources: ${{COST_TRAINING}}
- Support/Admin: ${{COST_ADMIN}}
- **Total Cost:** ${{TOTAL_COST}}

**Partner ROI:** {{ROI}}% (Total Payout ÷ Total Cost)

---

## 6. Training & Development

### Operators in Training Pipeline

| Trainee | Stage | Progress | Expected Certification |
|---------|-------|----------|------------------------|
| | Application → Interview | ___% | {{DATE}} |
| | Interview → Training | ___% | {{DATE}} |
| | Training → Exam | ___% | {{DATE}} |
| | Probation → Certified | ___% | {{DATE}} |

### Training Completion Rates

- **Onboarding Completion:** ___% (Target: 100%)
- **Certification Exam Pass Rate:** ___% (Target: >85%)
- **Average Training Duration:** ___ days

### Completed Training Sessions This Period

- Date: ___ - Topic: ___ - Attendees: ___
- Date: ___ - Topic: ___ - Attendees: ___

---

## 7. Partner Activities & Outreach

### Events Attended/Sponsored

| Event | Date | Location | Leads Generated | Cost |
|-------|------|----------|-----------------|------|
| | | | | |
| | | | | |
| | | | | |

### Marketing Materials Distributed

- Digital ads impressions: ___
- Email campaigns sent: ___ (open rate: ___%)
- Social media posts: ___
- Brochures/flyers: ___

### Partnership Developments

- New partner agreements signed: ___
- Existing partner renewals: ___
- Partnership meetings held: ___

---

## 8. Customer Feedback & Satisfaction

### Customer Ratings (by operator)

- **Average Rating:** {{AVG_RATING}}/5.0 (Target: >4.5)
- **5-Star Trip Percentage:** {{PCT_5STAR}}%
- **Complaints per 100 trips:** {{COMPLAINTS_PER_100}}

### Top Praise Themes
-
-
-

### Top Complaint Themes
-
-
-

---

## 9. Issues & Blockers

### Active Issues (past 30 days)

| Issue | Impact | Status | Owner | Resolution ETA |
|-------|--------|--------|-------|----------------|
| | | | | |
| | | | | |

### Escalated to Program Management
-
-

---

## 10. Action Plan & Next Period Goals

### Priorities for Next Period

1. **Recruitment Target:** ___ new certified operators
2. **Performance Improvement:** Address ___ at-risk operators
3. **Uptime Focus:** Reduce downtime for vehicles: ___
4. **Event/Outreach:** Attend ___ events, generate ___ leads
5. **Churn Reduction:** Implement ___ retention initiatives

### Resource Needs
-
-

### Key Milestones

| Date | Milestone | Owner |
|------|-----------|-------|
| | | |
| | | |

---

**Submitted by:** __________________ (Partner Manager)
**Date Submitted:** __________________

**Reviewed by:** __________________ (Fleet Program Director)
**Review Date:** __________________
145
tests/test_sov_bundle.py
Normal file
@@ -0,0 +1,145 @@
import json
import sys
import zipfile
from pathlib import Path

# Make the bundle script importable when running tests from the repo root
sys.path.insert(0, str(Path(__file__).parent.parent / "timmy-local" / "scripts"))

from create_sov_bundle import create_bundle, get_hermes_home


class TestSOVBundleCreation:
    """Test Sovereign Bundle (.sov) format creation and structure."""

    def test_bundle_creates_file(self, tmp_path):
        """A .sov bundle is created at the specified output path."""
        out = tmp_path / "test.sov"
        result = create_bundle(str(out))

        assert out.exists()
        assert result["output_path"] == str(out)
        assert result["file_size"] > 0
        assert result["hash"]
        assert len(result["hash"]) == 64  # SHA-256 hex digest

    def test_bundle_has_manifest(self, tmp_path):
        """Bundle must contain a valid manifest.json in the sov/ hierarchy."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            names = zf.namelist()
            assert "sov/manifest.json" in names
            manifest = json.loads(zf.read("sov/manifest.json"))
            assert manifest["version"] == "1.0"
            assert "bundle_id" in manifest
            assert "created_at" in manifest
            assert "components" in manifest

    def test_bundle_contains_soul(self, tmp_path):
        """Bundle includes SOUL.md from HERMES_HOME."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            names = zf.namelist()
            assert "sov/soul/SOUL.md" in names

            soul = zf.read("sov/soul/SOUL.md").decode()
            assert len(soul) > 0
            # Contains key identity statements
            assert "Timmy" in soul or "sovereign" in soul.lower()

    def test_bundle_contains_config(self, tmp_path):
        """Bundle includes agent config.yaml."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            assert "sov/config/config.yaml" in zf.namelist()
            cfg = zf.read("sov/config/config.yaml").decode()
            assert "model:" in cfg or "toolsets:" in cfg

    def test_bundle_contains_skills(self, tmp_path):
        """Skills component is tracked in the manifest (there may be zero custom skills)."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            skill_files = [n for n in zf.namelist() if n.startswith("sov/skills/") and n.endswith(".py")]
            # skill_files may be empty if no custom skills exist; just check the manifest key
            manifest = json.loads(zf.read("sov/manifest.json"))
            assert "skills" in manifest["components"]

    def test_bundle_metadata_is_valid_json(self, tmp_path):
        """META.json is present and contains required fields."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            meta = json.loads(zf.read("sov/META.json"))
            assert meta["format"] == "sov"
            assert meta["format_version"] == "1.0"
            assert "timestamp" in meta

    def test_bundle_is_deterministic(self, tmp_path):
        """Back-to-back bundles get distinct time-based IDs but structurally identical manifests."""
        out1 = tmp_path / "a.sov"
        out2 = tmp_path / "b.sov"
        import time
        create_bundle(str(out1))
        time.sleep(1.1)  # Ensure distinct timestamp
        create_bundle(str(out2))

        with zipfile.ZipFile(out1) as zf:
            mf1 = json.loads(zf.read("sov/manifest.json"))
        with zipfile.ZipFile(out2) as zf:
            mf2 = json.loads(zf.read("sov/manifest.json"))

        # Bundle IDs should differ (time-based) but all other fields structurally same
        assert mf1["bundle_id"] != mf2["bundle_id"], f"IDs: {mf1['bundle_id']} vs {mf2['bundle_id']}"
        assert mf1["version"] == mf2["version"]
        assert mf1["source_root"] == mf2["source_root"]

    def test_exclude_large_dirs_by_default(self, tmp_path):
        """Large directories (sessions, cache) are excluded by default."""
        out = tmp_path / "test.sov"
        create_bundle(str(out))

        with zipfile.ZipFile(out, 'r') as zf:
            names = zf.namelist()
            # Sessions dir must NOT be included when include_sessions=False
            session_entries = [n for n in names if "/sessions/" in n]
            assert len(session_entries) == 0

    def test_bundle_hash_is_sha256(self, tmp_path):
        """Returned hash is a valid SHA-256 hex string."""
        out = tmp_path / "test.sov"
        result = create_bundle(str(out))
        h = result["hash"]
        assert len(h) == 64
        # Validate hex
        int(h, 16)  # raises if not valid hex


class TestBundleManifest:
    """Validate manifest structure and completeness."""

    def test_manifest_requires_soul(self, tmp_path):
        """Soul component is tracked in the manifest if SOUL.md exists."""
        out = tmp_path / "test.sov"
        result = create_bundle(str(out))
        comp = result["manifest"].get("components", {})
        # If SOUL.md was present, the soul key should exist
        hermes = get_hermes_home()
        if (hermes / "SOUL.md").exists():
            assert "soul" in comp


if __name__ == "__main__":
    import pytest
    pytest.main([__file__, "-q"])
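The manifest checks the tests make can be exercised without the repo scripts at all. A minimal stdlib-only sketch (toy file names assumed for illustration) that builds a `sov/`-style archive and re-opens it with the same assertions:

```python
import json
import tempfile
import zipfile
from pathlib import Path

def build_toy_bundle(out: Path) -> None:
    """Write a minimal sov/-style archive: META.json + manifest.json + one payload."""
    manifest = {"version": "1.0", "bundle_id": "sov-demo", "components": {"soul": {}}}
    with zipfile.ZipFile(out, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("sov/META.json", json.dumps({"format": "sov", "format_version": "1.0"}))
        zf.writestr("sov/soul/SOUL.md", "# Identity\n")
        zf.writestr("sov/manifest.json", json.dumps(manifest))

def check_bundle(path: Path) -> dict:
    """Re-open the archive and apply the same structural assertions the tests make."""
    with zipfile.ZipFile(path) as zf:
        manifest = json.loads(zf.read("sov/manifest.json"))
        assert manifest["version"] == "1.0"
        assert "sov/soul/SOUL.md" in zf.namelist()
    return manifest

with tempfile.TemporaryDirectory() as d:
    p = Path(d) / "demo.sov"
    build_toy_bundle(p)
    m = check_bundle(p)
    print(m["bundle_id"])  # → sov-demo
```

Because `zipfile.writestr` accepts in-memory strings, the toy bundle never needs a real HERMES_HOME, which is why this kind of fixture is handy when the real test environment is unavailable.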
384
timmy-local/scripts/create_sov_bundle.py
Normal file
@@ -0,0 +1,384 @@
#!/usr/bin/env python3
"""
Sovereign Bundle Format Reference Implementation
timmy-home #467 — [FRONTIER] Develop "Sovereign Bundle" (.sov) Export/Import Logic

.sov format: ZIP-based archive with a verifiable manifest.
Structure:
  sov/
    manifest.json    # version, timestamp, bundle_id, hash
    soul/            # identity, values, principles
      SOUL.md
    config/          # agent configuration
      config.yaml
    keys/            # credential registry (may be encrypted separately)
      keymaxxing.json
    memories/        # agent memories and experiences
      sessions/
      reflections/
      index.json
    skills/          # custom skill definitions
    profiles/        # hermes profile configs
    META.json        # export metadata (agent, timestamp, source)
"""

import json
import os
import sys
import hashlib
import zipfile
from pathlib import Path
from datetime import datetime, timezone
from typing import Optional, Dict, Any


def get_hermes_home() -> Path:
    """Resolve HERMES_HOME from environment or default."""
    hermes_home = os.getenv("HERMES_HOME")
    if hermes_home:
        return Path(hermes_home).expanduser()
    return Path.home() / ".hermes"


def compute_bundle_hash(data: bytes) -> str:
    """SHA-256 hash of bundle contents for integrity verification."""
    return hashlib.sha256(data).hexdigest()


def collect_bundle_metadata() -> Dict[str, Any]:
    """Collect system and environment metadata for the bundle."""
    return {
        "hostname": os.uname().nodename if hasattr(os, 'uname') else "unknown",
        "platform": sys.platform,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "hermes_home": str(get_hermes_home()),
    }


def should_include(path: Path, relative: Path) -> bool:
    """Determine whether a path should be included in the bundle."""
    # Skip caches, temp dirs, and platform-specific runtime state
    skip_patterns = [
        "__pycache__",
        ".pyc", ".pyo",
        ".git/",
        ".pytest_cache",
        ".venv",
        "node_modules",
        "/cache/",
        "/tmp/",
        "logs/",
        "checkpoints/",
        "sandboxes/",
        "vps-backups/",
    ]
    path_str = str(relative)
    return not any(pat in path_str for pat in skip_patterns)


def create_bundle(output_path: str,
                  hermes_home: Optional[Path] = None,
                  include_sessions: bool = False,
                  compression: int = zipfile.ZIP_DEFLATED) -> Dict[str, Any]:
    """
    Create a .sov bundle at output_path.

    Params:
        output_path: Path to write the .sov file
        hermes_home: Override HERMES_HOME source (default: env)
        include_sessions: If True, bundle full session transcripts (heavy)
        compression: ZIP compression method

    Returns:
        Dict with bundle_id, output_path, file_size, hash, items, manifest
    """
    source_root = hermes_home or get_hermes_home()
    output = Path(output_path)
    output.parent.mkdir(parents=True, exist_ok=True)

    bundle_id = f"sov-{datetime.now(timezone.utc).strftime('%Y%m%d-%H%M%S')}"
    items_written = 0
    manifest = {
        "version": "1.0",
        "bundle_id": bundle_id,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "source_root": str(source_root),
        "components": {},
        "entries": [],
    }

    metadata = collect_bundle_metadata()

    with zipfile.ZipFile(output, 'w', compression=compression) as zf:
        # Write META.json
        meta_data = {
            **metadata,
            "bundle_id": bundle_id,
            "format": "sov",
            "format_version": "1.0",
        }
        zf.writestr("sov/META.json", json.dumps(meta_data, indent=2))
        items_written += 1

        # Soul — identity (SOUL.md)
        soul_src = source_root / "SOUL.md"
        if soul_src.exists():
            content = soul_src.read_text()
            zf.writestr("sov/soul/SOUL.md", content)
            manifest["components"]["soul"] = {"SOUL.md": {"size": len(content)}}
            items_written += 1

        # Config — agent configuration
        config_src = source_root / "config.yaml"
        if config_src.exists():
            content = config_src.read_text()
            zf.writestr("sov/config/config.yaml", content)
            manifest["components"]["config"] = {"config.yaml": {"size": len(content)}}
            items_written += 1

        # Keys — credential registry (encrypted or placeholder)
        keys_src = source_root / "keymaxxing" / "registry.json"
        if keys_src.exists():
            content = keys_src.read_text()
            zf.writestr("sov/keys/keymaxxing.json", content)
            manifest["components"]["keys"] = {"keymaxxing.json": {"size": len(content)}}
            items_written += 1

        # Memories — reflections (lightweight learnings)
        refl_dir = source_root / "reflections"
        if refl_dir.exists():
            refl_files = list(refl_dir.glob("*.md")) + list(refl_dir.glob("*.json"))
            for rf in refl_files:
                if should_include(rf, rf.relative_to(source_root)):
                    arcname = f"sov/memories/reflections/{rf.name}"
                    zf.writestr(arcname, rf.read_text())
                    items_written += 1
            manifest["components"].setdefault("memories", {})["reflections"] = {
                "count": len(refl_files)
            }

        # MemPalace — small memory store (~500KB)
        mp_dir = source_root / "mempalace"
        if mp_dir.exists():
            mp_count = 0
            for mf in mp_dir.rglob("*"):
                if mf.is_file() and should_include(mf, mf.relative_to(source_root)):
                    arcname = f"sov/memories/mempalace/{mf.relative_to(mp_dir)}"
                    zf.writestr(arcname, mf.read_bytes())
                    items_written += 1
                    mp_count += 1
            # setdefault avoids a KeyError when no reflections dir was bundled
            manifest["components"].setdefault("memories", {})["mempalace"] = {"count": mp_count}

        # Timmy world/agent files (~2KB) — agent identity in the Evennia world
        timmy_dir = source_root / "timmy"
        if timmy_dir.exists():
            timmy_count = 0
            for tf in timmy_dir.rglob("*"):
                if tf.is_file() and should_include(tf, tf.relative_to(source_root)):
                    arcname = f"sov/timmy/{tf.relative_to(timmy_dir)}"
                    zf.writestr(arcname, tf.read_bytes())
                    items_written += 1
                    timmy_count += 1
            manifest["components"]["timmy"] = {"files": timmy_count}

        # Sessions — optionally include transcripts (can be large)
        if include_sessions:
            sess_dir = source_root / "sessions"
            if sess_dir.exists():
                sess_files = list(sess_dir.glob("*.jsonl")) + list(sess_dir.glob("*.json"))
                for sf in sess_files:
                    if should_include(sf, sf.relative_to(source_root)):
                        arcname = f"sov/memories/sessions/{sf.name}"
                        zf.writestr(arcname, sf.read_text())
                        items_written += 1
                manifest["components"].setdefault("memories", {})["sessions"] = {
                    "count": len(sess_files)
                }

        # Skills — custom skill definitions (user-authored)
        skills_dir = source_root / "skills"
        if skills_dir.exists():
            skill_count = 0
            for skill_path in skills_dir.rglob("*.py"):
                if not skill_path.name.startswith('.') and should_include(skill_path, skill_path.relative_to(source_root)):
                    arcname = f"sov/skills/{skill_path.relative_to(skills_dir)}"
                    zf.writestr(arcname, skill_path.read_text())
                    items_written += 1
                    skill_count += 1
            manifest["components"]["skills"] = {"count": skill_count}

        # Profiles — hermes profile configs
        profiles_dir = source_root / "profiles"
        if profiles_dir.exists():
            profile_count = 0
            for pf in profiles_dir.glob("*.yaml"):
                if should_include(pf, pf.relative_to(source_root)):
                    arcname = f"sov/profiles/{pf.name}"
                    zf.writestr(arcname, pf.read_text())
                    items_written += 1
                    profile_count += 1
            manifest["components"]["profiles"] = {"count": profile_count}

        # Preferences (if stored separately)
        prefs_file = source_root / "preferences.json"
        if prefs_file.exists():
            zf.writestr("sov/config/preferences.json", prefs_file.read_text())
            items_written += 1

        # Write manifest.json
        zf.writestr("sov/manifest.json", json.dumps(manifest, indent=2))
        items_written += 1

    # Compute bundle hash after closing the zip
    bundle_bytes = output.read_bytes()
    bundle_hash = compute_bundle_hash(bundle_bytes)

    result = {
        "bundle_id": bundle_id,
        "output_path": str(output),
        "file_size": len(bundle_bytes),
        "hash": bundle_hash,
        "items": items_written,
        "manifest": manifest,
    }

    print(f"[SOV] Bundle created: {output}")
    print(f"  Items: {items_written}, Size: {len(bundle_bytes):,} bytes, SHA256: {bundle_hash[:16]}...")
    return result


def verify_bundle(bundle_path: str) -> Dict[str, Any]:
    """Verify a .sov bundle's integrity and manifest."""
    with zipfile.ZipFile(bundle_path, 'r') as zf:
        # Read manifest
        try:
            manifest = json.loads(zf.read("sov/manifest.json"))
        except KeyError:
            raise ValueError("Invalid .sov bundle: missing sov/manifest.json")
        except json.JSONDecodeError as e:
            raise ValueError(f"Invalid manifest JSON: {e}")

        items = len(zf.namelist())

    computed_hash = compute_bundle_hash(Path(bundle_path).read_bytes())

    return {
        "valid": True,
        "manifest": manifest,
        "items": items,
        "bundle_hash": computed_hash,
        "stored_hash": manifest.get("hash"),  # absent unless a hash was recorded after export
    }


def restore_bundle(bundle_path: str,
                   target_root: Optional[Path] = None,
                   dry_run: bool = False) -> Dict[str, Any]:
    """
    Restore a .sov bundle to target_root or the bundle's recorded source root.

    Params:
        bundle_path: Path to .sov file
        target_root: Restore location (default: HERMES_HOME source of bundle)
        dry_run: If True, validate only, do not extract

    Returns:
        Dict with restored paths and item count
    """
    verification = verify_bundle(bundle_path)
    manifest = verification["manifest"]

    if target_root is None:
        target_root = Path(manifest["source_root"])
    else:
        target_root = Path(target_root)

    if dry_run:
        # verification["items"] is already a count, not a list
        print(f"[SOV] DRY RUN: Would restore {verification['items']} items to {target_root}")
        return {"dry_run": True, "would_restore": verification["items"]}

    restored = []
    with zipfile.ZipFile(bundle_path, 'r') as zf:
        for name in zf.namelist():
            # Safety: only extract the sov/ namespace
            if not name.startswith("sov/"):
                continue
            rel = name[4:]  # strip "sov/"

            # Skip the manifest itself — used for tracking only
            if rel == "manifest.json":
                continue

            # Create parent dirs, then extract and write
            dest = target_root / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            dest.write_bytes(zf.read(name))
            restored.append(rel)

    print(f"[SOV] Restored {len(restored)} items to {target_root}")
    return {
        "restored": restored,
        "count": len(restored),
        "target": str(target_root),
    }


if __name__ == "__main__":
    import argparse

    p = argparse.ArgumentParser(description="Sovereign Bundle (.sov) export/import tool")
    sub = p.add_subparsers(dest="cmd", required=True)

    # Export
    exp = sub.add_parser("export", help="Create a .sov bundle")
    exp.add_argument("-o", "--output", default="timmy-sovereign-bundle.sov",
                     help="Output path for .sov file")
    exp.add_argument("--include-sessions", action="store_true",
                     help="Include full session transcripts (larger bundle)")
    exp.add_argument("--hermes-home", type=str,
                     help="Override HERMES_HOME source")

    # Import / restore
    imp = sub.add_parser("import", help="Restore from a .sov bundle")
    imp.add_argument("bundle", help="Path to .sov file")
    imp.add_argument("-t", "--target", help="Restore target (default: bundle's source)")
    imp.add_argument("--dry-run", action="store_true", help="Validate only")

    # Verify
    ver = sub.add_parser("verify", help="Verify bundle integrity")
    ver.add_argument("bundle", help="Path to .sov file")

    args = p.parse_args()

    if args.cmd == "export":
        result = create_bundle(
            output_path=args.output,
            hermes_home=Path(args.hermes_home).expanduser() if args.hermes_home else None,
            include_sessions=args.include_sessions,
        )
        print(json.dumps(result, indent=2))

    elif args.cmd == "import":
        result = restore_bundle(args.bundle, Path(args.target) if args.target else None,
                                dry_run=args.dry_run)
        if not args.dry_run:
            print(json.dumps(result, indent=2))

    elif args.cmd == "verify":
        info = verify_bundle(args.bundle)
        print(f"Bundle: {args.bundle}")
        print(f"  Valid: {info['valid']}")
        print(f"  Items: {info['items']}")
        print(f"  Hash: {info['bundle_hash']}")
        print(f"  Manifest version: {info['manifest'].get('version')}")
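The integrity hash reported at export time is plain SHA-256 over the finished archive bytes, recomputed by `verify`. A minimal stdlib-only sketch of that verification step, simulating a transfer to another machine (file names here are illustrative, not from the repo):

```python
import hashlib
import shutil
import tempfile
import zipfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large bundles need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "bundle.sov"
    with zipfile.ZipFile(src, "w") as zf:
        zf.writestr("sov/manifest.json", '{"version": "1.0"}')

    recorded = sha256_of(src)            # hash recorded at export time
    copied = Path(d) / "transferred.sov"
    shutil.copyfile(src, copied)         # simulate moving to another machine

    ok = sha256_of(copied) == recorded   # integrity check after transfer
    print("intact" if ok else "corrupted")  # → intact
```

Streaming the file through `hashlib.sha256` rather than calling `read_bytes()` is the usual choice once bundles include session transcripts, which can be far larger than the rest of the archive.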
182
timmy-local/scripts/restore_sov_bundle.py
Normal file
@@ -0,0 +1,182 @@
#!/usr/bin/env python3
|
||||
"""
|
||||
Restore agent state from a Sovereign Bundle (.sov) file.
|
||||
|
||||
Usage:
|
||||
python restore_sov_bundle.py <bundle.sov> [--target ~/.hermes] [--dry-run]
|
||||
"""
|
||||
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
import zipfile
|
||||
import argparse
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
|
||||
|
||||
def get_hermes_home() -> Path:
|
||||
hermes_home = os.getenv("HERMES_HOME")
|
||||
if hermes_home:
|
||||
return Path(hermes_home).expanduser()
|
||||
return Path.home() / ".hermes"
|
||||
|
||||
|
||||
def verify_bundle(bundle_path: str) -> dict:
|
||||
"""Verify .sov bundle integrity and return manifest."""
|
||||
with zipfile.ZipFile(bundle_path, 'r') as zf:
|
||||
# Require manifest
|
||||
try:
|
||||
mf = json.loads(zf.read("sov/manifest.json"))
|
||||
except KeyError:
|
||||
raise ValueError("Not a valid .sov bundle: missing sov/manifest.json")
|
||||
except json.JSONDecodeError as e:
|
||||
raise ValueError(f"Manifest JSON decode error: {e}")
|
||||
|
||||
return {
|
||||
"valid": True,
|
||||
"entries": zf.namelist(),
|
||||
"manifest": mf,
|
||||
"size": Path(bundle_path).stat().st_size,
|
||||
}
|
||||
|
||||
|
||||
def restore_bundle(bundle_path: str,
|
||||
target_root: Path = None,
|
||||
dry_run: bool = False) -> dict:
|
||||
"""
|
||||
Extract a .sov bundle to target_root.
|
||||
|
    Safety: only extracts entries under the sov/ namespace, and skips any
    entry whose path would escape the target directory.
    Note: existing files at the target ARE overwritten; pass dry_run=True
    to preview a restore first.
    """
    bundle = Path(bundle_path)
    if not bundle.exists():
        raise FileNotFoundError(f"Bundle not found: {bundle_path}")

    info = verify_bundle(bundle_path)
    manifest = info["manifest"]

    src_root = Path(manifest["source_root"])
    if target_root is None:
        target_root = src_root
    else:
        target_root = Path(target_root)

    print(f"[SOV] Bundle: {bundle_path}")
    print(f"  Source:  {src_root}")
    print(f"  Target:  {target_root}")
    print(f"  Created: {manifest.get('created_at')}")
    print(f"  Version: {manifest.get('version')}")

    if dry_run:
        sov_entries = [n for n in info["entries"]
                       if n.startswith("sov/") and n != "sov/manifest.json"]
        print(f"  DRY RUN: Would restore {len(sov_entries)} items")
        return {"dry_run": True, "count": len(sov_entries)}

    restored = []
    errors = []

    with zipfile.ZipFile(bundle_path, 'r') as zf:
        for name in sorted(zf.namelist()):
            if not name.startswith("sov/"):
                continue
            if name == "sov/manifest.json":
                continue  # Tracked separately

            rel = name[4:]  # strip "sov/" prefix
            # Guard against zip path traversal (e.g. "sov/../../etc/passwd")
            if ".." in Path(rel).parts:
                errors.append((rel, "unsafe path, skipped"))
                continue
            dest = target_root / rel
            dest.parent.mkdir(parents=True, exist_ok=True)

            try:
                data = zf.read(name)
                dest.write_bytes(data)
                restored.append(rel)
            except Exception as e:
                errors.append((rel, str(e)))

    print(f"\n[SOV] Restored {len(restored)} files to {target_root}")
    if errors:
        print(f"  Errors: {len(errors)}")
        for path, err in errors:
            print(f"    ✗ {path}: {err}")

    # Print a summary of restored components
    comp = manifest.get("components", {})
    for comp_name, details in comp.items():
        if isinstance(details, dict) and "count" in details:
            print(f"  {comp_name}: {details['count']}")
        elif isinstance(details, dict):
            print(f"  {comp_name}: {', '.join(details.keys())}")

    return {
        "restored": restored,
        "count": len(restored),
        "errors": errors,
        "target": str(target_root),
    }
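
# Example (hypothetical paths): preview a restore programmatically before
# committing to it. dry_run=True validates the bundle and reports what
# would be extracted without writing anything to disk.
#
#   result = restore_bundle("my-agent.sov", dry_run=True)
#   print(result["count"], "entries would be restored")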


def list_entries(bundle_path: str) -> None:
    """List all entries in a .sov bundle with sizes."""
    with zipfile.ZipFile(bundle_path, 'r') as zf:
        manifest = json.loads(zf.read("sov/manifest.json"))
        entries = sorted(n for n in zf.namelist() if n != "sov/manifest.json")

        print(f"Bundle ID: {manifest.get('bundle_id')}")
        print(f"Version:   {manifest.get('version')}")
        print(f"Created:   {manifest.get('created_at')}")
        print(f"Source:    {manifest.get('source_root')}")
        print(f"\nContents ({len(entries)} entries):\n")

        # Group entries by their top-level directory under sov/
        by_category = {}
        for e in entries:
            parts = e.split('/')
            cat = parts[1] if len(parts) > 1 else 'root'
            by_category.setdefault(cat, []).append(e)

        for cat in sorted(by_category):
            print(f"  [{cat}]")
            for e in by_category[cat]:
                info = zf.getinfo(e)
                print(f"    {e} ({info.file_size:,} bytes)")


if __name__ == "__main__":
    p = argparse.ArgumentParser(description="Restore Sovereign Bundle (.sov)")
    p.add_argument("bundle", nargs="?", help="Path to .sov file")
    p.add_argument("--target", "-t", type=str, help="Restore target directory")
    p.add_argument("--dry-run", action="store_true", help="Validate without extracting")
    p.add_argument("--list", "-l", action="store_true", help="List bundle contents")
    p.add_argument("--yes", "-y", action="store_true", help="Skip confirmation prompt")

    args = p.parse_args()

    if args.list:
        if not args.bundle:
            print("Usage: restore_sov_bundle.py --list <bundle.sov>")
            sys.exit(1)
        list_entries(args.bundle)
        sys.exit(0)

    if not args.bundle:
        p.print_help()
        sys.exit(1)

    bundle_path = args.bundle
    if not Path(bundle_path).exists():
        print(f"Error: Bundle not found: {bundle_path}")
        sys.exit(1)

    target = Path(args.target) if args.target else None

    # Safety prompt unless dry-run or --yes
    if not args.dry_run and not args.yes:
        t = target or get_hermes_home()
        resp = input(f"Restore to {t}? [y/N] ").strip().lower()
        if resp != 'y':
            print("Aborted.")
            sys.exit(0)

    result = restore_bundle(bundle_path, target_root=target, dry_run=args.dry_run)
    if result.get("errors"):
        sys.exit(1)
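
# Typical invocations (hypothetical bundle name; script path per the skill
# doc's timmy-local/scripts/ convention):
#
#   python restore_sov_bundle.py --list my-agent.sov
#   python restore_sov_bundle.py my-agent.sov --dry-run
#   python restore_sov_bundle.py my-agent.sov --target /tmp/restore -y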