[GEMINI-03] llama.cpp fleet manager — start, stop, monitor, swap models across all machines #401
Part of Epic: #398
We have llama-server on 4 machines. No unified way to manage them.
Build a fleet manager that can start, stop, monitor, and swap models across all machines.
This is what Timmy currently does ad hoc with SSH one-liners. Make it a proper tool.
Acceptance Criteria
- `fleet-llama.py status` shows all 4 servers
- `fleet-llama.py restart bezalel` restarts via SSH
- `fleet-llama.py swap hermes qwen2.5-coder-7b` downloads and serves
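A rough sketch of what the CLI could look like. Only `bezalel` and `hermes` are named in this issue, so the other two hostnames, the port, the systemd unit name, and the model-swap mechanics below are all assumptions, not a spec:

```python
#!/usr/bin/env python3
"""fleet-llama.py sketch — status / restart / swap for a llama-server fleet."""
import subprocess
import sys

PORT = 8080  # assumed llama-server port, same on every box
# bezalel and hermes come from the issue; the last two are placeholders.
HOSTS = ["bezalel", "hermes", "machine3", "machine4"]

def status_cmd(host: str) -> list[str]:
    # llama-server exposes a /health endpoint; probe it with a short timeout.
    return ["curl", "-sf", "-m", "5", f"http://{host}:{PORT}/health"]

def restart_cmd(host: str) -> list[str]:
    # Assumes each machine runs llama-server as a systemd unit of that name.
    return ["ssh", host, "sudo", "systemctl", "restart", "llama-server"]

def swap_cmd(host: str, model: str) -> list[str]:
    # Hypothetical swap: llama-server can pull GGUF models from the HF hub
    # via -hf; wiring the new model into the systemd unit is left out here.
    return ["ssh", host, f"llama-server -hf {model} --port {PORT}"]

def main(argv: list[str]) -> int:
    if not argv:
        print("usage: fleet-llama.py status | restart HOST | swap HOST MODEL")
        return 1
    if argv[0] == "status":
        for host in HOSTS:
            up = subprocess.run(status_cmd(host), capture_output=True).returncode == 0
            print(f"{host}: {'up' if up else 'DOWN'}")
    elif argv[0] == "restart":
        subprocess.run(restart_cmd(argv[1]), check=True)
    elif argv[0] == "swap":
        subprocess.run(swap_cmd(argv[1], argv[2]), check=True)
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
```

Keeping the command builders as pure functions makes them easy to unit-test without touching SSH, and a health probe over HTTP avoids an SSH round-trip for `status`.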