Pull Hermes 4 14B — inference (GGUF) + training (MLX) models #9
Goal
Replace all non-Hermes models with Hermes 4 14B as the new local Timmy brain.
Current State
57GB freed by deleting the qwen, glm, kimi, deepseek, and llama models. Only Hermes remains:
- `hermes3:8b` (4.7GB) — LoRA training base for v0-v0.2
- `hermes4:36b` (21GB) — manually imported, tight fit
- `timmy:v0.1-q4` (4.9GB) — first fine-tuned Timmy

Models to Pull
1. Inference (GGUF for Ollama)
   `bartowski/NousResearch_Hermes-4-14B-GGUF`
2. Training (MLX for LoRA/DPO)
   `mlx-community/Hermes-4-14B-4bit`

Steps
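The pull steps could look roughly like this (a hedged sketch: `huggingface-cli` and the quant filename `NousResearch_Hermes-4-14B-Q4_K_M.gguf` are assumptions — check the repo's actual file list before downloading):

```shell
# 1. Inference model (GGUF for Ollama) -- download only the chosen quant,
#    not the whole repo. Filename is a guess based on bartowski's naming.
huggingface-cli download bartowski/NousResearch_Hermes-4-14B-GGUF \
  NousResearch_Hermes-4-14B-Q4_K_M.gguf \
  --local-dir ~/models/hermes4-gguf

# 2. Training model (MLX 4-bit for LoRA/DPO) -- small enough to grab whole.
huggingface-cli download mlx-community/Hermes-4-14B-4bit \
  --local-dir ~/models/hermes4-mlx
```

Downloading a single quant file keeps the GGUF pull near ~9GB instead of the full multi-quant repo.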
Fallback: VPS + rsync
If the hotspot chokes on the ~9GB GGUF download:
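A minimal sketch of the fallback, assuming the VPS host and `/tmp` staging paths noted later in this thread (flags are conventional, not verified against this setup):

```shell
# Pull on the VPS's datacenter bandwidth, staging into /tmp
ssh root@143.198.27.163 \
  'huggingface-cli download bartowski/NousResearch_Hermes-4-14B-GGUF \
     --local-dir /tmp/hermes4-gguf'

# Then copy home over the hotspot with resume support:
# -a archive, -v verbose, -P = --partial --progress (survives dropped links)
rsync -avP root@143.198.27.163:/tmp/hermes4-gguf/ ~/models/hermes4-gguf/
```

`rsync --partial` matters here: if the hotspot drops mid-transfer, a re-run resumes instead of restarting the ~9GB copy.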
After Pull
- `hermes4:14b` responds correctly in Ollama
- `hermes4:36b` (21GB): retire once 14B is proven

Tags
Both repos tagged `pre-agent-workers-v1` as rollback points before this work.

Downloads running via VPS + rsync (2026-03-26 07:25)
VPS pulling both models on datacenter bandwidth. Background watcher on Mac rsyncs when ready.
- GGUF: `root@143.198.27.163:/tmp/hermes4-gguf/`
- MLX: `root@143.198.27.163:/tmp/hermes4-mlx/`
- Log: `~/.hermes/logs/hermes4-14b-download.log`

Remaining after download: create Ollama Modelfile, import, verify inference, update training config.
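The remaining Modelfile/import steps could be sketched as follows (the GGUF filename and `num_ctx` value are assumptions — verify the quant name and chat template against the model card):

```shell
# Hypothetical Ollama import once the GGUF has rsynced down
cat > Modelfile <<'EOF'
FROM ./NousResearch_Hermes-4-14B-Q4_K_M.gguf
PARAMETER num_ctx 8192
EOF

# Register the model under the tag the issue expects
ollama create hermes4:14b -f Modelfile

# Smoke-test inference before retiring the 36B
ollama run hermes4:14b "Say hello as Timmy."
```

Ollama reads the GGUF's embedded chat template by default; only add a `TEMPLATE` block to the Modelfile if the embedded one misbehaves.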
⚡ Dispatched to `claude`. Huey task queued.
⚡ Dispatched to `gemini`. Huey task queued.
⚡ Dispatched to `kimi`. Huey task queued.
⚡ Dispatched to `grok`. Huey task queued.
⚡ Dispatched to `perplexity`. Huey task queued.
🔧 `gemini` working on this via Huey. Branch: `gemini/issue-9`
🔧 `grok` working on this via Huey. Branch: `grok/issue-9`
⚠️ `grok` produced no changes for this issue. Skipping.

Closing during the 2026-03-28 backlog burn-down.
Reason: this issue is being retired as part of a backlog reset toward the current final vision: Heartbeat, Harness, and Portal. If the work still matters after the reset, it should return as a narrower, proof-oriented next-step issue rather than stay open as a broad legacy frontier.