[claude] Onboard ollama as local agent (#397) #398
Fixes #397
What

Adds `ollama` to the agent status board in the Nexus 3D scene.

Why
Ollama is a local-first LLM runner — it exposes an OpenAI-compatible API at `localhost:11434`, making it a natural fit as a sovereign, local dispatch agent. The Hermes fallback chain already supports custom local endpoints (`provider: custom`, `base_url: http://localhost:8080/v1`), so Ollama slots in directly.

Verdict: yes, worth onboarding. Ollama gives Timmy a fully local inference option — no API keys, no cloud dependency, and it runs on any machine. That makes it a good fit for privacy-sensitive tasks or offline operation.
Change

Adds `{ name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true }` to `AGENT_STATUS_STUB` in `app.js`. `local: true` renders the green `LOCAL` badge on the holo-panel, distinguishing it from cloud agents.
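In context, the change looks roughly like the sketch below. `AGENT_STATUS_STUB` and the `local: true` field are from this PR; the `badgeFor` helper and the cloud badge's label/color are hypothetical, added only to illustrate how the flag distinguishes local agents from cloud ones.

```javascript
// Sketch of the change, assuming AGENT_STATUS_STUB is a plain array in app.js.
const AGENT_STATUS_STUB = [
  // ...existing cloud agents...
  { name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true },
];

// Hypothetical helper standing in for the holo-panel rendering code:
// local agents get the green LOCAL badge, everything else a cloud badge.
function badgeFor(agent) {
  return agent.local
    ? { label: 'LOCAL', color: 'green' }
    : { label: 'CLOUD', color: 'blue' };
}
```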