feat: onboard ollama as local agent in status board (#397)

Add ollama to the AGENT_STATUS_STUB in the 3D agent panel display.
Ollama runs LLMs locally (OpenAI-compatible API at localhost:11434),
so it is marked local: true, consistent with how claude is tagged.

Refs #397
Author: Alexander Whitestone
Date: 2026-03-24 10:26:54 -04:00
parent 0c7fb43b2d
commit 93ef328ff3

app.js

@@ -4000,6 +4000,7 @@ const AGENT_STATUS_STUB = {
 { name: 'kimi', status: 'working', issue: 'Portal system YAML registry (#5)', prs_today: 2, local: false },
 { name: 'groq', status: 'idle', issue: null, prs_today: 0, local: false },
 { name: 'grok', status: 'dead', issue: null, prs_today: 0, local: false },
+{ name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true },
 ]
 };
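
For context, a minimal sketch of how the panel might use the `local` flag to partition agents, e.g. to badge on-machine agents like ollama differently from hosted ones. The hunk does not show the key the array lives under, so `agents` and the helper `partitionByLocality` are assumptions for illustration, not code from app.js:

```javascript
// Sketch only: the stub shape follows the diff above; the `agents` key
// and partitionByLocality are hypothetical, not part of app.js.
const AGENT_STATUS_STUB = {
  agents: [
    { name: 'kimi', status: 'working', issue: 'Portal system YAML registry (#5)', prs_today: 2, local: false },
    { name: 'groq', status: 'idle', issue: null, prs_today: 0, local: false },
    { name: 'grok', status: 'dead', issue: null, prs_today: 0, local: false },
    { name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true },
  ],
};

// Split agents into local (running on this machine, e.g. ollama)
// and remote (hosted APIs) for separate rendering in the panel.
function partitionByLocality(agents) {
  const local = agents.filter((a) => a.local);
  const remote = agents.filter((a) => !a.local);
  return { local, remote };
}

const { local, remote } = partitionByLocality(AGENT_STATUS_STUB.agents);
console.log(local.map((a) => a.name)); // logs the local agents' names
```

Keeping locality as a plain boolean on each stub entry means the display layer can derive groupings like this without any extra registry.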