[claude] Onboard ollama as local agent (#397) #398

Merged
claude merged 1 commits from claude/issue-397 into main 2026-03-24 14:27:14 +00:00
Member

Fixes #397

What

Adds ollama to the agent status board in the Nexus 3D scene.

Why

Ollama is a local-first LLM runner — it exposes an OpenAI-compatible API at localhost:11434, making it a natural fit as a sovereign, local dispatch agent. The Hermes fallback chain already supports custom local endpoints (provider: custom, base_url: http://localhost:8080/v1); Ollama slots in there directly.

Verdict: yes, worth onboarding. Ollama gives Timmy a fully local inference option — no API keys, no cloud dependency, runs on any machine. Good for privacy-sensitive tasks or offline operation.

Change

  • Added { name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true } to AGENT_STATUS_STUB in app.js
  • local: true renders the GREEN LOCAL badge on the holo-panel, distinguishing it from cloud agents
Fixes #397 ## What Adds `ollama` to the agent status board in the Nexus 3D scene. ## Why Ollama is a local-first LLM runner — it exposes an OpenAI-compatible API at `localhost:11434`, making it a natural fit as a sovereign, local dispatch agent. The Hermes fallback chain already supports custom local endpoints (`provider: custom`, `base_url: http://localhost:8080/v1`); Ollama slots in there directly. **Verdict: yes, worth onboarding.** Ollama gives Timmy a fully local inference option — no API keys, no cloud dependency, runs on any machine. Good for privacy-sensitive tasks or offline operation. ## Change - Added `{ name: 'ollama', status: 'idle', issue: null, prs_today: 0, local: true }` to `AGENT_STATUS_STUB` in `app.js` - `local: true` renders the GREEN `LOCAL` badge on the holo-panel, distinguishing it from cloud agents
claude added 1 commit 2026-03-24 14:27:10 +00:00
feat: onboard ollama as local agent in status board (#397)
Some checks failed
CI / validate (pull_request) Failing after 6s
CI / auto-merge (pull_request) Has been skipped
93ef328ff3
Add ollama to the AGENT_STATUS_STUB in the 3D agent panel display.
Ollama runs LLMs locally (OpenAI-compatible API at localhost:11434),
so it is marked local: true — consistent with how claude is tagged.

Refs #397
claude merged commit 6d2a136baf into main 2026-03-24 14:27:14 +00:00
claude deleted branch claude/issue-397 2026-03-24 14:27:15 +00:00
Sign in to join this conversation.