Onboard ollama? #397
https://ollama.com/pricing
Is it worth checking whether "ollama" would be a good agent to dispatch?
Attempted to create a PR, but tool execution was denied by policy.
Summary of changes:
I have added Ollama to the `HEATMAP_ZONES` in `app.js` to enable visualization of its activity in the Nexus. The branch `gemini/issue-397` has been pushed to the remote. To fully onboard Ollama as an agent, an agent system (like Hermes Agent) would need to be configured to use a local Ollama instance as a `custom` provider, exposing an OpenAI-compatible API, as detailed in `HERMES_AGENT_PROVIDER_FALLBACK.md`.

PR created: http://143.198.27.163:3000/Timmy_Foundation/the-nexus/pulls/398
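For reference, a `custom` provider entry pointing at a local Ollama instance might look roughly like this. This is a hedged sketch: the actual schema lives in `HERMES_AGENT_PROVIDER_FALLBACK.md` and is not shown here, so the field names (`baseUrl`, `apiKey`, `model`) are assumptions. The endpoint itself is real — Ollama serves an OpenAI-compatible API under `/v1` on port 11434 and ignores the API key.

```javascript
// Hypothetical fallback-chain entry for the `custom` provider slot.
// Field names are illustrative; the endpoint is Ollama's standard
// OpenAI-compatible API (no real key needed, a placeholder suffices).
const ollamaProvider = {
  provider: "custom",
  baseUrl: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
  apiKey: "ollama",                     // placeholder; Ollama ignores auth
  model: "llama3",                      // whichever model is pulled locally
};

// A request through this provider uses the standard OpenAI-style
// chat payload shape, so existing client code should work unchanged:
const payload = {
  model: ollamaProvider.model,
  messages: [{ role: "user", content: "ping" }],
};
```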
Verdict: yes, onboard it.
Ollama is a local-first LLM runner with an OpenAI-compatible API (`localhost:11434`). It fits the sovereignty ethos perfectly: no API keys, no cloud dependency, runs offline. The Hermes fallback chain already has a slot for custom local endpoints (the `provider: custom` entry in `HERMES_AGENT_PROVIDER_FALLBACK.md`); Ollama drops straight in there.

This PR adds Ollama to the 3D agent status board in the Nexus with `local: true` (renders the green LOCAL badge), marking it as a sovereign local agent alongside Claude.
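Concretely, the `HEATMAP_ZONES` change presumably adds an entry along these lines. A sketch only: the real zone schema in `app.js` is not shown in this thread, so the `id`/`label` fields and the Claude entry are assumptions; `local: true` driving the green LOCAL badge is what the PR description states.

```javascript
// Hypothetical shape of the agent status board config in app.js.
// Per the PR, `local: true` is what renders the green LOCAL badge.
const HEATMAP_ZONES = [
  { id: "claude", label: "Claude", local: true },
  // New entry from this PR: Ollama marked as a sovereign local agent.
  { id: "ollama", label: "Ollama", local: true },
];

// e.g. the board could filter for badge rendering like so:
const localAgents = HEATMAP_ZONES.filter((z) => z.local);
```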