rescue: add ollama as first-class provider for local model inference (cherry-pick from #170) #179

Open
claude wants to merge 1 commit from rescue/ollama-provider into main
Member

Summary

Rescue cherry-pick of the meaningful changes from PR #170 (timmy/issue-169-ollama-provider), which had drifted 466 commits from main and was unmergeable.

Changes cherry-picked

  • feat: add ollama as first-class provider for local model inference — adds hermes_cli/providers.py, extends hermes_cli/models.py with ollama entries, and adds ollama to the --provider CLI choices

Conflict resolution

The hermes_cli/main.py provider choices list was updated to include both gemini and ollama from the cherry-pick. The providers.py file is kept as it contains the Hermes overlay for ollama.

Original PR

PR #170 was blocked by 466 commits of drift, with 674 files changed on the original branch. This cherry-pick extracts only the single relevant commit.

Fixes #169
Refs #170

Part of org-wide PR hygiene effort — see Timmy_Foundation/the-nexus#916
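As a rough sketch of the provider wiring this cherry-pick adds: the actual layout of `hermes_cli/auth.py` is not shown in this PR, so the names below (`ollama_provider_entry`, the dict keys) are illustrative assumptions, not the real implementation.

```python
import os

# Default endpoint for Ollama's OpenAI-compatible API, per the commit message.
OLLAMA_DEFAULT_BASE_URL = "http://localhost:11434/v1"


def ollama_provider_entry() -> dict:
    """Build a hypothetical ollama provider config entry.

    Ollama needs no authentication, so a dummy "ollama" token is used to
    satisfy credential checks; both values can be overridden via the
    OLLAMA_BASE_URL / OLLAMA_API_KEY environment variables, as described
    in the commit message.
    """
    return {
        "base_url": os.environ.get("OLLAMA_BASE_URL", OLLAMA_DEFAULT_BASE_URL),
        "api_key": os.environ.get("OLLAMA_API_KEY", "ollama"),
    }
```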

perplexity was assigned by claude 2026-04-07 06:21:49 +00:00
claude added 1 commit 2026-04-07 06:21:50 +00:00
feat: add ollama as first-class provider for local model inference (#169)
Some checks failed
Forge CI / smoke-and-build (pull_request) Failing after 2s
e852ec3533
Add 'ollama' as a recognized inference provider so local models (Gemma4,
Hermes3, Hermes4) can run through the agent harness without falling back
to OpenRouter.

Changes:
- hermes_cli/auth.py: Add ollama to PROVIDER_REGISTRY with
  base_url=http://localhost:11434/v1, dummy API key fallback (ollama
  needs no auth), remove 'ollama' -> 'custom' alias
- hermes_cli/main.py: Add 'ollama' to --provider choices
- hermes_cli/models.py: Add ollama model catalog (gemma4, hermes3,
  hermes4, llama3.1, qwen2.5-coder, etc.), label, and provider order
- hermes_cli/providers.py: Add HermesOverlay for ollama, remove
  'ollama' -> 'ollama-cloud' alias

Usage:
  hermes chat -m gemma4 --provider ollama
  hermes --profile gemma4-local chat -q 'hello'

Ollama exposes an OpenAI-compatible API at localhost:11434/v1.
No API key required (dummy 'ollama' token used for credential checks).
Override with OLLAMA_BASE_URL or OLLAMA_API_KEY env vars.

Closes #169
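The `--provider` choices change in `hermes_cli/main.py` can be sketched with argparse as below. The real parser and the full choices list are not part of this excerpt, so `PROVIDERS` and `build_parser` are assumed names; only `gemini` and `ollama` are confirmed by the PR text, and `openrouter` is inferred from the fallback behavior the commit describes.

```python
import argparse

# Assumed choices list: openrouter (the fallback the commit mentions),
# plus the gemini and ollama entries named in the conflict resolution.
PROVIDERS = ["openrouter", "gemini", "ollama"]


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical sketch of the hermes CLI argument parser."""
    parser = argparse.ArgumentParser(prog="hermes")
    # Restricting --provider to known names makes an unknown provider
    # fail at parse time instead of at request time.
    parser.add_argument("--provider", choices=PROVIDERS, default="openrouter")
    parser.add_argument("-m", "--model", default=None)
    return parser
```

With this shape, `hermes chat -m gemma4 --provider ollama` parses cleanly, while a typo like `--provider olama` is rejected by argparse before any network call.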
claude requested review from perplexity 2026-04-07 06:21:50 +00:00

Checkout

From your project repository, check out a new branch and test the changes.
git fetch -u origin rescue/ollama-provider:rescue/ollama-provider
git checkout rescue/ollama-provider

Reference: Timmy_Foundation/hermes-agent#179