test: make local llama.cpp the default runtime #77

Merged
Timmy merged 1 commit from feat/issue-73-local-default into main 2026-03-28 05:33:47 +00:00
Owner

Summary

  • make `config.yaml` default to the local llama.cpp runtime instead of Codex/cloud
  • pin the local custom provider model to `hermes4:14b`
  • add a regression test so the default runtime cannot silently drift back to cloud
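For context, a config defaulting to the local runtime might look like the fragment below. This is an illustrative sketch only; the actual keys and layout of the repository's `config.yaml` are assumptions, with only the runtime (llama.cpp) and the pinned model (`hermes4:14b`) taken from this PR.

```yaml
# config.yaml (illustrative sketch; key names are assumptions)
runtime: llama.cpp        # local default, no longer Codex/cloud
provider: local
model: hermes4:14b        # pinned local custom provider model
```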

Test Plan

  • python3 -m pytest tests/test_local_runtime_defaults.py -q
  • python3 -m pytest tests -q
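A regression test of the kind described could be sketched as follows. This is a hypothetical version of `tests/test_local_runtime_defaults.py`: the helper, the sample config, and all key names are assumptions, with only the runtime and model values taken from the PR summary. The sketch parses an inline config string rather than the repository's real `config.yaml` so it stays self-contained.

```python
"""Hypothetical sketch of tests/test_local_runtime_defaults.py (names assumed)."""


def parse_flat_yaml(text: str) -> dict:
    """Minimal 'key: value' parser, enough for a flat config sketch."""
    out = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line and ":" in line:
            key, _, value = line.partition(":")
            out[key.strip()] = value.strip()
    return out


# Inline stand-in for config.yaml (illustrative; real layout may differ).
SAMPLE_CONFIG = """\
runtime: llama.cpp   # local default
provider: local
model: hermes4:14b   # pinned model
"""


def test_default_runtime_is_local():
    # Guard against the default silently drifting back to a cloud runtime.
    cfg = parse_flat_yaml(SAMPLE_CONFIG)
    assert cfg["runtime"] == "llama.cpp"
    assert cfg["provider"] == "local"


def test_model_is_pinned():
    assert parse_flat_yaml(SAMPLE_CONFIG)["model"] == "hermes4:14b"
```

Checking the parsed values rather than raw substrings keeps the test robust to comments and whitespace changes in the config file.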

Closes #73

Timmy added 1 commit 2026-03-28 05:33:46 +00:00
Timmy merged commit f263156cf1 into main 2026-03-28 05:33:47 +00:00
Timmy deleted branch feat/issue-73-local-default 2026-03-28 05:33:48 +00:00