fix(config): correct fallback model from kimi-for-coding to kimi-k2.5 #222

Closed
Timmy wants to merge 3 commits from fix/kimi-fallback-model into main

3 Commits

Author SHA1 Message Date
3433b8514a fix(kimi): purge kimi-for-coding from model lists, tests, docs (#lazzyPit)
Some checks failed
Forge CI / smoke-and-build (pull_request) Failing after 45s
The kimi-for-coding model triggers 401/403 access-terminated errors.
Apply the workaround consistently:
- Remove it from _PROVIDER_MODELS['kimi-coding'] and coding-plan selection
- Update tests to expect kimi-k2.5 instead
- Update docs and reports
- Live config on Beta VPS also corrected
2026-04-07 16:13:12 +00:00
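The purge described above can be sketched as follows. Only `_PROVIDER_MODELS` and the `'kimi-coding'` key come from the commit message; the selection helper and its name are hypothetical, shown just to illustrate why removing the revoked entry makes `kimi-k2.5` the effective choice.

```python
# Hypothetical sketch of the change: drop the revoked 'kimi-for-coding'
# entry so 'kimi-k2.5' becomes the model picked for the coding plan.
_PROVIDER_MODELS = {
    # before: ["kimi-for-coding", "kimi-k2.5"]
    "kimi-coding": ["kimi-k2.5"],
}

def select_coding_model(provider: str) -> str:
    """Return the first configured model for a provider (illustrative helper)."""
    models = _PROVIDER_MODELS.get(provider, [])
    if not models:
        raise ValueError(f"no models configured for provider {provider!r}")
    return models[0]
```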
a8eb7dfbad feat(provider): first-class Ollama support + Gemma 4 defaults (#169)
Some checks failed
Forge CI / smoke-and-build (pull_request) Failing after 32s
- Add 'ollama' to CLI provider choices and auth aliases
- Wire Ollama through resolve_provider_client with auto-detection
- Add _try_ollama to auxiliary fallback chain (before local/custom)
- Add ollama to vision provider order
- Update model_metadata.py: ollama prefix + gemma-4-* context lengths (256K)
- Default model: gemma4:12b when provider=ollama
2026-04-07 15:55:50 +00:00
dd0fa2d1a1 fix(config): correct fallback model from kimi-for-coding to kimi-k2.5
All checks were successful
Forge CI / smoke-and-build (pull_request) Successful in 47s
The kimi-for-coding model triggers 403 access-terminated errors.
Switch the fallback config to kimi-k2.5, which is valid on Hermes gateways.

Refs: #lazzyPit
2026-04-07 15:40:00 +00:00
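A minimal sketch of the config correction, assuming a simple dict-based fallback config; the `FALLBACK_CONFIG` name and `"model"` key are hypothetical, while the old and new model IDs come from the commit message.

```python
# Hypothetical fallback config fragment illustrating the fix.
FALLBACK_CONFIG = {
    # "model": "kimi-for-coding",  # old value: triggered 403 access-terminated
    "model": "kimi-k2.5",          # valid on Hermes gateways, per the commit
}
```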