Local/custom OpenAI-compatible providers (Ollama, LM Studio, vLLM) that don't require authentication were being rejected by the OpenAI SDK for an empty api_key, especially when used as smart model routing targets. This reuses the same 'no-key-required' placeholder that _resolve_openrouter_runtime() already uses for the identical scenario.

Salvaged from PR #3543.

Co-authored-by: scottlowry <scottlowry@users.noreply.github.com>
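A minimal sketch of the fix described above, assuming a small key-resolution helper; the names `resolve_api_key` and `NO_KEY_PLACEHOLDER` are illustrative and not the project's actual identifiers:

```python
# Illustrative sketch only: the OpenAI SDK raises when api_key is empty or
# None, so auth-less local endpoints (Ollama, LM Studio, vLLM) get a dummy
# placeholder value that the server will simply ignore.
NO_KEY_PLACEHOLDER = "no-key-required"

def resolve_api_key(configured_key):
    """Return the configured key, or the placeholder for auth-less providers."""
    return configured_key if configured_key else NO_KEY_PLACEHOLDER

# No key configured (None or empty string) falls back to the placeholder:
print(resolve_api_key(None))    # -> no-key-required
print(resolve_api_key(""))      # -> no-key-required
# A real key passes through untouched:
print(resolve_api_key("sk-x"))  # -> sk-x
```

The resulting string is then passed as `api_key` when constructing the SDK client, which satisfies the SDK's non-empty check without requiring users of local providers to invent a key themselves.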