fix: use placeholder api_key for custom providers without credentials (#3604)
Local/custom OpenAI-compatible providers (Ollama, LM Studio, vLLM) that don't require auth were hitting empty-api_key rejections from the OpenAI SDK, especially when used as smart model routing targets. This change uses the same 'no-key-required' placeholder already used in _resolve_openrouter_runtime() for the identical scenario.

Salvaged from PR #3543.

Co-authored-by: scottlowry <scottlowry@users.noreply.github.com>
@@ -203,7 +203,7 @@ def _resolve_named_custom_runtime(
             or _detect_api_mode_for_url(base_url)
             or "chat_completions",
         "base_url": base_url,
-        "api_key": api_key,
+        "api_key": api_key or "no-key-required",
         "source": f"custom_provider:{custom_provider.get('name', requested_provider)}",
     }