fix(cli): allow custom/local endpoints without API key

Local LLM servers (llama.cpp, ollama, vLLM, etc.) typically don't
require authentication. When a custom base_url is configured but no
API key is found, use a placeholder instead of failing with
'Provider resolver returned an empty API key.'

The OpenAI SDK accepts any string as api_key, and local servers
simply ignore the Authorization header.
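The fallback can be sketched as a small pure function. This is an illustrative sketch only; the helper name `resolve_api_key`, the placeholder value, and the OpenRouter substring check mirror the diff below but are not the actual cli.py API:

```python
# Hypothetical sketch of the fallback logic (resolve_api_key and
# PLACEHOLDER_KEY are illustrative names, not real cli.py helpers).
PLACEHOLDER_KEY = "no-key-required"

def resolve_api_key(api_key, base_url):
    """Return a usable API key, falling back to a placeholder for
    custom/local endpoints that don't require authentication."""
    if isinstance(api_key, str) and api_key:
        return api_key
    # A non-empty base_url that isn't the default hosted provider is
    # treated as a keyless local endpoint (llama.cpp, ollama, vLLM, ...).
    if isinstance(base_url, str) and base_url and "openrouter.ai" not in base_url:
        return PLACEHOLDER_KEY
    raise ValueError("Provider resolver returned an empty API key.")
```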

Fixes issue reported by @ThatWolfieGuy — llama.cpp stopped working
after updating because the new runtime provider resolver enforces
non-empty API keys even for keyless local endpoints.
Teknium 2026-03-22 16:08:21 -07:00
parent 1f21ef7488
commit 1b5fb36c9d

cli.py
@@ -1762,8 +1762,22 @@ class HermesCLI:
         resolved_acp_command = runtime.get("command")
         resolved_acp_args = list(runtime.get("args") or [])
         if not isinstance(api_key, str) or not api_key:
-            self.console.print("[bold red]Provider resolver returned an empty API key.[/]")
-            return False
+            # Custom / local endpoints (llama.cpp, ollama, vLLM, etc.) often
+            # don't require authentication. When a base_url IS configured but
+            # no API key was found, use a placeholder so the OpenAI SDK
+            # doesn't reject the request and local servers just ignore it.
+            _source = runtime.get("source", "")
+            _has_custom_base = isinstance(base_url, str) and base_url and "openrouter.ai" not in base_url
+            if _has_custom_base:
+                api_key = "no-key-required"
+                logger.debug(
+                    "No API key for custom endpoint %s (source=%s), "
+                    "using placeholder — local servers typically ignore auth",
+                    base_url, _source,
+                )
+            else:
+                self.console.print("[bold red]Provider resolver returned an empty API key.[/]")
+                return False
         if not isinstance(base_url, str) or not base_url:
             self.console.print("[bold red]Provider resolver returned an empty base URL.[/]")
             return False
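The `"openrouter.ai" not in base_url` substring test is a pragmatic heuristic; comparing the parsed hostname would be slightly stricter (it would not misclassify a URL that merely contains that string in its path). A minimal sketch of that alternative, assuming the same intent as the diff above (the helper name is hypothetical):

```python
from urllib.parse import urlparse

def is_custom_endpoint(base_url):
    """True when base_url points somewhere other than the hosted
    OpenRouter API, i.e. a likely keyless local/custom server."""
    if not isinstance(base_url, str) or not base_url:
        return False
    host = urlparse(base_url).hostname or ""
    # Match the apex domain and any subdomain of openrouter.ai.
    return host != "openrouter.ai" and not host.endswith(".openrouter.ai")
```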