fix(cli): support model validation for anthropic_messages and Cloudflare-protected endpoints

- probe_api_models: add an api_mode param; use x-api-key + anthropic-version
  headers in anthropic_messages mode (the auth scheme of Anthropic's native
  Models API; see the sketch below)
- probe_api_models: send a User-Agent header to avoid Cloudflare 403 blocks
  on third-party OpenAI-compatible endpoints
- validate_requested_model: pass api_mode through from switch_model
- validate_requested_model: in anthropic_messages mode, attempt the probe
  with the correct auth; if the probe fails (many proxies don't implement
  /v1/models), accept the model with an informational warning instead of
  rejecting it (sketched after the test diff below)
- fetch_api_models: propagate api_mode to probe_api_models
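
For reference, a minimal sketch of the probing logic described above. It
assumes the requests library; the exact signature, the appended /v1 path,
and the User-Agent string are illustrative, not lifted from this repo:

    import requests

    def probe_api_models(base_url, api_key, api_mode="openai"):
        """Probe the endpoint's model list; return the advertised model IDs."""
        if api_mode == "anthropic_messages":
            # Anthropic's native Models API authenticates with x-api-key
            # plus an anthropic-version header, not a Bearer token.
            headers = {"x-api-key": api_key,
                       "anthropic-version": "2023-06-01"}
        else:
            headers = {"Authorization": f"Bearer {api_key}"}
        # Some Cloudflare-fronted proxies return 403 for requests that carry
        # no User-Agent, so always send one (the value here is illustrative).
        headers["User-Agent"] = "Mozilla/5.0 (compatible; cli-model-probe)"
        resp = requests.get(f"{base_url.rstrip('/')}/v1/models",
                            headers=headers, timeout=10)
        resp.raise_for_status()
        return [m["id"] for m in resp.json().get("data", [])]
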
wangshengyang2004 2026-04-20 17:47:00 +08:00 committed by Teknium
parent 25465fd8d7
commit 647900e813
4 changed files with 76 additions and 8 deletions


@@ -566,8 +566,11 @@ class TestValidateApiFallback:
             base_url="http://localhost:8000",
         )
+        # Unreachable /models on a custom endpoint no longer hard-rejects —
+        # the model is persisted with a warning so Cloudflare-protected /
+        # proxy endpoints that don't expose /models still work. See #12950.
         assert result["accepted"] is False
-        assert result["persist"] is False
-        assert "http://localhost:8000/v1/models" in result["message"]
+        assert result["persist"] is True
+        assert "http://localhost:8000/v1" in result["message"]