fix(model): preserve custom endpoint credentials and accept cloud models not in /v1/models

When switching models on a custom endpoint (ollama-launch):
- Same-provider switches no longer re-resolve credentials (fixes base_url
  being lost for 'custom' provider on subsequent switches)
- Named providers (ollama-launch) are resolved via user_providers so
  switch_model can find their base_url from config
- Models not in the /v1/models probe but present in the user's saved
  provider config are accepted with a warning instead of rejected
- CLI /model and TUI /model both pass user_providers/custom_providers
  to switch_model so the config model list is available for validation

Closes #15088
Author: kshitijk4poor
Date:   2026-04-25 14:10:42 +05:30
parent  e5647d7863
commit  0d3d2a2631
4 changed files with 56 additions and 24 deletions

@@ -2571,8 +2571,8 @@ def validate_requested_model(
     )
     return {
-        "accepted": False,
-        "persist": False,
+        "accepted": True,
+        "persist": True,
         "recognized": False,
         "message": message,
     }