fix: resolve opencode.ai context window to 1M and clean up display formatting

Two issues resolved:

1. Add opencode.ai to _URL_TO_PROVIDER mapping so base_url routes through
   models.dev lookup (which has mimo-v2-pro at 1M context) instead of
   falling back to probing /models (404) and defaulting to 128K.

2. Fix _format_context_length to round cleanly: 1048576 → '1M' instead
   of '1.048576M'. The same rounding logic applies to K values.

Author: Hunter B
Date: 2026-04-02 19:59:19 -05:00
Committed by: Teknium
Parent: 18140199c3
Commit: 894e8c8a8f
2 changed files, 9 insertions(+), 2 deletions(-)

@@ -295,10 +295,16 @@ def _format_context_length(tokens: int) -> str:
     """Format a token count for display (e.g. 128000 → '128K', 1048576 → '1M')."""
     if tokens >= 1_000_000:
         val = tokens / 1_000_000
-        return f"{val:g}M"
+        rounded = round(val)
+        if abs(val - rounded) < 0.05:
+            return f"{rounded}M"
+        return f"{val:.1f}M"
     elif tokens >= 1_000:
         val = tokens / 1_000
-        return f"{val:g}K"
+        rounded = round(val)
+        if abs(val - rounded) < 0.05:
+            return f"{rounded}K"
+        return f"{val:.1f}K"
     return str(tokens)