fix: resolve three high-impact community bugs (#5819, #6893, #3388)

Matrix gateway: fix sync loop never dispatching events (#5819)
- _sync_loop() called client.sync() but never called handle_sync()
  to dispatch events to registered callbacks — _on_room_message was
  registered but never fired for new messages
- Store next_batch token from initial sync and pass as since= to
  subsequent incremental syncs (was doing full initial sync every time)
- 17 comments, confirmed by multiple users on matrix.org
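The fixed loop can be sketched as below. The names `_sync_loop`, `handle_sync`, and the `next_batch`/`since` handling come from the commit description; the stub client and the surrounding `MatrixGateway` class are hypothetical stand-ins for illustration, not the real gateway code.

```python
import asyncio

class _StubClient:
    """Hypothetical stand-in for the real Matrix client."""
    def __init__(self):
        self.calls = []

    async def sync(self, since=None, timeout=30000):
        # Record the since= token we were given and return a fake response.
        self.calls.append(since)
        return {"next_batch": f"batch-{len(self.calls)}", "rooms": {}}

class MatrixGateway:
    def __init__(self, client):
        self.client = client
        self._next_batch = None
        self.dispatched = []

    def handle_sync(self, response):
        # Dispatch events to registered callbacks (e.g. _on_room_message).
        self.dispatched.append(response)

    async def _sync_loop(self, iterations=3):
        for _ in range(iterations):
            # Pass the stored token so each pass after the first is an
            # incremental sync, not a full initial sync.
            response = await self.client.sync(since=self._next_batch)
            self._next_batch = response["next_batch"]
            # The missing piece from #5819: actually dispatch the events.
            self.handle_sync(response)

gateway = MatrixGateway(_StubClient())
asyncio.run(gateway._sync_loop())
```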

Feishu docs: add interactive card configuration for approvals (#6893)
- Error 200340 is a Feishu Developer Console configuration issue,
  not a code bug — users need to enable Interactive Card capability
  and configure Card Request URL
- Added required 3-step setup instructions to feishu.md
- Added troubleshooting entry for error 200340
- 17 comments from Feishu users

Copilot provider drift: detect GPT-5.x Responses API requirement (#3388)
- GPT-5.x models are rejected on /v1/chat/completions by both OpenAI
  and OpenRouter (unsupported_api_for_model error)
- Added _model_requires_responses_api() to detect models needing
  Responses API regardless of provider
- Applied in __init__ (covers OpenRouter primary users) and in
  _try_activate_fallback() (covers Copilot->OpenRouter drift)
- Fixed stale comment claiming the gateway creates fresh agents per message
  (agents have been cached via _agent_cache since caching was introduced)
- 7 comments, reported on Copilot+Telegram gateway
Teknium 2026-04-11 11:07:05 -07:00
parent f459214010
commit 6101be6db4
5 changed files with 95 additions and 14 deletions

@@ -222,6 +222,12 @@ def test_api_mode_normalizes_provider_case(monkeypatch):
+def test_api_mode_respects_explicit_openrouter_provider_over_codex_url(monkeypatch):
+    """GPT-5.x models need codex_responses even on OpenRouter.
+    OpenRouter rejects GPT-5 models on /v1/chat/completions with
+    ``unsupported_api_for_model``. The model-level check overrides
+    the provider default.
+    """
     _patch_agent_bootstrap(monkeypatch)
     agent = run_agent.AIAgent(
         model="gpt-5-codex",
@@ -233,7 +239,7 @@ def test_api_mode_respects_explicit_openrouter_provider_over_codex_url(monkeypat
         skip_context_files=True,
         skip_memory=True,
     )
-    assert agent.api_mode == "chat_completions"
+    assert agent.api_mode == "codex_responses"
     assert agent.provider == "openrouter"