mirror of
https://github.com/NousResearch/hermes-agent.git
synced 2026-05-08 03:01:47 +00:00
* revert(gateway): remove stale-code self-check and auto-restart

  Removes the _detect_stale_code / _trigger_stale_code_restart mechanism introduced in #17648 and iterated in #19740. On every incoming message the gateway compared the boot-time git HEAD SHA to the current SHA on disk; if they differed, it replied with "Gateway code was updated in the background -- restarting this gateway so your next message runs on the new code. Please retry in a moment." and then kicked off a graceful restart.

  This is unwanted behaviour: users who run a long-lived gateway and do their own ad-hoc git operations on the checkout end up with their chat interrupted and the current message dropped every time HEAD moves, with no way to opt out. If an operator really needs the old protection against stale sys.modules after "hermes update", the SIGKILL-survivor sweep in hermes update (hermes_cli/main.py, also tagged #17648) already handles the supervisor-respawn case on its own.

  Removed from gateway/run.py:
  - _STALE_CODE_SENTINELS, _GIT_SHA_CACHE_TTL_SECS
  - _read_git_head_sha(), _compute_repo_mtime() module helpers
  - class-level _boot_wall_time / _boot_repo_mtime / _boot_git_sha / _stale_code_restart_triggered defaults
  - __init__ boot-snapshot block (_boot_*, _cached_current_sha*, _repo_root_for_staleness, _stale_code_notified)
  - _current_git_sha_cached(), _detect_stale_code(), _trigger_stale_code_restart() methods
  - stale-code check + user-facing restart notice at the top of _handle_message()

  Also deleted: tests/gateway/test_stale_code_self_check.py (412 lines).

  No new logic added. Zero remaining references to any removed symbol. The gateway test suite passes the same 4589 tests it passed before; the 3 pre-existing unrelated failures (discord free-channel, feishu bot admission, teams typing) are unchanged by this commit.

* fix(agent): stateful streaming scrubber for reasoning-block leaks (#17924)

  Per-delta _strip_think_blocks ran at _fire_stream_delta and destroyed downstream state. When MiniMax-M2.7 / DeepSeek / Qwen3 streamed a tag split across deltas (delta1='<think>', delta2='Let me check'), the regex case-2 match erased delta1 entirely, so CLI/gateway state machines never learned a block was open and leaked delta2 as content. Raw consumers (ACP, api_server, TTS) had no downstream defense at all.

  Replace the per-delta regex with a stateful StreamingThinkScrubber that survives delta boundaries:
  - Closed <tag>X</tag> pairs are always stripped (matches _strip_think_blocks case 1).
  - An unterminated open tag at a block boundary enters a block; content is discarded until the close tag arrives. At end-of-stream, held content is dropped.
  - Orphan close tags are stripped without boundary gating.
  - Partial tags at delta boundaries are held back until resolved.
  - The block-boundary rule (start-of-stream, after \n, or whitespace-only since the last \n) preserves prose that merely mentions tag names.

  Reset at turn start alongside the existing context scrubber; flush at turn end so a benign '<' held back at end-of-stream reaches the UI.

  E2E-verified on live OpenRouter->MiniMax-m2 streams: closed pairs strip cleanly, the first word of post-block content is preserved, and pure content passes through unchanged. Stefan's screenshot case (#17924), 'Let me check' getting chopped to ' me check', no longer happens.

  Final _strip_think_blocks calls on completed strings (final_response, replay, compression) are preserved; only the streaming per-delta call site switched to the scrubber.
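The bullet semantics above can be sketched as a minimal stateful scrubber. This is an illustrative reduction assuming a single `<think>`/`</think>` pair; the real StreamingThinkScrubber also handles multiple tag names, orphan close tags, and the block-boundary rule, and the class name here is not the shipped one:

```python
class ThinkScrubberSketch:
    """Minimal stateful scrubber: strips <think>...</think> even when the
    tags are split across streamed deltas, holding back partial tags."""

    OPEN, CLOSE = "<think>", "</think>"

    def __init__(self):
        self.reset()

    def reset(self):
        self._in_block = False
        self._held = ""  # possible partial tag carried across delta boundaries

    def feed(self, delta: str) -> str:
        buf = self._held + delta
        self._held = ""
        out = []
        while buf:
            tag = self.CLOSE if self._in_block else self.OPEN
            idx = buf.find(tag)
            if idx != -1:
                if not self._in_block:
                    out.append(buf[:idx])       # prose before the open tag survives
                self._in_block = not self._in_block
                buf = buf[idx + len(tag):]      # in-block content before a close is dropped
                continue
            # No full tag in the buffer: if it ends with a tag prefix that might
            # complete in the next delta, hold that prefix back until resolved.
            hold_at = len(buf)
            for k in range(1, len(tag)):
                if buf.endswith(tag[:k]):
                    hold_at = len(buf) - k      # longest matching prefix wins
            self._held = buf[hold_at:]
            if not self._in_block:
                out.append(buf[:hold_at])       # content inside a block is discarded
            buf = ""
        return "".join(out)

    def flush(self) -> str:
        # End-of-stream: a benign held tail (a '<' that never became a tag)
        # reaches the UI; content held inside an unterminated block is dropped.
        tail = "" if self._in_block else self._held
        self.reset()
        return tail
```

With this, the split-tag case from the message above stays suppressed: feeding `'<think>'` then `'Let me check'` yields no output, and content after `'</think>'` passes through with its first word intact.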
This commit is contained in:
parent 28f4d6db63
commit 2a285d5ec2

5 changed files with 665 additions and 709 deletions

run_agent.py (53 changes)
```diff
@@ -128,6 +128,7 @@ from tools.browser_tool import cleanup_browser
 # Agent internals extracted to agent/ package for modularity
 from agent.memory_manager import StreamingContextScrubber, build_memory_context_block, sanitize_context
+from agent.think_scrubber import StreamingThinkScrubber
 from agent.retry_utils import jittered_backoff
 from agent.error_classifier import classify_api_error, FailoverReason
 from agent.prompt_builder import (
```
```diff
@@ -1297,6 +1298,13 @@ class AIAgent:
         # deltas (#5719). sanitize_context() alone can't survive chunk
         # boundaries because the block regex needs both tags in one string.
         self._stream_context_scrubber = StreamingContextScrubber()
+        # Stateful scrubber for reasoning/thinking tags in streamed deltas
+        # (#17924). Replaces the per-delta _strip_think_blocks regex that
+        # destroyed downstream state (e.g. MiniMax-M2.7 streaming
+        # '<think>' as delta1 and 'Let me check' as delta2 — the regex
+        # erased delta1, so downstream state machines never learned a
+        # block was open and leaked delta2 as content).
+        self._stream_think_scrubber = StreamingThinkScrubber()
         # Visible assistant text already delivered through live token callbacks
         # during the current model response. Used to avoid re-sending the same
         # commentary when the provider later returns it as a completed interim
```
```diff
@@ -6543,6 +6551,29 @@ class AIAgent:
 
     def _reset_stream_delivery_tracking(self) -> None:
         """Reset tracking for text delivered during the current model response."""
+        # Flush any benign partial-tag tail held by the think scrubber
+        # first (#17924): an innocent '<' at the end of the stream that
+        # turned out not to be a tag prefix should reach the UI. Then
+        # flush the context scrubber. Order matters — the think
+        # scrubber's output feeds into the context scrubber's state.
+        think_scrubber = getattr(self, "_stream_think_scrubber", None)
+        if think_scrubber is not None:
+            think_tail = think_scrubber.flush()
+            if think_tail:
+                # Route the tail through the context scrubber too so a
+                # memory-context span straddling the final boundary is
+                # still caught.
+                ctx_scrubber = getattr(self, "_stream_context_scrubber", None)
+                if ctx_scrubber is not None:
+                    think_tail = ctx_scrubber.feed(think_tail)
+                if think_tail:
+                    callbacks = [cb for cb in (self.stream_delta_callback, self._stream_callback) if cb is not None]
+                    for cb in callbacks:
+                        try:
+                            cb(think_tail)
+                        except Exception:
+                            pass
+                    self._record_streamed_assistant_text(think_tail)
         # Flush any benign partial-tag tail held by the context scrubber so it
         # reaches the UI before we clear state for the next model call. If
         # the scrubber is mid-span, flush() drops the orphaned content.
```
```diff
@@ -6611,11 +6642,22 @@ class AIAgent:
         else:
             prepended_break = False
         if isinstance(text, str):
-            # Strip <think> blocks first (per-delta is safe for closed pairs; the
-            # unterminated-tag path is handled downstream by stream_consumer).
+            # Suppress reasoning/thinking blocks via the stateful
+            # scrubber (#17924). Earlier versions ran _strip_think_blocks
+            # per-delta here, which destroyed downstream state machines
+            # when a tag was split across deltas (e.g. MiniMax-M2.7
+            # sends '<think>' and its content as separate deltas —
+            # regex case 2 erased the first delta, so the CLI/gateway
+            # state machine never saw the open tag and leaked the
+            # reasoning content as regular response text).
+            think_scrubber = getattr(self, "_stream_think_scrubber", None)
+            if think_scrubber is not None:
+                text = think_scrubber.feed(text or "")
+            else:
+                # Defensive: legacy callers without the scrubber attribute.
+                text = self._strip_think_blocks(text or "")
             # Then feed through the stateful context scrubber so memory-context
             # spans split across chunks cannot leak to the UI (#5719).
-            text = self._strip_think_blocks(text or "")
             scrubber = getattr(self, "_stream_context_scrubber", None)
             if scrubber is not None:
                 text = scrubber.feed(text)
```
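For contrast, the failure mode this hunk removes is easy to reproduce in isolation. Below is a toy stand-in for the old per-delta regex (the real _strip_think_blocks pattern is an assumption here; only its two cases are taken from the commit message):

```python
import re

# Toy stand-in for the removed per-delta helper: case 1 strips a closed
# <think>...</think> pair, case 2 strips an unterminated open tag
# through to end-of-string.
_THINK = re.compile(r"<think>.*?</think>|<think>.*$", re.DOTALL)

def strip_per_delta(delta: str) -> str:
    return _THINK.sub("", delta)

# A tag split across deltas: case 2 erases the first delta outright, so a
# downstream state machine never sees the open tag, and the reasoning
# text in the second delta leaks through as ordinary content.
deltas = ["<think>", "Let me check"]
cleaned = [strip_per_delta(d) for d in deltas]
print(cleaned)  # ['', 'Let me check'] -- the leak
```

Running each delta through a shared stateful scrubber instead of a stateless per-delta regex is exactly what the hunk above wires in.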
```diff
@@ -10576,6 +10618,11 @@ class AIAgent:
         scrubber = getattr(self, "_stream_context_scrubber", None)
         if scrubber is not None:
             scrubber.reset()
+        # Reset the think scrubber for the same reason — an interrupted
+        # prior stream may have left us inside an unterminated block.
+        think_scrubber = getattr(self, "_stream_think_scrubber", None)
+        if think_scrubber is not None:
+            think_scrubber.reset()
 
         # Preserve the original user message (no nudge injection).
         original_user_message = persist_user_message if persist_user_message is not None else user_message
```