fix(tui): address PR #13231 review comments

Six small fixes, all valid review feedback:

- gatewayClient: onTimeout is now a class-field arrow so setTimeout gets a
  stable reference — no per-request bind allocation (the whole point of
  the original refactor).
- memory: growth rate was lifetime average of rss/uptime, which reports
  phantom growth for stable processes. Now computed as delta since a
  module-load baseline (STARTED_AT). Sanity-checked: 0.00 MB/hr at
  steady-state, non-zero after an allocation.
- hermes_cli: NODE_OPTIONS merge is now token-aware — respects a
  user-supplied --max-old-space-size (don't downgrade a deliberate 16GB
  setting) and avoids duplicating --expose-gc.
- useVirtualHistory: if items shrink past the frozen range's start
  mid-freeze (/clear, compaction), drop the freeze and fall through to
  the normal range calc instead of collapsing to an empty mount.
- circularBuffer: throw on non-positive capacity instead of silently
  producing NaN indices.
- debug slash help: /heapdump mentions HERMES_HEAPDUMP_DIR override
  instead of hardcoding the default path.

Validation: tsc clean, eslint clean, vitest 102/102, growth-rate smoke
test confirms baseline=0 → post-alloc>0.
Brooklyn Nicholson 2026-04-20 19:09:09 -05:00
parent 82b927777c
commit e1ce7c6b1f
6 changed files with 30 additions and 16 deletions


@@ -1005,15 +1005,15 @@ def _launch_tui(resume_session_id: Optional[str] = None, tui_dev: bool = False):
     env.setdefault("HERMES_CWD", os.getcwd())
     # Guarantee an 8GB V8 heap + exposed GC for the TUI. Default node cap is
     # ~1.54GB depending on version and can fatal-OOM on long sessions with
-    # large transcripts / reasoning blobs. Append (don't clobber) any user
-    # NODE_OPTIONS.
-    _existing_node_opts = env.get("NODE_OPTIONS", "").strip()
-    _hermes_tui_node_opts = "--max-old-space-size=8192 --expose-gc"
-    env["NODE_OPTIONS"] = (
-        f"{_existing_node_opts} {_hermes_tui_node_opts}".strip()
-        if _hermes_tui_node_opts not in _existing_node_opts
-        else _existing_node_opts
-    )
+    # large transcripts / reasoning blobs. Token-level merge: respect any
+    # user-supplied --max-old-space-size (they may have set it higher) and
+    # avoid duplicating --expose-gc.
+    _tokens = env.get("NODE_OPTIONS", "").split()
+    if not any(t.startswith("--max-old-space-size=") for t in _tokens):
+        _tokens.append("--max-old-space-size=8192")
+    if "--expose-gc" not in _tokens:
+        _tokens.append("--expose-gc")
+    env["NODE_OPTIONS"] = " ".join(_tokens)
     if resume_session_id:
         env["HERMES_TUI_RESUME"] = resume_session_id
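Pulled out of the diff into a standalone helper to show the intended merge behavior (the name `merge_node_options` is illustrative, not from the codebase):

```python
def merge_node_options(existing: str) -> str:
    # Token-level merge mirroring the diff: keep any user-supplied
    # --max-old-space-size and avoid duplicating --expose-gc.
    tokens = existing.split()
    if not any(t.startswith("--max-old-space-size=") for t in tokens):
        tokens.append("--max-old-space-size=8192")
    if "--expose-gc" not in tokens:
        tokens.append("--expose-gc")
    return " ".join(tokens)
```

A user who exported `NODE_OPTIONS="--max-old-space-size=16384"` keeps the 16GB cap and only gains `--expose-gc`; the old substring check would have appended a second, lower `--max-old-space-size` flag.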