Port from openai/codex#17667: MCP servers can now opt in to parallel
tool execution by setting supports_parallel_tool_calls: true in their
config. This allows tools from the same server to run concurrently
within a single tool-call batch, matching the behavior already available
for built-in tools like web_search and read_file.
Previously, all MCP tools ran sequentially because they were not in the
_PARALLEL_SAFE_TOOLS set. Now _should_parallelize_tool_batch checks
is_mcp_tool_parallel_safe(), which looks up the server's config flag.
Config example:
mcp_servers:
  docs:
    command: "docs-server"
    supports_parallel_tool_calls: true
Changes:
- tools/mcp_tool.py: Track parallel-safe servers in _parallel_safe_servers
set, populated during register_mcp_servers(). Add is_mcp_tool_parallel_safe()
public API.
- run_agent.py: Add _is_mcp_tool_parallel_safe() lazy-import wrapper. Update
_should_parallelize_tool_batch() to check MCP tools against server config.
- 11 new tests covering the feature end-to-end.
- Updated MCP docs: expanded the feature docs with filtering and capability-aware registration details, added a practical 'Use MCP with Hermes' tutorial, added a config reference page, and wired the new docs into the sidebar and landing page.