hermes-agent/website/docs
Magicray1217 bc15db6231 docs(docker): add section on connecting to local inference servers (vLLM, Ollama)
Adds a comprehensive guide for connecting Dockerized Hermes to local
inference servers like vLLM and Ollama, covering:
- Docker Compose networking (recommended)
- Standalone Docker run with host.docker.internal / --network host
- Connectivity verification steps
- Ollama-specific example
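As a rough illustration of the Compose-networking option above, a minimal `docker-compose.yml` might look like the following. This is a sketch under stated assumptions: the service names, the `hermes-agent:latest` image tag, and the `OPENAI_API_BASE` variable are illustrative placeholders, not values taken from the Hermes docs.

```yaml
# Sketch only: names and env vars below are assumptions for illustration.
services:
  hermes:
    image: hermes-agent:latest            # hypothetical image tag
    environment:
      # On the shared Compose network, reach the sibling service by its
      # service name ("vllm"), not localhost.
      OPENAI_API_BASE: http://vllm:8000/v1
    depends_on:
      - vllm

  vllm:
    image: vllm/vllm-openai:latest
    command: --model meta-llama/Llama-3.1-8B-Instruct   # example model
    ports:
      - "8000:8000"
```

Connectivity can then be verified from inside the Hermes container, e.g. `docker compose exec hermes curl http://vllm:8000/v1/models`, since vLLM exposes an OpenAI-compatible API.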

Closes #12308
2026-04-19 09:17:44 +08:00
developer-guide fix(ci): resolve 4 pre-existing main failures (docs lint + 3 stale tests) (#11373) 2026-04-16 20:43:41 -07:00
getting-started docs(execute_code): document project/strict execution modes (#12073) 2026-04-18 01:53:09 -07:00
guides docs: correctness audit — fix wrong values, add missing coverage (#11972) 2026-04-18 01:45:48 -07:00
integrations docs: correctness audit — fix wrong values, add missing coverage (#11972) 2026-04-18 01:45:48 -07:00
reference chore(skills): touchdesigner-mcp follow-ups 2026-04-18 17:43:42 -07:00
user-guide docs(docker): add section on connecting to local inference servers (vLLM, Ollama) 2026-04-19 09:17:44 +08:00
index.md fix(docs): show sidebar on docs homepage 2026-04-16 04:24:45 -07:00