Merge pull request #1376 from NousResearch/hermes/hermes-781f9235-docs

docs: clarify saved custom endpoint routing
Teknium 2026-03-14 21:15:24 -07:00 committed by GitHub
commit 6b1adb7eb1
3 changed files with 26 additions and 6 deletions


@@ -50,6 +50,8 @@ hermes config set OPENAI_API_KEY ollama # Any non-empty va
hermes config set HERMES_MODEL llama3.1
```
You can also save the endpoint interactively with `hermes model`. Hermes persists the custom endpoint in `config.yaml`, and any auxiliary task whose provider is set to `main` routes through that same saved endpoint.
This works with Ollama, vLLM, llama.cpp server, SGLang, LocalAI, and others. See the [Configuration guide](../user-guide/configuration.md) for details.
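For illustration, a saved custom endpoint entry in `config.yaml` might look like the sketch below. The key names and layout are assumptions for illustration only, not the actual Hermes schema; see the Configuration guide for the authoritative format.

```yaml
# Hypothetical sketch of a saved custom endpoint in config.yaml.
# Key names are illustrative assumptions; consult the Configuration guide.
model: llama3.1
openai_api_key: ollama                 # any non-empty value works for local servers
base_url: http://localhost:11434/v1    # e.g. Ollama's OpenAI-compatible endpoint
```

Because auxiliary tasks with provider `main` follow the saved endpoint, a single entry like this covers both the primary model and those tasks.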
### How much does it cost?