Mirror of https://github.com/NousResearch/hermes-agent.git, synced 2026-04-25 00:51:20 +00:00
docs: clarify saved custom endpoint routing
This commit is contained in:
parent 53d1043a50
commit 282df107a5
3 changed files with 26 additions and 6 deletions
@@ -50,6 +50,8 @@ hermes config set OPENAI_API_KEY ollama # Any non-empty va
hermes config set HERMES_MODEL llama3.1
```
You can also save the endpoint interactively with `hermes model`. Hermes persists that custom endpoint in `config.yaml`, and auxiliary tasks configured with provider `main` follow the same saved endpoint.
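As a rough sketch, the saved custom endpoint in `config.yaml` might take a shape like the one below. The field names (`custom_endpoint`, `base_url`, `api_key`) and the Ollama URL are illustrative assumptions, not the documented Hermes schema:

```yaml
# Hypothetical config.yaml fragment: field names are assumptions for illustration.
model: llama3.1
custom_endpoint:
  base_url: http://localhost:11434/v1   # e.g. a local Ollama server
  api_key: ollama                       # any non-empty value works for local servers
```

Because auxiliary tasks with provider `main` route through the same saved endpoint, a single entry like this would cover both the primary model and those auxiliary calls.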
This works with Ollama, vLLM, llama.cpp server, SGLang, LocalAI, and others. See the [Configuration guide](../user-guide/configuration.md) for details.
### How much does it cost?