Mirror of https://github.com/NousResearch/hermes-agent.git (synced 2026-04-26 01:01:40 +00:00).
Generates a full dedicated Docusaurus page for every one of the 132 skills
(73 bundled + 59 optional) under website/docs/user-guide/skills/{bundled,optional}/<category>/.
Each page carries the skill's description, metadata (version, author, license,
dependencies, platform gating, tags, related skills cross-linked to their own
pages), and the complete SKILL.md body that Hermes loads at runtime.
Previously the two catalog pages just listed skills with a one-line blurb and
no way to see what the skill actually did — users had to go read the source
repo. Now every skill has a browsable, searchable, cross-linked reference in
the docs.
- website/scripts/generate-skill-docs.py — generator that reads skills/ and
optional-skills/, writes per-skill pages, regenerates both catalog indexes,
and rewrites the Skills section of sidebars.ts. Handles MDX escaping
(outside fenced code blocks: curly braces, unsafe HTML-ish tags) and
rewrites relative references/*.md links to point at the GitHub source.
- website/docs/reference/skills-catalog.md — regenerated; each row links to
the new dedicated page.
- website/docs/reference/optional-skills-catalog.md — same.
- website/sidebars.ts — Skills section now has Bundled / Optional subtrees
with one nested category per skill folder.
- .github/workflows/{docs-site-checks,deploy-site}.yml — run the generator
before docusaurus build so CI stays in sync with the source SKILL.md files.
Build verified locally with `npx docusaurus build`. Only remaining warnings
are pre-existing broken link/anchor issues in unrelated pages.
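The MDX-escaping step described above can be sketched roughly as follows. This is a minimal illustration of the technique, not the generator's actual code; the function name and the choice of HTML entities are assumptions:

```python
import re

def escape_mdx(text: str) -> str:
    """Escape MDX-sensitive characters outside fenced code blocks.

    Curly braces become HTML entities so MDX does not treat them as JSX
    expressions; a '<' that starts an HTML-ish tag is escaped likewise.
    Lines inside ``` fences are passed through untouched.
    """
    out, in_fence = [], False
    for line in text.splitlines():
        if line.lstrip().startswith("```"):
            in_fence = not in_fence  # entering or leaving a fenced block
            out.append(line)
            continue
        if not in_fence:
            line = line.replace("{", "&#123;").replace("}", "&#125;")
            line = re.sub(r"<(?=[A-Za-z/])", "&lt;", line)
        out.append(line)
    return "\n".join(out)

print(escape_mdx("set {x}\n```\nkeep {raw}\n```"))
```

The real generator also has to handle tilde fences and indented code, so treat this as the shape of the approach rather than a drop-in replacement.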
| title | sidebar_label | description |
|---|---|---|
| Huggingface Hub | Huggingface Hub | Hugging Face Hub CLI (hf) — search, download, and upload models and datasets, manage repos, query datasets with SQL, deploy inference endpoints, manage Space... |
{/* This page is auto-generated from the skill's SKILL.md by website/scripts/generate-skill-docs.py. Edit the source SKILL.md, not this page. */}
# Huggingface Hub
Hugging Face Hub CLI (hf) — search, download, and upload models and datasets, manage repos, query datasets with SQL, deploy inference endpoints, manage Spaces and buckets.
## Skill metadata

| | |
| --- | --- |
| Source | Bundled (installed by default) |
| Path | `skills/mlops/huggingface-hub` |
| Version | 1.0.0 |
| Author | Hugging Face |
| License | MIT |
## Reference: full SKILL.md

:::info
The following is the complete skill definition that Hermes loads when this skill is triggered. This is what the agent sees as instructions when the skill is active.
:::
# Hugging Face CLI (hf) Reference Guide
The `hf` command is the modern command-line interface for interacting with the Hugging Face Hub, providing tools to manage repositories, models, datasets, and Spaces.

> **IMPORTANT:** The `hf` command replaces the now-deprecated `huggingface-cli` command.
## Quick Start

- **Installation:** `curl -LsSf https://hf.co/cli/install.sh | bash -s`
- **Help:** Use `hf --help` to view all available functions and real-world examples.
- **Authentication:** Recommended via the `HF_TOKEN` environment variable or the `--token` flag.
## Core Commands

### General Operations

- `hf download REPO_ID`: Download files from the Hub.
- `hf upload REPO_ID`: Upload files/folders (recommended for single-commit).
- `hf upload-large-folder REPO_ID LOCAL_PATH`: Recommended for resumable uploads of large directories.
- `hf sync`: Sync files between a local directory and a bucket.
- `hf env` / `hf version`: View environment and version details.
### Authentication (`hf auth`)

- `login` / `logout`: Manage sessions using tokens from huggingface.co/settings/tokens.
- `list` / `switch`: Manage and toggle between multiple stored access tokens.
- `whoami`: Identify the currently logged-in account.
### Repository Management (`hf repos`)

- `create` / `delete`: Create or permanently remove repositories.
- `duplicate`: Clone a model, dataset, or Space to a new ID.
- `move`: Transfer a repository between namespaces.
- `branch` / `tag`: Manage Git-like references.
- `delete-files`: Remove specific files using patterns.
## Specialized Hub Interactions

### Datasets & Models

- **Datasets:** `hf datasets list`, `info`, and `parquet` (list parquet URLs).
- **SQL queries:** `hf datasets sql SQL` — execute raw SQL via DuckDB against dataset parquet URLs.
- **Models:** `hf models list` and `info`.
- **Papers:** `hf papers list` — view daily papers.
### Discussions & Pull Requests (`hf discussions`)

- Manage the lifecycle of Hub contributions: `list`, `create`, `info`, `comment`, `close`, `reopen`, and `rename`.
- `diff`: View changes in a PR.
- `merge`: Finalize pull requests.
## Infrastructure & Compute

- **Endpoints:** Deploy and manage Inference Endpoints (`deploy`, `pause`, `resume`, `scale-to-zero`, `catalog`).
- **Jobs:** Run compute tasks on HF infrastructure. Includes `hf jobs uv` for running Python scripts with inline dependencies and `stats` for resource monitoring.
- **Spaces:** Manage interactive apps. Includes `dev-mode` and `hot-reload` for Python files without full restarts.
## Storage & Automation

- **Buckets:** Full S3-like bucket management (`create`, `cp`, `mv`, `rm`, `sync`).
- **Cache:** Manage local storage with `list`, `prune` (remove detached revisions), and `verify` (checksum checks).
- **Webhooks:** Automate workflows by managing Hub webhooks (`create`, `watch`, `enable`/`disable`).
- **Collections:** Organize Hub items into collections (`add-item`, `update`, `list`).
## Advanced Usage & Tips

### Global Flags

- `--format json`: Produces machine-readable output for automation.
- `-q` / `--quiet`: Limits output to IDs only.
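The `--format json` flag is what makes the CLI scriptable. A hedged sketch of consuming such output: the payload below is a hypothetical example of what a listing command might emit, and the real field names depend on the installed `hf` version:

```python
import json

# Hypothetical `hf models list --format json` payload; real field names
# depend on the installed hf version, so treat this as a shape sketch only.
raw = '[{"id": "org/model-a", "downloads": 1200}, {"id": "org/model-b", "downloads": 87}]'

models = json.loads(raw)
popular = [m["id"] for m in models if m["downloads"] > 100]
print(popular)  # only repos above the download threshold
```

In a real pipeline, `raw` would come from the command's stdout (e.g. via `subprocess.run(..., capture_output=True)`).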
### Extensions & Skills

- **Extensions:** Extend CLI functionality via GitHub repositories using `hf extensions install REPO_ID`.
- **Skills:** Manage AI assistant skills with `hf skills add`.