Hub

Use Hub to browse, publish, pull, and reuse capabilities, security tasks, datasets, and models before you run or evaluate anything.

Hub is the shared registry surface in the app and the package registry behind reusable capabilities, security tasks, datasets, models, and task environments.

Use it when the question is “what reusable thing do I want to inspect, install, publish, compare, or pin before I start work?”

| Surface | What it is for |
| --- | --- |
| Capabilities | reusable bundles of agents, tools, skills, and MCP servers |
| Security Tasks / Environments | reusable challenge definitions with environment and verification logic |
| Datasets | pinned shared inputs for evaluations, training, and reproducible experiments |
| Models | stored model artifacts with versions, aliases, metrics, and downloads |

Across all four families, the lifecycle is the same even though the verbs differ a little:

  1. author or edit the local source directory
  2. inspect or validate it before publishing
  3. push it into the organization registry
  4. change visibility when other orgs should see it
  5. pull, install, or download it when another workflow needs it
  6. run or reference the pinned version from compute, evaluation, training, or optimization flows

That shared lifecycle is the main reason Hub matters. It is the durable “what are we using?” layer between local authoring and execution.

Hub refs are organization-scoped. In the CLI, the most common forms are a bare name (sqli-lab), a versioned name ([email protected]), and a fully qualified ref (acme/[email protected]).

Three details matter in practice:

  • Capabilities, datasets, and models are versioned assets. If you omit the version in CLI commands like info, pull, or install, the command resolves the latest version first.
  • Tasks are different. The publishing pipeline stores a version from task.yaml, but the CLI and API address tasks by name and treat the latest published task as the active one.
  • Low-level SDK helpers use two URI styles:
    • dn.load() uses scheme://org/name@version
    • dn.pull_package() uses OCI-style scheme://org/name:version
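The first bullet's rule, omit the version and the latest one is used, can be sketched as a small helper. This is a hypothetical illustration (resolve_version is not an SDK function), assuming versions are plain semver strings:

```python
# Hypothetical sketch of "omit the version, get the latest" resolution.
# Not SDK code; versions are assumed to be plain semver strings.

def resolve_version(published, requested=None):
    """Return the requested version, or the highest semver if omitted."""
    if requested is not None:
        if requested not in published:
            raise ValueError(f"unknown version: {requested}")
        return requested
    # Sort numerically on (major, minor, patch) rather than lexically,
    # so "2.10.0" outranks "2.9.0".
    return max(published, key=lambda v: tuple(int(p) for p in v.split(".")))

versions = ["1.0.0", "2.1.0", "2.0.0"]
print(resolve_version(versions))           # picks the latest
print(resolve_version(versions, "2.0.0"))  # an explicit pin always wins
```

An explicit pin is what you want whenever reproducibility matters; latest resolution is a convenience for interactive exploration.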
For example, pull first, then load:

import dreadnode as dn

# Pull packages into local storage or the local package cache first
dn.pull_package(
    [
        "dataset://acme/support-evals:1.0.0",
        "model://acme/vuln-classifier:2.1.0",
        "capability://acme/recon-kit:1.2.0",
        "environment://acme/sqli-lab",
    ]
)

# Then open pulled dataset or model artifacts locally
dataset = dn.load("dataset://acme/[email protected]")
model = dn.load("model://acme/[email protected]")
Typical workflows before you run anything:

  • browse the catalog before installing a capability into a runtime
  • inspect a task or environment before using it in an evaluation
  • pin a dataset version for training, benchmarking, or optimization
  • compare model artifact versions before promotion or download

The app and docs use slightly different language around tasks:

  • task is the durable object used in the CLI and API
  • Security Task is the clearer docs label
  • some app copy uses environment because the task includes a runnable environment

In SDK internals and pull URIs, task packages are stored as environment packages. The CLI keeps the user-facing command name task.

Hub packages always start as local content on disk.

| Package type | How you create it locally | Key local files |
| --- | --- | --- |
| Capabilities | Scaffold with dreadnode capability init or hand-author the directory | capability.yaml, agents/, optional skills/, optional .mcp.json |
| Datasets | Create the directory manually, then preview it with dreadnode dataset inspect | dataset.yaml plus the data files it references |
| Models | Create the directory manually, then preview it with dreadnode model inspect | model.yaml plus model weights or adapter files |
| Tasks | Scaffold with dreadnode task init or hand-author the directory | task.yaml, docker-compose.yaml, challenge/Dockerfile, optional verify.sh, optional solution.sh |
# Scaffold a capability
dreadnode capability init recon-kit \
  --description "Recon helpers" \
  --with-skills \
  --with-mcp

# Scaffold a task environment
dreadnode task init sqli-lab --with-verify --with-solution

Datasets and models do not currently have matching init commands in the CLI. For those artifact types, the normal flow is: create the directory yourself, add dataset.yaml or model.yaml, then use inspect before you push.
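Because datasets and models start as hand-authored directories, a quick pre-push sanity check can catch a missing manifest early. A minimal sketch, using the required file names from the table above (missing_files is a hypothetical helper, not a CLI or SDK feature):

```python
# Hypothetical pre-push check; required files follow the table above.
import tempfile
from pathlib import Path

REQUIRED = {
    "capability": ["capability.yaml"],
    "dataset": ["dataset.yaml"],
    "model": ["model.yaml"],
    "task": ["task.yaml", "docker-compose.yaml"],
}

def missing_files(package_dir, package_type):
    root = Path(package_dir)
    return [f for f in REQUIRED[package_type] if not (root / f).is_file()]

# Demo against a throwaway directory containing only a dataset manifest
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "dataset.yaml").write_text("name: support-evals\n")
    print(missing_files(tmp, "dataset"))  # []
    print(missing_files(tmp, "task"))     # manifest and compose file missing
```

The real validation lives in dreadnode dataset inspect, dreadnode model inspect, and the push pipeline; a check like this just fails faster.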

# Browse and inspect published packages
dreadnode capability list --search recon --include-public
dreadnode capability info acme/[email protected]
dreadnode dataset list --include-public
dreadnode dataset info [email protected]
dreadnode model list
dreadnode model compare vuln-classifier 1.0.0 2.0.0
dreadnode task list --search sql
dreadnode task info sqli-lab
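The list flags above act like a substring search plus a visibility filter. A toy model of that behavior (illustrative only; the real server-side matching may differ):

```python
# Toy model of list --search / --include-public filtering; not the real CLI.
def filter_listing(entries, search=None, include_public=False):
    results = []
    for entry in entries:
        # By default, only your organization's private packages are shown;
        # --include-public widens the listing to public ones too.
        if not include_public and entry["public"]:
            continue
        if search is not None and search not in entry["name"]:
            continue
        results.append(entry)
    return results

catalog = [
    {"name": "recon-kit", "public": True},
    {"name": "recon-notes", "public": False},
    {"name": "sqli-lab", "public": False},
]
print(filter_listing(catalog, search="recon", include_public=True))
print(filter_listing(catalog, search="recon"))
```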

# Validate, inspect, and publish local packages
dreadnode capability validate ./capabilities/recon-kit
dreadnode capability push ./capabilities/recon-kit --public
dreadnode dataset inspect ./datasets/support-evals
dreadnode dataset push ./datasets/support-evals
dreadnode model inspect ./models/vuln-classifier
dreadnode model push ./models/vuln-classifier
dreadnode task validate ./tasks/sqli-lab
dreadnode task validate sqli-lab --pull
dreadnode task push ./tasks/sqli-lab --public
The same publish and pull flow is available from the SDK:

import dreadnode as dn

dn.configure(
    server="https://api.example.com",
    api_key="dn_key_...",
    organization="acme",
)

# Publish local content to the Hub
dn.push_capability("./capabilities/recon-kit", public=True)
dn.push_dataset("./datasets/support-evals")
dn.push_model("./models/vuln-classifier")
dn.push_environment("./tasks/sqli-lab", public=True)

# Pull packages for local reuse
dn.pull_package(
    [
        "dataset://acme/support-evals:1.0.0",
        "model://acme/vuln-classifier:2.1.0",
        "capability://acme/recon-kit:1.2.0",
        "environment://acme/sqli-lab",
    ]
)

# Load versioned Hub artifacts after they are locally available
dataset = dn.load("dataset://acme/[email protected]")
model = dn.load("model://acme/[email protected]")

A few SDK-specific details are easy to miss:

  • dn.load_capability() loads a local capability by path or installed name. There is no matching dn.load("capability://...") convenience loader in the current SDK.
  • dn.list_registry("datasets"), dn.list_registry("models"), dn.list_registry("capabilities"), and dn.list_registry("environments") expose the mixed local and remote registry view when you need programmatic discovery.
  • dn.load() and dn.load_package() open packages that are already locally available. Pull first when you are starting from a remote Hub ref.
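The pull-before-load rule in the last bullet amounts to a local cache check. A toy sketch of the contract (ToyHub is illustrative only, and it ignores the @/: separator difference between load and pull refs):

```python
# Toy illustration of "pull first, then load": load() only sees refs
# that a prior pull placed in the local cache. Not SDK internals.
class ToyHub:
    def __init__(self):
        self._cache = {}

    def pull_package(self, refs):
        for ref in refs:
            self._cache[ref] = {"ref": ref, "pulled": True}

    def load(self, ref):
        if ref not in self._cache:
            raise LookupError(f"{ref} is not locally available; pull it first")
        return self._cache[ref]

hub = ToyHub()
hub.pull_package(["dataset://acme/support-evals:1.0.0"])
print(hub.load("dataset://acme/support-evals:1.0.0")["pulled"])  # True
```

Loading a ref that was never pulled fails, which is the behavior to expect when a script starts from a remote Hub ref and skips the pull step.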

These operations sound similar but mean different things in Dreadnode Hub.

| Package type | Install | Pull / download | Load / use |
| --- | --- | --- | --- |
| Capabilities | dreadnode capability install activates the capability for local agents and the TUI | dreadnode capability pull downloads to the local package cache without activating it | dn.load_capability() loads a local capability by path or installed name |
| Datasets | No dedicated install step | dreadnode dataset pull prints a pre-signed URL or saves a file with --output; dn.pull_package(["dataset://..."]) caches the package manifest locally | dn.load("dataset://org/name@version"), dn.load_package(...), or Dataset("org/name") |
| Models | No dedicated install step | dreadnode model pull prints a pre-signed URL or saves a file with --output; dn.pull_package(["model://..."]) caches the package manifest locally | dn.load("model://org/name@version") or dn.load_package(...) after the package is available |
| Tasks | No dedicated install step | dreadnode task pull extracts the environment package to the local package cache for inspection or forking | run it as a pulled environment package or publish it back after changes |

For datasets and models, dn.pull_package() stores the versioned manifest locally so the SDK can resolve it later. The actual large artifacts are fetched from CAS or object storage on demand.
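That manifest-now, artifacts-on-demand split can be pictured with a toy package object (illustrative only; ToyPackage is not SDK code, and the dict stands in for content-addressed storage):

```python
# Toy sketch: pull caches a small manifest; large artifacts are
# fetched from content-addressed storage only when first accessed.
class ToyPackage:
    def __init__(self, manifest, cas):
        self.manifest = manifest  # cached locally at pull time
        self._cas = cas           # stand-in for remote CAS / object storage
        self.fetched = []         # which blobs we actually downloaded

    def open_artifact(self, name):
        digest = self.manifest["artifacts"][name]
        self.fetched.append(name)  # the network fetch happens here, lazily
        return self._cas[digest]

cas = {"sha256:ab12": b"...weights bytes..."}
pkg = ToyPackage({"artifacts": {"weights.bin": "sha256:ab12"}}, cas)
print(pkg.fetched)  # nothing downloaded yet, only the manifest is local
data = pkg.open_artifact("weights.bin")
print(pkg.fetched)  # the weights blob was fetched on first access
```

This is why dn.pull_package() is cheap even for large models: only the manifest moves at pull time.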

When you need lifecycle operations beyond the basic list/info/pull flow, use the more specific commands below:

dreadnode capability sync ./capabilities/recon-kit
dreadnode capability delete acme/[email protected]
dreadnode dataset inspect ./datasets/support-evals
dreadnode dataset pull [email protected] --split train
dreadnode dataset delete [email protected]
dreadnode model compare vuln-classifier 1.0.0 2.0.0
dreadnode model alias set vuln-classifier stable 2.0.0
dreadnode model metrics [email protected]
dreadnode model delete [email protected]
dreadnode task validate ./tasks/sqli-lab
dreadnode task sync ./tasks/sqli-lab
dreadnode task delete sqli-lab
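dreadnode model alias set maps a stable name like stable onto a concrete version, so promotion becomes a pointer update rather than a re-push. Conceptually (a toy sketch of the indirection, not the real alias store):

```python
# Toy model of "model alias set vuln-classifier stable 2.0.0":
# an alias is an indirection from a friendly name to a pinned version.
aliases = {}

def alias_set(model, alias, version):
    aliases[(model, alias)] = version

def resolve(model, ref):
    # A ref is either an alias for a version or already a concrete version.
    return aliases.get((model, ref), ref)

alias_set("vuln-classifier", "stable", "2.0.0")
print(resolve("vuln-classifier", "stable"))  # 2.0.0
print(resolve("vuln-classifier", "2.1.0"))   # already concrete, unchanged
```

Promoting a new release then means repointing the alias, while every consumer that pinned a concrete version is unaffected.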

The CLI group names are the stable mental model:

  • dreadnode capability sync and dreadnode capability delete for capability upkeep
  • dreadnode dataset inspect, dreadnode dataset pull --split, and dreadnode dataset delete for dataset artifact management
  • dreadnode model compare, dreadnode model alias, dreadnode model metrics, and dreadnode model delete for model promotion and cleanup
  • dreadnode task validate, dreadnode task sync, and dreadnode task delete for task packages

For scripted registry access, see the API Client.

The current TUI codebase does not expose a single all-in-one Hub screen. Hub workflows are split across separate screens instead:

  • /capabilities manages capability installs and updates for the active runtime
  • /environments browses task environments
  • /models opens the inference model picker, not the Hub model registry
Whatever the surface, a few habits carry across:

  • prefer pinned refs like org/name@version whenever reproducibility matters
  • registry state is separate from runtime installation or execution state
  • Hub is the right place to decide what artifact to use before moving into evaluations, optimization, training, or compute surfaces

For the artifact-specific surfaces, use Capabilities, Security Tasks & Environments, Datasets, and Models. For scripted registry access, use Packages and Registry, the SDK API Client, or SDK Data depending on whether you are publishing, inspecting, or loading artifacts.