
Models

Versioned model artifacts — trained weights, LoRA adapters, and fine-tunes authored as a directory, published to the registry, and pinned by reference.

A Dreadnode model is a directory with a model.yaml manifest that the platform packages, versions, and serves back by reference. Publish the full weights from a training run, a LoRA adapter for the same base model, or a vendored third-party checkpoint — then pin that version from inference code, downstream training, or an evaluation.

```
support-assistant/
├── model.yaml
├── model.safetensors
├── tokenizer.json
├── tokenizer_config.json
└── special_tokens_map.json
```
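The manifest describes what the directory contains. A minimal sketch for the full-weights layout above — `framework`, `base_model`, `license`, and `size_category` are settings this page names, but the exact schema and any other field here are assumptions, so verify against `dn model inspect` output:

```yaml
# Hypothetical model.yaml for the full-weights shape.
# Only framework (and, for other shapes, base_model / license /
# size_category) is confirmed by this page; treat the rest as
# placeholders, not a documented schema.
name: support-assistant
framework: safetensors
architecture: llama          # assumption: free-form architecture label
task: text-generation        # assumption: free-form task label
files:
  - model.safetensors
  - tokenizer.json
  - tokenizer_config.json
  - special_tokens_map.json
```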
```shell
dn model push ./support-assistant # → acme/[email protected]
```
  1. Train or adapt a model elsewhere — hosted training jobs, a local fine-tune, a vendor checkpoint you want to curate.
  2. Author the directory: a model.yaml, the weights, a tokenizer if the model uses one.
  3. Inspect before publishing — dn model inspect ./path reads model.yaml and previews the artifact list.
  4. Push to the registry with dn model push or dn.push_model(...).
  5. Compare and annotate — attach metrics, tag versions with aliases like champion or staging, pick the release to promote.
  6. Consume from inference code, downstream training, or evaluation harnesses by pinning org/name@version.
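Steps 2–3 can be approximated with a local sanity check before publishing. This helper is not part of the Dreadnode SDK — it only verifies that a directory has a `model.yaml` and at least one artifact file, roughly mirroring what `dn model inspect` previews:

```python
from pathlib import Path

def check_model_dir(path: str) -> list[str]:
    """Verify a model directory looks publishable: a model.yaml manifest
    plus at least one artifact file. Returns the artifact names so you
    can eyeball the list before running `dn model push`."""
    root = Path(path)
    if not (root / "model.yaml").is_file():
        raise FileNotFoundError(f"{path} has no model.yaml manifest")
    artifacts = sorted(
        p.name for p in root.iterdir()
        if p.is_file() and p.name != "model.yaml"
    )
    if not artifacts:
        raise ValueError(f"{path} contains no artifact files")
    return artifacts
```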

The registry is agnostic about what you publish — it tracks the bytes, the manifest, and the metadata. Common shapes:

| Shape | Typical manifest settings |
| --- | --- |
| Full weights | `framework: safetensors`, architecture, task, tokenizer files |
| LoRA adapter | `framework: safetensors`, `base_model: <ref>`, adapter files only |
| ONNX export | `framework: onnx`, one or more `.onnx` files |
| Quantized checkpoint | framework matching the checkpoint format, `size_category` set |
| Curated third-party checkpoint | `base_model: <upstream-ref>`, `license` set |
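For the adapter shape, the manifest points at a base model in the registry instead of shipping full weights. A hedged sketch — `base_model` taking a registry reference is described above, but the surrounding field names and file names are assumptions:

```yaml
# Hypothetical model.yaml for a LoRA adapter (adapter files only).
name: support-assistant-lora
framework: safetensors
base_model: acme/[email protected]   # pin the exact base version
files:
  - adapter_model.safetensors        # assumption: typical adapter file names
  - adapter_config.json
```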

Every version carries a framework, a file list, optional metrics, and optional aliases. Aliases (champion, staging, latest-stable) float across versions so humans can promote without rewriting downstream configs; automation should still pin org/name@version for reproducibility.

Full CLI: dn model. The Hub surfaces the same registry with filters, version comparison, and metrics charts. Hosted training writes weights into workspace storage — see Training → Overview for emitting a checkpoint and then publishing it here.