
Quickstart

Improve a short prompt against a tiny dataset with `optimize_anything` — no platform account required.

Optimize a short prompt against a handful of examples in about thirty lines of code. Everything runs locally and in-process, so you don't need a workspace or a published capability to try it.

```python
import asyncio

import dreadnode as dn
from dreadnode.optimization import EngineConfig, OptimizationConfig


def score(candidate: str, example: dict[str, str]) -> float:
    return 1.0 if example["expected"].lower() in candidate.lower() else 0.0


async def main() -> None:
    optimization = dn.optimize_anything(
        seed_candidate="Answer the question directly.",
        evaluator=score,
        dataset=[
            {"question": "What is GEPA?", "expected": "GEPA"},
            {"question": "Who makes Dreadnode?", "expected": "Dreadnode"},
            {"question": "What is a capability?", "expected": "capability"},
        ],
        objective="Shorten and sharpen the answer prompt.",
        config=OptimizationConfig(engine=EngineConfig(max_metric_calls=30)),
    )
    result = await optimization.run()
    print(f"best score: {result.best_score:.2f}")
    print(f"best candidate: {result.best_candidate!r}")


asyncio.run(main())
# best score: 1.00
# best candidate: 'Answer the question using the same key term from the prompt.'
```

The optimizer reflects on each failed trial, proposes a new prompt, and scores it against the dataset. `max_metric_calls` caps the total number of scorer calls; the search stops once that budget is exhausted.
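The budget accounting can be sketched as a plain loop. This is an illustrative simplification only, not the real engine: the `search` function, the fixed candidate list, and the "stop before a partial trial" policy are all invented for the example, while `score` and the dataset rows are taken from the quickstart above.

```python
# Simplified sketch of a budgeted search: every candidate is scored against
# every dataset row, and the loop stops once another full trial would
# exceed max_metric_calls. Not the real optimizer, which *proposes* new
# candidates via reflection instead of iterating a fixed list.

def score(candidate: str, example: dict[str, str]) -> float:
    return 1.0 if example["expected"].lower() in candidate.lower() else 0.0

dataset = [
    {"question": "What is GEPA?", "expected": "GEPA"},
    {"question": "Who makes Dreadnode?", "expected": "Dreadnode"},
]

def search(candidates: list[str], max_metric_calls: int) -> tuple[str, float]:
    calls = 0
    best, best_score = candidates[0], float("-inf")
    for candidate in candidates:
        if calls + len(dataset) > max_metric_calls:
            break  # budget exhausted: don't start a partial trial
        total = 0.0
        for row in dataset:
            total += score(candidate, row)  # one metric call per row
            calls += 1
        avg = total / len(dataset)
        if avg > best_score:
            best, best_score = candidate, avg
    return best, best_score

best, best_score = search(
    ["Answer tersely.", "Mention GEPA and Dreadnode explicitly."],
    max_metric_calls=30,
)
```

With a budget of 30, both candidates fit (4 scorer calls total); shrink `max_metric_calls` to 2 and only the first candidate is ever evaluated.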

  • Seed candidate — the starting prompt. The optimizer proposes variations of it.
  • Evaluator — a function that scores each candidate on each dataset row (higher is better). It receives `(candidate, row)` and returns a float.
  • Dataset — a list of dicts passed to the evaluator as the second positional argument.
  • Config — engine settings that bound the search. `max_metric_calls` is the most important one; it defaults to 100 when omitted.
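Because the evaluator contract is just `(candidate, row) -> float`, richer scorers drop in without any other changes. As a sketch, a partial-credit variant (the name `overlap_score` is invented here) rewards word overlap instead of requiring an exact substring match:

```python
def overlap_score(candidate: str, example: dict[str, str]) -> float:
    """Fraction of the expected answer's words that appear in the candidate.

    Returns a value in [0.0, 1.0]; higher is better, matching the
    evaluator contract described above.
    """
    expected_words = set(example["expected"].lower().split())
    candidate_words = set(candidate.lower().split())
    if not expected_words:
        return 0.0
    return len(expected_words & candidate_words) / len(expected_words)
```

A candidate that hits half the expected words scores 0.5 instead of 0.0, which gives the optimizer a smoother gradient to climb than the all-or-nothing scorer in the quickstart.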
  • Move to capability improvement when you want the optimizer to edit a local capability’s files, not a standalone prompt.
  • Move to hosted jobs once the capability and dataset are published and you want platform-managed runs.
  • Read local search for the deeper `optimize_anything` and `DreadnodeAgentAdapter` patterns.