# Tools

Python tools for capabilities — `@tool`, async tools, error handling, and `Toolset` for shared state.
Tools are Python functions an agent can call. Dreadnode uses type annotations and Pydantic to generate the schema the model sees, so well-typed function signatures become well-shaped tool calls.
```python
import typing as t

from dreadnode import tool

@tool
def lookup_indicator(
    indicator: t.Annotated[str, "IP, domain, or hash to investigate"],
) -> dict[str, str]:
    """Look up an indicator in an intel source."""
    return {"indicator": indicator, "verdict": "unknown"}
```

The docstring becomes the tool description. `typing.Annotated` metadata becomes the parameter description. The return type drives serialization.
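To see why `Annotated` metadata works for this, here is a small framework-free sketch of how parameter descriptions can be read back out of a signature. `describe_params` is a hypothetical helper for illustration, not part of Dreadnode:

```python
import typing as t

def describe_params(fn) -> dict[str, str]:
    """Read per-parameter descriptions from Annotated metadata (sketch)."""
    hints = t.get_type_hints(fn, include_extras=True)
    return {
        name: (hint.__metadata__[0] if hasattr(hint, "__metadata__") else "")
        for name, hint in hints.items()
        if name != "return"
    }

def lookup_indicator(
    indicator: t.Annotated[str, "IP, domain, or hash to investigate"],
) -> dict[str, str]:
    """Look up an indicator in an intel source."""
    return {"indicator": indicator, "verdict": "unknown"}
```

Calling `describe_params(lookup_indicator)` yields `{"indicator": "IP, domain, or hash to investigate"}` — the same metadata the schema generator consumes.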
## Where tools live

Capability tools come from Python files declared in the manifest:
```yaml
tools:
  - tools/intel.py
```

If `tools:` is omitted, the runtime auto-discovers any `*.py` in the `tools/` directory. Set `tools: []` to disable entirely.
The loader collects from each file:
- module-level `@tool`-decorated functions
- module-level `Tool` instances
- module-level `Toolset` instances
- `Toolset` subclasses that construct with no arguments
## Async tools

Define a tool as `async def` and the runtime awaits the call automatically. No additional decorator argument is needed.
```python
import typing as t

import httpx

from dreadnode import tool

@tool
async def fetch_indicator(
    indicator: t.Annotated[str, "Indicator to look up"],
) -> dict[str, str]:
    """Fetch indicator metadata from the intel API."""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://intel.example.com/{indicator}")
        response.raise_for_status()
        return response.json()
```

Use async whenever the tool talks to the network, a database, or anything else that would block the event loop. Sync tools are fine for pure-CPU work.
## Error handling

By default, `@tool` catches `json.JSONDecodeError` and `pydantic.ValidationError` and surfaces them to the model as a structured error so it can recover. Other exceptions propagate and abort the turn.
Override the policy with `catch`:
```python
@tool(catch=True)
def risky_lookup(name: str) -> dict[str, str]:
    """Catch every exception and feed it to the model."""
    ...


@tool(catch=[ConnectionError, TimeoutError])
def network_lookup(host: str) -> dict[str, str]:
    """Catch only the listed exceptions; everything else aborts the turn."""
    ...


@tool(catch=False)
def must_succeed(name: str) -> dict[str, str]:
    """Propagate everything — turn fails if this raises."""
    ...
```

When the runtime catches an exception, the tool result becomes an `ErrorModel` carrying the exception type and message. The agent sees enough to retry or change approach.
## Truncating output

Long tool outputs eat context. `truncate` caps the serialized return value:

```python
@tool(truncate=4000)
def list_files(path: str) -> str:
    """Returns at most 4000 characters of output."""
    ...
```

Truncation happens after serialization, before the result is handed to the model.
## Automatic output offload

Even with `truncate` unset, the runtime guards against runaway tool output. When a serialized return value exceeds 30,000 characters, the agent loop writes the full content to `<runtime-working-dir>/.dreadnode/tool-output/<tool-call-id>.txt` and replaces the in-context result with a middle-out summary — the first 15K characters, a `[... N lines truncated — full output saved to <path>] ...` marker, then the last 15K. The model sees the path and can read the file with the standard file-read tool if it needs the full output.

This is automatic; tools don't need to opt in. Set `truncate=` explicitly when you want a tighter cap or know the model never needs the long-tail content.
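The middle-out shape described above can be sketched like this. `offload_output` is a hypothetical illustration of the documented behavior, not the agent loop's actual code:

```python
from pathlib import Path

OFFLOAD_THRESHOLD = 30_000  # characters before offload kicks in
HEAD_TAIL = 15_000          # characters kept at each end

def offload_output(output: str, call_id: str, out_dir: Path) -> str:
    """Write oversized output to disk; return the middle-out summary."""
    if len(output) <= OFFLOAD_THRESHOLD:
        return output
    out_dir.mkdir(parents=True, exist_ok=True)
    path = out_dir / f"{call_id}.txt"
    path.write_text(output)  # full content survives on disk
    dropped = output[HEAD_TAIL:-HEAD_TAIL].count("\n")
    marker = f"[... {dropped} lines truncated — full output saved to {path}] ..."
    return output[:HEAD_TAIL] + marker + output[-HEAD_TAIL:]
```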
## Stateful toolsets

Use `Toolset` when a group of tools shares state — an HTTP session, a cache, a client:
```python
import typing as t

import dreadnode

class IntelTools(dreadnode.Toolset):
    def __init__(self) -> None:
        self.cache: dict[str, str] = {}

    @dreadnode.tool_method
    def lookup(
        self,
        indicator: t.Annotated[str, "Indicator to investigate"],
    ) -> dict[str, str]:
        """Look up an indicator."""
        if indicator in self.cache:
            return {"indicator": indicator, "verdict": self.cache[indicator]}
        verdict = "unknown"
        self.cache[indicator] = verdict
        return {"indicator": indicator, "verdict": verdict}
```

Every method decorated with `@dreadnode.tool_method` becomes a tool. The instance is constructed once per capability load — state lives for the runtime's lifetime.
`@tool_method` accepts the same `catch` and `truncate` arguments as `@tool`.
`Toolset` subclasses must construct with no arguments — the loader calls `MyToolset()` directly and skips any class that raises `TypeError`. If your `Toolset` requires constructor parameters, it will be silently dropped from the capability.
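The skip-on-`TypeError` behavior can be sketched as follows; `construct_toolsets` is an illustrative stand-in for the loader, not its actual implementation:

```python
from typing import Iterable

def construct_toolsets(classes: Iterable[type]) -> list[object]:
    """Instantiate each class with no arguments, the way the loader does.

    A TypeError (e.g. from a required constructor parameter) silently
    drops that class — no warning, no error.
    """
    instances = []
    for cls in classes:
        try:
            instances.append(cls())
        except TypeError:
            continue  # silently dropped, as documented
    return instances
```

This is why a `Toolset` that grows a required `__init__` argument simply disappears from the capability rather than failing loudly.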
## Async resources in toolsets

The loader instantiates `Toolset` subclasses synchronously and never enters an async context. So if your tools need an async resource (an `httpx.AsyncClient`, a database connection pool, a long-lived MCP client), construct it lazily on first use — not in `__init__`:
```python
import typing as t

import httpx
from pydantic import PrivateAttr

import dreadnode

class HttpTools(dreadnode.Toolset):
    _client: httpx.AsyncClient | None = PrivateAttr(default=None)

    def _ensure_client(self) -> httpx.AsyncClient:
        if self._client is None:
            self._client = httpx.AsyncClient(timeout=30)
        return self._client

    @dreadnode.tool_method
    async def fetch(
        self,
        url: t.Annotated[str, "URL to fetch"],
    ) -> str:
        """Fetch a URL and return the body."""
        response = await self._ensure_client().get(url)
        response.raise_for_status()
        return response.text
```

Use `PrivateAttr` for runtime-only state — Pydantic skips it during validation, which keeps the toolset constructible with no arguments.
## When not to use a Python tool

Reach for MCP instead when the implementation is:
- a shell command
- a service in Node, Go, or any non-Python runtime
- a remote API you want to run out-of-process
- a third-party tool with its own lifecycle
That keeps capability tooling split cleanly: Python-native logic via `@tool` and `Toolset`, everything else via MCP servers.
## Reference

The full `@tool`, `Tool`, and `Toolset` API — including `Component`, `Context` injection, and serialization details — lives at `dreadnode.tools`.