## Remote Databases
You can configure the platform to use external database servers by updating the persistent environment configuration. Supported engines:

- PostgreSQL
- ClickHouse
To use a remote Postgres database (e.g., AWS RDS, Google Cloud SQL), set the Postgres connection details in the environment configuration.

**Requirement:** Ensure your Postgres server allows connections from the platform host.
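A minimal sketch of the relevant environment entries. The variable names below are assumptions for illustration, not the platform's documented keys — consult the environment reference for the exact names.

```shell
# Hypothetical variable names (assumptions) -- check the platform's
# environment reference for the exact keys. Values are placeholders.
POSTGRES_HOST=db.example.internal
POSTGRES_PORT=5432
POSTGRES_USER=platform
POSTGRES_PASSWORD=change-me
POSTGRES_DB=dreadnode
```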
## Backup & Restore
Logical backups (`pg_dump`, ClickHouse `BACKUP`) are recommended over volume snapshots for consistency.
### Postgres Backup
Run `pg_dump` from a temporary container against your database:
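A hedged sketch of that step, wrapped in a small helper function. The image tag, connection variables, and output file name are placeholders to adapt.

```shell
# Sketch only: dump the platform database from a throwaway container,
# so no Postgres client needs to be installed on the host.
# All connection values below are placeholders.
backup_postgres() {
  docker run --rm \
    -e PGPASSWORD="$POSTGRES_PASSWORD" \
    postgres:16 \
    pg_dump -h "$POSTGRES_HOST" -U "$POSTGRES_USER" -d "$POSTGRES_DB" \
    --format=custom > dreadnode.pg_dump
}
```

The custom format (`--format=custom`) produces a compressed archive that can later be restored selectively with `pg_restore`.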
### ClickHouse Backup
For ClickHouse, use the native `BACKUP` command to S3 or compatible storage:
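A hedged sketch of such a backup statement; the database name, bucket URL, and credentials are placeholders:

```sql
-- Sketch: back up one database to S3-compatible storage.
-- Replace the URL and credentials with your own values.
BACKUP DATABASE dreadnode
    TO S3('https://my-bucket.s3.amazonaws.com/backups/dreadnode', '<access-key-id>', '<secret-access-key>');
```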
## Air-Gapped Deployment

For environments without internet access, you must manually transfer the Docker images and configuration templates.

### 1. Prepare (On an Online Machine)
- **Install the SDK:** `pip install dreadnode`
- **Authenticate:** `dreadnode login`
- **Download Templates:** fetch the platform templates so they are cached under `~/.dreadnode/platform/<tag>/`.
- **Pull Images:** manually pull the images listed in `~/.dreadnode/platform/<tag>/docker-compose.yaml`.
- **Save Images:** export each pulled image to a `.tar.gz` archive (e.g., with `docker save`).
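The Pull and Save steps above can be sketched as one helper. Assumptions: the compose file lists images under `image:` keys, and `$TAG` stands in for your platform version tag.

```shell
# Pull every image referenced by the compose file, then archive each
# one as a .tar.gz for offline transfer. $TAG is a placeholder for the
# platform version tag.
save_platform_images() {
  compose="$HOME/.dreadnode/platform/$TAG/docker-compose.yaml"
  grep -E '^[[:space:]]*image:' "$compose" | awk '{print $2}' |
  while read -r img; do
    docker pull "$img"
    # "registry/name:tag" -> "registry_name_tag.tar.gz"
    docker save "$img" | gzip > "$(printf '%s' "$img" | tr '/:' '__').tar.gz"
  done
}
```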
### 2. Transfer & Load (On the Air-Gapped Machine)
- Transfer the `~/.dreadnode/platform` directory and the `.tar.gz` image files to the target machine.
- Load the images with `docker load`.
- Start the platform as usual; it will detect the existing templates and images.
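The image-loading step above can be sketched as a helper, assuming the archives sit in the current directory:

```shell
# Load every transferred image archive into the local Docker daemon.
load_platform_images() {
  for f in ./*.tar.gz; do
    [ -e "$f" ] || continue   # no archives found; nothing to do
    gunzip -c "$f" | docker load
  done
}
```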
## Hybrid Deployment Example
For a resilient production deployment, a hybrid approach is often best:

- **Compute:** Run the Dreadnode API & UI containers on your own compute instances.
- **Data:** Connect to managed cloud databases (RDS, ClickHouse Cloud).
- **Artifacts:** Store large artifacts in S3 (configure `S3_AWS_EXTERNAL_ENDPOINT_URL`).
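Only `S3_AWS_EXTERNAL_ENDPOINT_URL` is named in this guide; the endpoint value below is an illustrative placeholder for a standard AWS regional endpoint.

```shell
# Point artifact storage at your S3 endpoint -- the region and URL
# here are placeholders.
S3_AWS_EXTERNAL_ENDPOINT_URL=https://s3.us-east-1.amazonaws.com
```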

