Run `npx dbdock test` to check your database, storage, and alert configuration in one go.
## Configuration Issues

### Secrets in Config File
If you have secrets (passwords, keys) stored directly in `dbdock.config.json`, migrate them to environment variables:
- Extract all secrets from your config file
- Create or update `.env` with the secrets
- Remove the secrets from `dbdock.config.json`
- Update `.gitignore` automatically
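For example, moving a database password into `.env` by hand (the value shown is a placeholder):

```shell
# Append the secret to .env (placeholder value; use your real password)
printf 'DBDOCK_DB_PASSWORD=example-password\n' >> .env

# Make sure .env is never committed
grep -qx '.env' .gitignore 2>/dev/null || printf '.env\n' >> .gitignore
```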
### Missing Environment Variables
DBDock requires certain environment variables to be set. Ensure you have:
- Database: either `DBDOCK_DB_URL` or `DATABASE_URL` (a full URL, e.g. `postgresql://user:pass@host:5432/db`), or `DBDOCK_DB_PASSWORD` with host/port/user/database set in the config or env (`DB_HOST`, `DB_PORT`, `DB_USER`, `DB_NAME`)
- `DBDOCK_STORAGE_ACCESS_KEY` and `DBDOCK_STORAGE_SECRET_KEY` for cloud storage
- `DBDOCK_ENCRYPTION_SECRET` if encryption is enabled

Check that your `.env` file exists and contains the required variables.
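A minimal `.env` sketch combining the variables above (all values are placeholders; include only the lines your setup needs):

```shell
# .env — placeholder values; adjust for your environment
DBDOCK_DB_URL=postgresql://user:pass@localhost:5432/mydb
# For cloud storage:
DBDOCK_STORAGE_ACCESS_KEY=your-access-key
DBDOCK_STORAGE_SECRET_KEY=your-secret-key
# Only if encryption is enabled (must be exactly 64 hex characters):
DBDOCK_ENCRYPTION_SECRET=0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```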
## Common Issues
### pg_dump / pg_restore / psql not found

You need the PostgreSQL client tools installed.

### Database connection errors
Can’t connect to your database?
- Double-check `host`, `port`, `username`, `password`, and `database` in the config (or use `DBDOCK_DB_URL` / `DATABASE_URL`)
- Test the connection manually with `psql`
- Make sure the PostgreSQL server is actually running
- Check firewalls / security groups if it’s a remote database
- Use `.pgpass`: for enhanced security, store the password in `.pgpass` instead of env vars
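Two quick checks, sketched with placeholder credentials (adjust host, user, and database to your setup): a manual connection test with `psql`, and a `.pgpass` entry so the password stays out of env vars.

```shell
# Manual connection test (same URL form DBDock accepts; requires psql):
#   psql "postgresql://user:pass@localhost:5432/mydb" -c 'SELECT 1;'

# .pgpass format is hostname:port:database:username:password, one entry per line
printf 'localhost:5432:mydb:user:pass\n' >> ~/.pgpass
chmod 600 ~/.pgpass   # psql ignores .pgpass unless permissions are 0600
```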
## Storage Errors

### AWS S3
- **Verify credentials**: ensure `DBDOCK_STORAGE_ACCESS_KEY` and `DBDOCK_STORAGE_SECRET_KEY` are set in your `.env` file and are correct
- **Check IAM permissions**: your IAM user must have these permissions:
  - `s3:PutObject`: upload backups
  - `s3:GetObject`: download backups
  - `s3:ListBucket`: list available backups
  - `s3:DeleteObject`: delete old backups
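A minimal IAM policy sketch granting those four actions (the bucket name is a placeholder; `ListBucket` applies to the bucket itself, the object actions to its contents):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-backup-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-backup-bucket"
    }
  ]
}
```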
### Cloudflare R2
- **Verify credentials**: ensure `DBDOCK_STORAGE_ACCESS_KEY` and `DBDOCK_STORAGE_SECRET_KEY` are set in your `.env` file
- **Check endpoint URL**: must be in the format `https://ACCOUNT_ID.r2.cloudflarestorage.com`
- **Bucket access**: ensure the bucket exists and is accessible
- **Permissions**: verify the R2 credentials have read/write permissions
- **Restore issues**: ensure backups are in the `dbdock_backups/` folder with a `.sql` extension
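A quick sanity check of the endpoint format (the account ID shown is a made-up placeholder; real ones are lowercase hex):

```shell
endpoint="https://0123456789abcdef0123456789abcdef.r2.cloudflarestorage.com"
if echo "$endpoint" | grep -Eq '^https://[0-9a-f]+\.r2\.cloudflarestorage\.com$'; then
  echo "endpoint format looks valid"
else
  echo "endpoint format looks wrong"
fi
```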
### Cloudinary
- **Verify credentials**: ensure `DBDOCK_CLOUDINARY_API_KEY` and `DBDOCK_CLOUDINARY_API_SECRET` are set in your `.env` file
- **Check cloud name**: verify the cloud name is correct in `dbdock.config.json`
- **Account status**: ensure your Cloudinary account is active
- **API access**: verify the API credentials have media library access permissions
## Encryption Key Errors
Encryption keys must be exactly 64 hexadecimal characters (0-9, a-f, A-F). Generate a valid key and store it in your `.env` file as `DBDOCK_ENCRYPTION_SECRET`:
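One way to generate such a key, assuming `openssl` is available (32 random bytes hex-encoded is exactly 64 hex characters):

```shell
# Generate a 64-hex-character key and append it to .env
key=$(openssl rand -hex 32)
echo "DBDOCK_ENCRYPTION_SECRET=$key" >> .env
echo "generated a ${#key}-character key"
```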
Store your encryption key securely. If you lose it, you won’t be able to restore encrypted backups.
## No Backups Found

If DBDock can’t find your backups:

### Local Storage
- Check that files exist in the configured path
- Verify file permissions allow reading
- Ensure files match the pattern `backup-*.sql`
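A quick local check, assuming backups live in `./backups` (substitute the path from your config):

```shell
# List backup files matching the expected pattern, if any
backup_dir=./backups
ls "$backup_dir"/backup-*.sql 2>/dev/null || echo "no files matching backup-*.sql in $backup_dir"
```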
### S3/R2
- Verify files are in the `dbdock_backups/` folder
- Check that the bucket name and region are correct
- Ensure files match the pattern `backup-*.sql`
### Cloudinary
- Check the Media Library for a `dbdock_backups` folder
- Verify the cloud name is correct
- Ensure files match the pattern `backup-*.sql`
## File Naming
All backups must follow the naming pattern `backup-YYYY-MM-DD-HH-MM-SS-BACKUPID.sql`.

## Getting Help
DBDock provides clear, actionable error messages for most issues. If you’re still experiencing problems:
- 💬 **Ask in Discussions**: get help from the community
- 🐛 **Report an Issue**: report bugs or request features
- 📚 **Read the Docs**: browse the full documentation
