
DBDock

Stop writing backup scripts. Stop losing sleep over database migrations. DBDock handles PostgreSQL backups, restores, database copies, and cross-database migrations between MongoDB and PostgreSQL in one command.

📚 Full Documentation | 💬 Discussions | 🐛 Issues

The Problem

Every time you need to backup a database, copy it to staging, or restore before a migration — it’s the same boring steps. Connect, dump, upload, move files around, remember the right flags. It’s not hard. It’s just repetitive. And repetitive stuff should be one command.

The Fix

npx dbdock init                               # One-time setup (takes 30 seconds)
npx dbdock backup                             # Backup with encryption + compression
npx dbdock restore                            # Restore from any backup
npx dbdock copydb "db_url_1" "db_url_2"       # Copy entire database, zero config
npx dbdock migrate "mongo_url" "postgres_url" # Cross-database migration
That’s it. No shell scripts. No manual uploads. No throwaway migration code.

Quick Start

npx dbdock init      # Interactive setup
npx dbdock backup    # Create backup
npx dbdock restore   # Restore backup

Features

  • Beautiful CLI - Real-time progress bars, speed tracking, smart filtering
  • Multiple Storage - Local, AWS S3, Cloudflare R2, Cloudinary
  • Security First - Hybrid config (env vars for secrets, or full URL via DBDOCK_DB_URL), AES-256 encryption, credential masking, .pgpass support
  • Retention Policies - Automatic cleanup by count/age with safety nets
  • Smart UX - Intelligent filtering for 100+ backups, clear error messages
  • Alerts - Email (SMTP) and Slack notifications for backups (CLI & Programmatic)
  • TypeScript Native - Full type safety for programmatic usage
  • Automation - Cron schedules, auto-cleanup after backups
  • Migration Tool - One command to migrate legacy configs to secure env vars
  • Cross-Database - MongoDB ↔ PostgreSQL migration with schema mapping and dry run

Installation

Global Installation (Recommended):
npm install -g dbdock

dbdock init      # Use directly
dbdock backup
dbdock status
Or use with npx (No installation needed):
npx dbdock init
npx dbdock backup
npx dbdock status

CLI Commands

dbdock init — Set up in 30 seconds

Run once. It walks you through everything interactively:
npx dbdock init
It asks for your database connection, picks your storage (Local, S3, R2, Cloudinary), sets up encryption if you want it, and optionally configures Slack/Email alerts.
What happens under the hood:
  • Config (safe stuff) goes to dbdock.config.json — commit this
  • Secrets go to .env — never committed, .gitignore updated automatically
You can also run without a config file: set DBDOCK_DB_URL (or DATABASE_URL) and other env vars and DBDock will use env-only configuration.

dbdock migrate-config — Fix legacy configs

npx dbdock migrate-config
Got secrets sitting in dbdock.config.json from an older version? This extracts them to .env, cleans up your config, and updates .gitignore. One command, done.

npx dbdock backup — One command, full backup

npx dbdock backup
Real-time progress. Streams directly to your storage provider. Done.
████████████████████ | 100% | 45.23 MB | Speed: 12.50 MB/s | Uploading to S3
✔ Backup completed successfully
Options:
npx dbdock backup --encrypt --compress --compression-level 9
  • --encrypt / --no-encrypt - Toggle encryption
  • --compress / --no-compress - Toggle compression
  • --encryption-key <key> - Encryption key; must be exactly 64 hexadecimal characters (32 bytes)
  • --compression-level <1-11> - Compression level (default: 6)
Generate encryption key:
node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
Backup Formats:
  • custom (default) - PostgreSQL custom binary format (.sql)
  • plain - Plain SQL text format (.sql)
  • directory - Directory format (.dir)
  • tar - Tar archive format (.tar)

npx dbdock restore — Interactive restore with smart filtering

Progress:
────────────────────────────────────────────────────────
  ✔ Downloading backup
  ✔ Decrypting data
  ✔ Decompressing data
  ⟳ Restoring to database...
────────────────────────────────────────────────────────
✔ All steps completed in 8.42s
Got 200+ backups? It auto-enables smart filtering — search by date, keyword, or just grab the most recent ones. No scrolling through walls of text.
Migration Support: You can choose to restore to a New Database Instance during the restore process. This is perfect for migrating data between servers (e.g., from staging to production or local to cloud).
  1. Run npx dbdock restore
  2. Select a backup
  3. Choose “New Database Instance (Migrate)”
  4. Enter connection details for the target database
Shows database stats and requires confirmation before restore.

npx dbdock copydb — Copy a database with just two URLs

This is the one people love. No config files. No setup. Just paste two PostgreSQL URLs:
npx dbdock copydb "postgresql://user:pass@source:5432/mydb" "postgresql://user:pass@target:5432/mydb"
It tests both connections, shows you the source DB size and table count, warns you if the target has existing data, and asks for confirmation before doing anything. Then it streams pg_dump directly into pg_restore — no temp files, no waiting.
Options: --schema-only (tables, indexes, constraints — no data), --data-only, --verbose
Environment consolidation:
npx dbdock copydb "prod_url" "staging_url"
npx dbdock copydb "staging_url" "prod_url"
npx dbdock copydb "prod_url" "postgresql://postgres:pass@localhost:5432/myapp"
npx dbdock copydb --schema-only "prod_url" "staging_url"
Perfect for moving between Neon, Supabase, Railway, RDS, or any Postgres host.

npx dbdock list — See all your backups

npx dbdock list                  # Everything
npx dbdock list --recent 10      # Last 10
npx dbdock list --search keyword # Find specific backup
npx dbdock list --days 7         # Last 7 days
Auto-filters when you have 50+ backups so the output stays clean.
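The filters compose like simple predicates over backup metadata. A rough JavaScript equivalent of what `--recent`, `--search`, and `--days` do (the `{ id, createdAt }` shape is an assumption for illustration, not DBDock's actual metadata type):

```javascript
// Hypothetical backup metadata: { id, createdAt } with createdAt in epoch ms.
function filterBackups(backups, { recent, search, days } = {}) {
  // Newest first, like the CLI listing.
  let result = [...backups].sort((a, b) => b.createdAt - a.createdAt);
  if (days) {
    const cutoff = Date.now() - days * 24 * 60 * 60 * 1000;
    result = result.filter((b) => b.createdAt >= cutoff);
  }
  if (search) {
    result = result.filter((b) => b.id.includes(search));
  }
  if (recent) {
    result = result.slice(0, recent);
  }
  return result;
}
```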

npx dbdock delete — Remove backups

npx dbdock delete              # Interactive picker
npx dbdock delete --key <id>   # Delete specific backup
npx dbdock delete --all        # Nuke everything (with confirmation)

npx dbdock cleanup — Auto-clean old backups

npx dbdock cleanup              # Interactive with preview
npx dbdock cleanup --dry-run    # See what would be deleted
npx dbdock cleanup --force      # Skip confirmation
Shows you exactly what gets deleted and how much space you reclaim before doing anything.

dbdock status — Check schedules and service health

dbdock status
Output:
📅 Scheduled Backups:

┌─────┬──────────────┬─────────────────┬──────────┐
│  #  │ Name         │ Cron Expression │ Status   │
├─────┼──────────────┼─────────────────┼──────────┤
│   1 │ daily        │ 0 0 * * *       │ ✓ Active │
│   2 │ weekly       │ 0 0 * * 0       │ ✗ Paused │
└─────┴──────────────┴─────────────────┴──────────┘

Total: 2 schedule(s) - 1 active, 1 paused

🚀 Service Status:

🟢 Running (PM2)
  PID: 12345
  Uptime: 2d 5h
  Memory: 45.23 MB

dbdock schedule — Manage cron schedules

dbdock schedule
Add, remove, or toggle backup schedules. Comes with presets (hourly, daily at midnight, daily at 2 AM, weekly, monthly) or use a custom cron expression.
Heads up: Schedules only run when DBDock is integrated into your Node.js app (see Programmatic Usage below). The CLI just manages the config.
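For reference, the presets correspond to standard five-field cron expressions. The mapping below is plain cron syntax; the object keys are paraphrased preset names, not DBDock identifiers:

```javascript
// Standard five-field cron expressions (minute hour day-of-month month day-of-week).
const CRON_PRESETS = {
  hourly: '0 * * * *',          // top of every hour
  dailyMidnight: '0 0 * * *',   // every day at 00:00
  daily2am: '0 2 * * *',        // every day at 02:00
  weekly: '0 0 * * 0',          // Sundays at midnight
  monthly: '0 0 1 * *',         // 1st of each month at midnight
};

console.log(CRON_PRESETS.weekly); // 0 0 * * 0
```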

dbdock test — Verify everything works

npx dbdock test
Tests your database connection, storage provider, and alert config. Run this first if something feels off.

Cross-Database Migration (MongoDB ↔ PostgreSQL)

Move data between MongoDB and PostgreSQL without throwaway scripts. DBDock analyzes the source, proposes a schema mapping, lets you review it, and handles the transfer.
npx dbdock analyze "mongodb://localhost:27017/myapp"
npx dbdock migrate "mongodb://localhost:27017/myapp" "postgresql://user:pass@localhost:5432/myapp"
Supports dry run (--dry-run), incremental sync (--incremental --since <date>), and export/import config (--export-config, --config). Failed rows go to _migration_errors; nothing executes without your confirmation. See the package README or CLI for full options.

Security Best Practices

Secure Configuration Management

DBDock uses a hybrid configuration approach to keep your secrets safe:
  • Non-sensitive settings → dbdock.config.json (safe for version control), or env vars
  • Sensitive secrets → Environment variables (e.g. DBDOCK_DB_URL, DBDOCK_DB_PASSWORD, storage keys — NEVER commit to git)
When you run npx dbdock init, DBDock automatically:
  • Saves credentials to .env (not committed)
  • Saves only non-sensitive config to dbdock.config.json
  • Updates .gitignore to exclude .env
Note: DBDock reads environment variables from both .env and .env.local files (with .env.local taking priority for local overrides). You can use either file depending on your workflow.

Environment Variables

Set these environment variables for secure credential management. You can use a full database URL for env-only config (no dbdock.config.json needed):
# Database: full URL (recommended for env-only) or separate password
DBDOCK_DB_URL=postgresql://user:password@host:5432/database
# or DATABASE_URL=... or DBDOCK_DB_PASSWORD=your-database-password

# Storage credentials (Required for cloud storage)
DBDOCK_STORAGE_ACCESS_KEY=your-access-key
DBDOCK_STORAGE_SECRET_KEY=your-secret-key

# Encryption key (Required if encryption enabled)
# Generate with: openssl rand -hex 32
DBDOCK_ENCRYPTION_SECRET=64-char-hex-string

# Email alerts (Optional)
DBDOCK_SMTP_USER=your-email@example.com
DBDOCK_SMTP_PASS=your-app-password

# Slack alerts (Optional)
DBDOCK_SLACK_WEBHOOK=https://hooks.slack.com/services/...
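When both a dedicated and a generic database URL variable are set, the dedicated one presumably wins. A sketch of that resolution order (inferred from the "or DATABASE_URL" wording above; the `resolveDbUrl` name is illustrative, and DBDock's internals may differ):

```javascript
// Illustrative precedence: DBDOCK_DB_URL, then DATABASE_URL, then nothing.
function resolveDbUrl(env) {
  return env.DBDOCK_DB_URL || env.DATABASE_URL || null;
}

console.log(resolveDbUrl({ DATABASE_URL: 'postgresql://app@db:5432/myapp' }));
// postgresql://app@db:5432/myapp
```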

Migration from Legacy Config

If you have an existing configuration with secrets in dbdock.config.json:
npx dbdock migrate-config
This command will:
  1. Extract all secrets from your config file
  2. Create/update .env with the secrets
  3. Remove secrets from dbdock.config.json
  4. Update .gitignore automatically

Using .pgpass for PostgreSQL

For enhanced security, use .pgpass instead of environment variables:
# Create the file
touch ~/.pgpass
chmod 600 ~/.pgpass

# Add your connection (format: host:port:database:username:password)
echo "localhost:5432:myapp:postgres:my-secure-password" >> ~/.pgpass
DBDock will automatically use .pgpass when available, which is more secure than PGPASSWORD environment variables.
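If you generate ~/.pgpass entries from code, note that per the PostgreSQL documentation any literal `:` or `\` inside a field must be backslash-escaped. A small helper sketch (not part of DBDock):

```javascript
// Escape ':' and '\' as the .pgpass format requires.
function pgpassEscape(field) {
  return String(field).replace(/\\/g, '\\\\').replace(/:/g, '\\:');
}

// Build one line: host:port:database:username:password
function pgpassLine({ host, port, database, username, password }) {
  return [host, port, database, username, password].map(pgpassEscape).join(':');
}

console.log(pgpassLine({
  host: 'localhost', port: 5432, database: 'myapp',
  username: 'postgres', password: 'p:ss',
}));
// localhost:5432:myapp:postgres:p\:ss
```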

Security Features

  • Automatic credential masking - All passwords and keys are masked in logs
  • File permission checking - Warns about insecure config file permissions
  • Encryption at rest - AES-256-GCM encryption for backups
  • Strict mode - Optional enforcement of env-only secrets (DBDOCK_STRICT_MODE=true)
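Masking in your own application logs follows the same idea: never print the password portion of a connection URL. A hedged sketch of the technique (the `maskDbUrl` helper is illustrative, not DBDock's API):

```javascript
// Replace the password portion of a connection URL with asterisks.
function maskDbUrl(url) {
  try {
    const u = new URL(url);
    if (u.password) u.password = '****';
    return u.toString();
  } catch {
    return '****'; // unparseable input: mask everything rather than risk leaking it
  }
}

console.log(maskDbUrl('postgresql://user:s3cret@db.example.com:5432/myapp'));
// postgresql://user:****@db.example.com:5432/myapp
```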

Configuration

After running npx dbdock init, a dbdock.config.json file is created (without sensitive data):
{
  "_comment": "Secrets (passwords, keys) are set via environment variables",
  "database": {
    "type": "postgres",
    "host": "localhost",
    "port": 5432,
    "username": "postgres",
    "database": "myapp"
  },
  "storage": {
    "provider": "s3",
    "s3": {
      "bucket": "my-backups",
      "region": "us-east-1"
    }
  },
  "backup": {
    "format": "custom",
    "compression": {
      "enabled": true,
      "level": 6
    },
    "encryption": {
      "enabled": true
    },
    "retention": {
      "enabled": true,
      "maxBackups": 100,
      "maxAgeDays": 30,
      "minBackups": 5,
      "runAfterBackup": true
    }
  },
  "alerts": {
    "email": {
      "enabled": true,
      "smtp": {
        "host": "smtp.gmail.com",
        "port": 587,
        "secure": false
      },
      "from": "backups@yourapp.com",
      "to": ["admin@yourapp.com"]
    }
  }
}
Note: SMTP credentials (user, pass) and storage secrets are set via environment variables for security.

Storage Providers

Local:
{ "storage": { "provider": "local", "local": { "path": "./backups" } } }
AWS S3:
{
  "storage": {
    "provider": "s3",
    "s3": {
      "bucket": "my-backups",
      "region": "us-east-1"
    }
  }
}
Set credentials via environment variables:
DBDOCK_STORAGE_ACCESS_KEY=your-access-key
DBDOCK_STORAGE_SECRET_KEY=your-secret-key
Required IAM permissions: s3:PutObject, s3:GetObject, s3:ListBucket, s3:DeleteObject
Cloudflare R2:
{
  "storage": {
    "provider": "r2",
    "s3": {
      "bucket": "my-backups",
      "region": "auto",
      "endpoint": "https://ACCOUNT_ID.r2.cloudflarestorage.com"
    }
  }
}
Set credentials via environment variables (same as S3 above).
Cloudinary:
{
  "storage": {
    "provider": "cloudinary",
    "cloudinary": {
      "cloudName": "your-cloud"
    }
  }
}
Set credentials via environment variables:
DBDOCK_CLOUDINARY_API_KEY=your-api-key
DBDOCK_CLOUDINARY_API_SECRET=your-api-secret
All cloud backups are stored in the dbdock_backups/ folder with the format: backup-YYYY-MM-DD-HH-MM-SS-BACKUPID.sql
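If you script directly against the bucket, the naming convention can be parsed with a regex. A sketch assuming the format shown above (whether the embedded timestamp is UTC or local time is an assumption here, so verify against your own backups):

```javascript
// Parse keys like: dbdock_backups/backup-2024-06-01-14-30-00-abc123.sql
const BACKUP_KEY =
  /^(?:dbdock_backups\/)?backup-(\d{4})-(\d{2})-(\d{2})-(\d{2})-(\d{2})-(\d{2})-([^.]+)\.sql$/;

function parseBackupKey(key) {
  const m = BACKUP_KEY.exec(key);
  if (!m) return null;
  const [, y, mo, d, h, mi, s, id] = m;
  // Assuming UTC; adjust if your backups use local time.
  return { id, createdAt: new Date(Date.UTC(+y, +mo - 1, +d, +h, +mi, +s)) };
}

console.log(parseBackupKey('dbdock_backups/backup-2024-06-01-14-30-00-abc123.sql'));
```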

Retention Policy

Backups pile up fast. Retention handles it automatically:
{
  "backup": {
    "retention": {
      "enabled": true,
      "maxBackups": 100,
      "maxAgeDays": 30,
      "minBackups": 5,
      "runAfterBackup": true
    }
  }
}
How it works:
  • Keeps most recent minBackups (safety net, never deleted)
  • Deletes backups exceeding maxBackups limit (oldest first)
  • Deletes backups older than maxAgeDays (respecting minBackups)
  • Runs automatically after each backup (if runAfterBackup: true)
  • Manual cleanup: npx dbdock cleanup
Safety features:
  • Always preserves minBackups most recent backups
  • Shows preview before deletion
  • Detailed logging of what was deleted
  • Error handling for failed deletions
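The rules above can be sketched as a pure function over backup timestamps. This is an illustrative reimplementation of the policy, not DBDock's actual code:

```javascript
// Return the backups the policy would delete. Backups are { id, createdAt } in epoch ms.
function planRetention(backups, { maxBackups, maxAgeDays, minBackups }, now = Date.now()) {
  const sorted = [...backups].sort((a, b) => b.createdAt - a.createdAt); // newest first
  const cutoff = now - maxAgeDays * 24 * 60 * 60 * 1000;
  const toDelete = [];
  sorted.forEach((b, i) => {
    if (i < minBackups) return; // safety net: most recent minBackups are never deleted
    if (i >= maxBackups || b.createdAt < cutoff) toDelete.push(b);
  });
  return toDelete;
}
```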

Programmatic Usage

Don’t just use the CLI — drop DBDock into your Node.js app and trigger backups from code. Works with any backend (Express, Fastify, NestJS, whatever). You don’t need to understand NestJS internals; DBDock provides a simple API.

Basic Setup

First, install DBDock:
npm install dbdock
Use dbdock.config.json (run npx dbdock init) or configure entirely via env vars (DBDOCK_DB_URL or DATABASE_URL plus storage and other env vars). DBDock reads from config file and/or environment automatically.

How It Works

DBDock uses a simple initialization pattern:
  1. Call createDBDock() to initialize DBDock (reads from dbdock.config.json and/or env vars)
  2. Get the BackupService from the returned context using .get(BackupService)
  3. Use the service methods to create backups, list backups, etc.
Think of createDBDock() as a factory function that sets up everything from your config and env vars.

Creating Backups

const { createDBDock, BackupService } = require('dbdock');

async function createBackup() {
  const dbdock = await createDBDock();
  const backupService = dbdock.get(BackupService);

  const result = await backupService.createBackup({
    format: 'plain',
    compress: true,
    encrypt: true,
  });

  console.log(`Backup created: ${result.metadata.id}`);
  console.log(`Size: ${result.metadata.formattedSize}`);
  console.log(`Path: ${result.storageKey}`);
  
  return result;
}

createBackup().catch(console.error);
Backup Options:
  • compress - Enable/disable compression (default: from config)
  • encrypt - Enable/disable encryption (default: from config)
  • format - Backup format: 'custom' (default), 'plain', 'directory', 'tar'
  • type - Backup type: 'full' (default), 'schema', 'data'

Listing Backups

const { createDBDock, BackupService } = require('dbdock');

async function listBackups() {
  const dbdock = await createDBDock();
  const backupService = dbdock.get(BackupService);

  const backups = await backupService.listBackups();

  console.log(`Found ${backups.length} backups:`);
  backups.forEach((backup) => {
    console.log(
      `- ${backup.id} (${backup.formattedSize}, created: ${backup.startTime})`
    );
  });

  return backups;
}

listBackups().catch(console.error);

Getting Backup Metadata

const { createDBDock, BackupService } = require('dbdock');

async function getBackupInfo(backupId) {
  const dbdock = await createDBDock();
  const backupService = dbdock.get(BackupService);

  const metadata = await backupService.getBackupMetadata(backupId);
  
  if (!metadata) {
    console.log('Backup not found');
    return null;
  }
  
  console.log('Backup details:', {
    id: metadata.id,
    size: metadata.size,
    created: metadata.startTime,
    encrypted: !!metadata.encryption,
    compressed: metadata.compression.enabled,
  });
  
  return metadata;
}

getBackupInfo('your-backup-id').catch(console.error);
Restore is CLI-only for now (npx dbdock restore). Programmatic restore is coming.

Schedule Backups with node-cron

DBDock stays lightweight: there is no built-in daemon. Use node-cron to run schedules yourself. First, install node-cron:
npm install node-cron
npm install --save-dev @types/node-cron
Then create a scheduler script (e.g., scheduler.ts):
import { createDBDock, BackupService } from 'dbdock';
import * as cron from 'node-cron';

async function startScheduler() {
  const dbdock = await createDBDock();
  const backupService = dbdock.get(BackupService);

  console.log('🚀 Backup scheduler started. Running every minute...');

  cron.schedule('* * * * *', async () => {
    try {
      console.log('\n⏳ Starting scheduled backup...');
      
      const result = await backupService.createBackup({
        format: 'plain',
        compress: true,
        encrypt: true,
      });

      console.log(`✅ Backup successful: ${result.metadata.id}`);
      console.log(`📦 Size: ${result.metadata.formattedSize}`);
      console.log(`📂 Path: ${result.storageKey}`);
    } catch (error) {
      console.error('❌ Backup failed:', error);
    }
  });
}

startScheduler().catch(console.error);
Note: The CLI dbdock schedule command manages configuration for external schedulers but does not run a daemon itself. Using node-cron as shown above is the recommended way to run scheduled backups programmatically.

Requirements

  • Node.js 18 or higher
  • PostgreSQL 12+
  • PostgreSQL client tools (pg_dump, pg_restore, psql)
Installing PostgreSQL client tools:
# macOS
brew install postgresql

# Ubuntu/Debian
sudo apt-get install postgresql-client

# Windows
# Download from https://www.postgresql.org/download/windows/

Troubleshooting

First step, always:
npx dbdock test
This tests your database, storage, and alert config in one go.

Common Issues

pg_dump not found:
# macOS
brew install postgresql

# Ubuntu/Debian
sudo apt-get install postgresql-client
Can’t connect to database:
  • Double-check host, port, username, password, database in config (or use DBDOCK_DB_URL / DATABASE_URL)
  • Test manually: psql -h HOST -p PORT -U USERNAME -d DATABASE
  • Make sure the PostgreSQL server is actually running
  • Check firewalls / security groups if it’s a remote database
Storage errors:
  • S3: Check credentials, bucket name, region. IAM user needs s3:PutObject, s3:GetObject, s3:ListBucket, s3:DeleteObject.
  • R2: Check endpoint format (https://ACCOUNT_ID.r2.cloudflarestorage.com), verify API token and bucket exist.
  • Cloudinary: Verify cloud name, API key, API secret. Make sure the account is active.
Encryption key issues: Key must be exactly 64 hex characters. Generate a valid one:
node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
No backups found during restore:
  • Local: Check the configured path has files
  • S3/R2: Files should be in dbdock_backups/ folder
  • Cloudinary: Check Media Library for dbdock_backups folder
  • Files should match: backup-*.sql

Documentation

📚 Full Documentation - Comprehensive guides, API reference, and examples

Support

  • 💬 Discussions - Ask questions and share ideas
  • 🐛 Issues - Report bugs and request features

License

MIT