
npx dbdock migrate <source-url> <target-url> [options]
migrate is the main cross-database migration command. DBdock analyzes the source, generates a schema mapping, shows it to you, and waits for confirmation before touching anything.
Run dbdock analyze on the source first. It’s read-only and tells you what you’re about to migrate.
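
A quick read-only check might look like this (a sketch assuming analyze accepts a single connection URL, mirroring the migrate examples below):

npx dbdock analyze "mongodb://localhost:27017/myapp"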

Examples

MongoDB → PostgreSQL

npx dbdock migrate \
  "mongodb://localhost:27017/myapp" \
  "postgresql://user:pass@localhost:5432/myapp"

PostgreSQL → MongoDB

npx dbdock migrate \
  "postgresql://user:pass@localhost:5432/myapp" \
  "mongodb://localhost:27017/myapp"

Options

Option                      Description
--dry-run                   Run against a temporary schema/collection prefix for validation
--incremental               Only migrate new/changed data (needs --since)
--since <date>              Cutoff date for incremental (ISO format)
--config <path>             Use a saved migration config file
--export-config <path>      Export the generated plan to a config file
--batch-size <number>       Documents per batch (default 1000)
--max-depth <number>        Max nesting depth before jsonb (default 2)
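
Options can be combined on the command line. As a sketch (here $MONGO and $PG stand for the source and target URLs, the date and batch size are placeholders, and combining these particular flags is an assumption rather than something documented above):

# validate against a temporary prefix first
npx dbdock migrate "$MONGO" "$PG" --dry-run

# then pull only data changed since a cutoff, in smaller batches
npx dbdock migrate "$MONGO" "$PG" --incremental --since 2024-01-01 --batch-size 500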

The confirmation flow

Analyzing source...
  Collections: 4
  Documents:   65,360
  Size:        142.8 MB

Generating schema mapping...

Proposed mapping:
  users      (12,450 docs)  →  public.users      (8 columns)
  orders     (48,910 docs)  →  public.orders     (12 columns + 2 fk)
  products   (820 docs)     →  public.products   (14 columns)
  reviews    (3,180 docs)   →  public.reviews    (6 columns + 1 fk)

Estimated duration: 3-5 minutes

? Proceed with migration? (y/N)

What happens under the hood

1. Connect to both databases: validates credentials and reachability.
2. Analyze source: same logic as dbdock analyze (types, nesting, inconsistencies).
3. Generate mapping: proposes the target schema based on the source shape and --max-depth.
4. Detect references: scans for fields that look like references between collections/tables.
5. Show plan and wait: nothing is written yet. You see the full plan and confirm.
6. Create target schema: creates tables/collections. Idempotent; existing ones are skipped.
7. Migrate in batches: streams data in configurable batches. A progress bar shows rate and ETA.
8. Collect errors: failed rows go to _migration_errors with the error message (see the query sketch after this list).
9. Report: summary of rows migrated, rows failed, and duration.
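
If your target is PostgreSQL, you can inspect the failures afterwards with an ordinary query. The exact columns of _migration_errors aren't documented here, so treat this as a sketch:

psql "$PG" -c "SELECT * FROM _migration_errors LIMIT 10;"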

Reusing a mapping

Generate and save the plan once:
npx dbdock migrate "$MONGO" "$PG" --export-config ./my-migration.json
Edit the file to customize the schema. Then:
npx dbdock migrate "$MONGO" "$PG" --config ./my-migration.json
Useful when you want the same migration to run in CI or against multiple environments.
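
For example, a CI job might validate the saved plan against a staging environment before the real run (the STAGING_* variable names are illustrative, and combining --config with --dry-run is an assumption):

npx dbdock migrate "$STAGING_MONGO_URL" "$STAGING_PG_URL" \
  --config ./my-migration.json --dry-run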

Tuning

Large collections

If a collection has millions of documents, lower the batch size to reduce memory usage and commit more frequently:
npx dbdock migrate "$MONGO" "$PG" --batch-size 500

Deeply nested documents

--max-depth controls how many levels of nesting DBdock flattens into columns before falling back to jsonb. If your documents have complex nesting you don't want flattened, lower it:
npx dbdock migrate "$MONGO" "$PG" --max-depth 1
With --max-depth 1, anything nested deeper than one level stays as jsonb, which is usually the right default. Raising the limit flattens more, but past depth 2 or 3 you're generally better off keeping the data as jsonb.

See also

Dry runs: validate before committing.
Schema mapping: type conversion details.
Incremental: pull only new/changed data.