Documentation Index
Fetch the complete documentation index at: https://mintlify.com/remix-run/remix/llms.txt
Use this file to discover all available pages before exploring further.
data-table includes a first-class migration system with schema builders, runner controls, and dry-run planning.
Setup
Create a migrations directory and runner script:
app/
  db/
    migrations/
      20260228090000_create_users.ts
      20260301113000_add_user_status.ts
    migrate.ts
- Keep migration files in one directory (e.g., app/db/migrations)
- Name each file as YYYYMMDDHHmmss_name.ts (or .js, .mjs, .cjs, .cts)
- Each file must default-export createMigration(...)

id and name are inferred from the filename
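The inference follows directly from the filename pattern; a minimal sketch of that convention (the actual loader's parsing may differ):

```typescript
// Sketch: split "<timestamp>_<name>.<ext>" into id and name.
// The 14-digit prefix and extension list mirror the convention above.
const MIGRATION_FILE = /^(\d{14})_(.+)\.(ts|js|mjs|cjs|cts)$/

function parseMigrationFilename(filename: string): { id: string; name: string } {
  let match = MIGRATION_FILE.exec(filename)
  if (!match) throw new Error(`Not a migration filename: ${filename}`)
  return { id: match[1], name: match[2] }
}

console.log(parseMigrationFilename('20260228090000_create_users.ts'))
// → { id: '20260228090000', name: 'create_users' }
```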
Migration File
import { column as c, table } from 'remix/data-table'
import { createMigration } from 'remix/data-table/migrations'
let users = table({
  name: 'users',
  columns: {
    id: c.integer().primaryKey().autoIncrement(),
    email: c.varchar(255).notNull().unique(),
    created_at: c.timestamp({ withTimezone: true }).defaultNow(),
  },
})

export default createMigration({
  async up({ db, schema }) {
    await schema.createTable(users)
    await schema.createIndex(users, 'email', { unique: true })

    if (db.adapter.dialect === 'sqlite') {
      await db.exec('pragma foreign_keys = on')
    }
  },
  async down({ schema }) {
    await schema.dropTable(users, { ifExists: true })
  },
})
Migration Context
The up and down handlers receive a context object:
type MigrationContext = {
  db: Database // Data manipulation runtime (query/create/update/exec/transaction)
  schema: MigrationSchema // Schema runtime (createTable/alterTable/createIndex/...)
}
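Because both runtimes share one context, a schema change and the data backfill it requires can live in the same migration. A hypothetical sketch (the users table, status column, and 'active' default are illustrative, not part of the API):

```typescript
import { column as c } from 'remix/data-table'
import { createMigration } from 'remix/data-table/migrations'

export default createMigration({
  async up({ db, schema }) {
    // DDL goes through the schema runtime...
    await schema.alterTable('users', (table) => {
      table.addColumn('status', c.varchar(50))
    })

    // ...while data fixes go through the db runtime.
    await db.exec(`update users set status = 'active' where status is null`)
  },
  async down({ schema }) {
    await schema.alterTable('users', (table) => {
      table.dropColumn('status', { ifExists: true })
    })
  },
})
```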
Runner Script
import path from 'node:path'
import { Pool } from 'pg'
import { createPostgresDatabaseAdapter } from 'remix/data-table-postgres'
import { createMigrationRunner } from 'remix/data-table/migrations'
import { loadMigrations } from 'remix/data-table/migrations/node'
let directionArg = process.argv[2] ?? 'up'
let direction = directionArg === 'down' ? 'down' : 'up'
let to = process.argv[3]
let pool = new Pool({ connectionString: process.env.DATABASE_URL })
let adapter = createPostgresDatabaseAdapter(pool)
let migrations = await loadMigrations(path.resolve('app/db/migrations'))
let runner = createMigrationRunner(adapter, migrations)
try {
  let result = direction === 'up' ? await runner.up({ to }) : await runner.down({ to })

  console.log(direction + ' complete', {
    applied: result.applied.map((entry) => entry.id),
    reverted: result.reverted.map((entry) => entry.id),
  })
} finally {
  await pool.end()
}
Custom Journal Table
By default, migrations are tracked in a table named data_table_migrations. Customize it with the journalTable option:
let runner = createMigrationRunner(adapter, migrations, {
  journalTable: 'app_migrations',
})
Running Migrations
# Apply all pending migrations
node ./app/db/migrate.ts up
# Apply migrations up to a specific ID
node ./app/db/migrate.ts up 20260301113000
# Revert all applied migrations
node ./app/db/migrate.ts down
# Revert migrations down to a specific ID
node ./app/db/migrate.ts down 20260228090000
Step-based Migration
Apply or revert a specific number of migrations:
// Apply next 1 migration
await runner.up({ step: 1 })
// Revert last 1 migration
await runner.down({ step: 1 })
to and step are mutually exclusive.
Dry Run
Compile and inspect SQL without applying:
let dryRunResult = await runner.up({ dryRun: true })
console.log(dryRunResult.sql)
Schema API
The schema object provides DDL operations:
Create Table
import { column as c, table } from 'remix/data-table'
let users = table({
  name: 'users',
  columns: {
    id: c.integer().primaryKey().autoIncrement(),
    email: c.varchar(255).notNull().unique(),
    created_at: c.timestamp({ withTimezone: true }).defaultNow(),
  },
})
await schema.createTable(users)
await schema.createTable(users, { ifNotExists: true })
Alter Table
await schema.alterTable(users, (table) => {
  // Add column
  table.addColumn('status', c.varchar(50).defaultTo('active'))

  // Change column
  table.changeColumn('email', c.varchar(320).notNull())

  // Rename column
  table.renameColumn('email', 'email_address')

  // Drop column
  table.dropColumn('old_field', { ifExists: true })

  // Add primary key
  table.addPrimaryKey('id')
  table.addPrimaryKey(['tenant_id', 'id']) // compound

  // Drop primary key
  table.dropPrimaryKey('users_pkey')

  // Add unique constraint
  table.addUnique('email')
  table.addUnique(['tenant_id', 'email'], { name: 'users_tenant_email_unique' })

  // Drop unique constraint
  table.dropUnique('users_email_unique')

  // Add foreign key
  table.addForeignKey('user_id', 'users', 'id')
  table.addForeignKey('user_id', 'users', 'id', { onDelete: 'cascade', onUpdate: 'cascade' })

  // Drop foreign key
  table.dropForeignKey('orders_user_id_fkey')

  // Add check constraint
  table.addCheck('total >= 0', { name: 'positive_total' })

  // Drop check constraint
  table.dropCheck('positive_total')

  // Add index
  table.addIndex('email', { unique: true })
  table.addIndex(['status', 'created_at'])

  // Drop index
  table.dropIndex('users_email_idx')

  // Add comment
  table.comment('User accounts table')
})
Rename Table
await schema.renameTable('users', 'app_users')
await schema.renameTable(users, 'app_users')
Drop Table
await schema.dropTable('users')
await schema.dropTable(users, { ifExists: true })
await schema.dropTable(users, { cascade: true })
Create Index
await schema.createIndex(users, 'email', { unique: true })
await schema.createIndex(users, ['status', 'created_at'])
await schema.createIndex(users, 'email', { ifNotExists: true, name: 'users_email_idx' })
Drop Index
await schema.dropIndex(users, 'users_email_idx')
await schema.dropIndex(users, 'users_email_idx', { ifExists: true })
Rename Index
await schema.renameIndex(users, 'old_idx', 'new_idx')
Foreign Keys
// Add foreign key
await schema.addForeignKey(orders, 'user_id', users, 'id')
await schema.addForeignKey(orders, 'user_id', users, 'id', {
  onDelete: 'cascade',
  onUpdate: 'restrict',
})

// Compound foreign key
await schema.addForeignKey(order_items, ['tenant_id', 'order_id'], orders, ['tenant_id', 'id'])

// Drop foreign key
await schema.dropForeignKey(orders, 'orders_user_id_fkey')
Check Constraints
// Add check
await schema.addCheck(orders, 'total >= 0', { name: 'positive_total' })
// Drop check
await schema.dropCheck(orders, 'positive_total')
Raw SQL in Migrations
import { sql } from 'remix/data-table'
await schema.plan(sql`update users set status = ${'active'} where status is null`)
Schema Introspection
Run defensive checks inside migrations:
// Check if table exists
if (await schema.hasTable('users')) {
  await schema.alterTable('users', (table) => {
    // ...
  })
}

// Check if column exists
if (await schema.hasColumn('users', 'old_field')) {
  await schema.alterTable('users', (table) => {
    table.dropColumn('old_field')
  })
}
In dryRun mode, introspection methods check the live database state, not the pending plan.
Constraint Naming
Constraint and index names are optional. When omitted, data-table generates deterministic names:
// Explicit name
table.addUnique('email', { name: 'users_email_unique' })
// Generated name (e.g., "users_email_key")
table.addUnique('email')
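A sketch of what such deterministic generation might look like; the exact scheme data-table uses is not documented here, so treat the join rule below as an assumption:

```typescript
// Assumed scheme: table name + column names + kind suffix, joined with underscores.
function generatedConstraintName(tableName: string, columns: string | string[], kind: string): string {
  let cols = Array.isArray(columns) ? columns : [columns]
  return [tableName, ...cols, kind].join('_')
}

console.log(generatedConstraintName('users', 'email', 'key'))
// → 'users_email_key'
console.log(generatedConstraintName('users', ['tenant_id', 'email'], 'key'))
// → 'users_tenant_id_email_key'
```

Deterministic names matter because down migrations and later drop* calls must refer to constraints and indexes by string.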
Transaction Mode
Control migration transaction behavior:
export default createMigration({
  async up({ schema }) {
    await schema.createTable(users)
  },
  async down({ schema }) {
    await schema.dropTable(users)
  },
  transaction: 'required', // 'auto' | 'required' | 'none'
})
'auto' (default) - Use transactions when adapter supports them
'required' - Fail if adapter doesn’t support transactions
'none' - Never use transactions
Non-Filesystem Runtimes
For environments without filesystem access, register migrations manually:
import { createMigrationRegistry, createMigrationRunner } from 'remix/data-table/migrations'
import createUsers from './db/migrations/20260228090000_create_users.ts'
let registry = createMigrationRegistry()
registry.register({ id: '20260228090000', name: 'create_users', migration: createUsers })
let runner = createMigrationRunner(adapter, registry)
await runner.up()
Migration Status
Check migration status:
let status = await runner.status()

for (let entry of status) {
  console.log(entry)
  // { id: '20260228090000', name: 'create_users', status: 'applied', appliedAt: Date, batch: 1 }
  // { id: '20260301113000', name: 'add_user_status', status: 'pending' }
}
Status values:
'applied' - Migration has been applied
'pending' - Migration has not been applied
'drifted' - Migration checksum doesn’t match (file changed after applying)
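The status list is enough to build a simple CI guard that fails when anything is pending or drifted. A sketch over hardcoded example entries (a real script would call runner.status() instead):

```typescript
type MigrationStatus = 'applied' | 'pending' | 'drifted'
type StatusEntry = { id: string; name: string; status: MigrationStatus }

// Example data shaped like the runner.status() output above.
let entries: StatusEntry[] = [
  { id: '20260228090000', name: 'create_users', status: 'applied' },
  { id: '20260301113000', name: 'add_user_status', status: 'pending' },
]

let unapplied = entries.filter((entry) => entry.status !== 'applied')

if (unapplied.length > 0) {
  console.error('Unapplied or drifted migrations:', unapplied.map((entry) => entry.id).join(', '))
  // A real CI script would exit nonzero here: process.exit(1)
}
```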
Type Exports
import type {
  Migration,
  MigrationContext,
  MigrationRunner,
  MigrationSchema,
  MigrationStatusEntry,
  AlterTableBuilder,
} from 'remix/data-table/migrations'