## Command Style (Use PATH)

All examples use the plain command name `dbcli` (no directory prefix). Ensure `dbcli` is on `PATH` instead of hardcoding paths like `.\.claude\skills\dbcli\dbcli.exe`.
# DbCli Export Skill

Export table data as SQL INSERT statements. This is essential for creating backups before dangerous operations.
## When to Use This Skill
- MANDATORY before UPDATE/DELETE/DROP operations - Create backups first
- User wants to backup table data
- User needs to migrate data between databases
- User wants portable SQL dump for version control
- User needs to copy table data to another environment
- Creating disaster recovery backups
## ⚠️ Safety-Critical Use Cases

This skill is REQUIRED before:
- UPDATE operations - Backup data before modifying
- DELETE operations - Backup data before removing
- DROP TABLE - Backup before destroying table
- ALTER TABLE - Backup before structure changes
- Bulk modifications - Backup before mass updates
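The backup-first requirement above can be folded into one reusable guard. A minimal sketch, assuming `dbcli` is on `PATH` and that `export`/`exec` behave as documented in this skill; `backup_then_exec` is a hypothetical helper name, not a dbcli command:

```shell
# Refuse to run a destructive statement unless a fresh, non-empty backup
# of the named table was just written.
backup_then_exec() {
  table="$1"
  sql="$2"
  stamp=$(date +%Y%m%d_%H%M%S)
  backup="${table}_backup_${stamp}.sql"
  dbcli export "$table" > "$backup" || { echo "Backup failed, aborting" >&2; return 1; }
  [ -s "$backup" ] || { echo "Backup is empty, aborting" >&2; return 1; }
  echo "Backup created: $backup"
  dbcli exec "$sql"
}
```

Usage: `backup_then_exec Users "UPDATE Users SET verified = 1"` - the UPDATE only runs once the export has produced a non-empty file.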
## Command Syntax

```bash
export DBCLI_CONNECTION="CONNECTION_STRING"
export DBCLI_DBTYPE="DATABASE_TYPE"   # optional, default: sqlite
dbcli export TABLE_NAME > output.sql
```
## Global Options

Both options are passed as environment variables:

- `DBCLI_CONNECTION`: database connection string (required)
- `DBCLI_DBTYPE`: database type (default: `sqlite`)
## Basic Export

### Single Table Export

```bash
# SQLite - export the Users table
export DBCLI_CONNECTION="Data Source=app.db"
dbcli export Users > Users_backup.sql

# Output file contains:
# INSERT INTO Users (Id, Name, Email) VALUES (1, 'Alice', 'alice@example.com');
# INSERT INTO Users (Id, Name, Email) VALUES (2, 'Bob', 'bob@example.com');
```
### With Timestamp

```bash
# Create a timestamped backup
export DBCLI_CONNECTION="Data Source=app.db"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
dbcli export Users > "Users_backup_${TIMESTAMP}.sql"
echo "Backup created: Users_backup_${TIMESTAMP}.sql"
```
### Different Databases

```bash
# SQL Server
export DBCLI_CONNECTION="Server=localhost;Database=mydb;Trusted_Connection=True"
export DBCLI_DBTYPE="sqlserver"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
dbcli export Users > "Users_backup_${TIMESTAMP}.sql"

# MySQL
export DBCLI_CONNECTION="Server=localhost;Database=mydb;Uid=root;Pwd=xxxxxxxxxx"
export DBCLI_DBTYPE="mysql"
dbcli export Customers > "Customers_backup_${TIMESTAMP}.sql"

# PostgreSQL
export DBCLI_CONNECTION="Host=localhost;Database=mydb;Username=postgres;Password=xxxxxxxxxx"
export DBCLI_DBTYPE="postgresql"
dbcli export Orders > "Orders_backup_${TIMESTAMP}.sql"
```
## Mandatory Backup Before Dangerous Operations

### Before UPDATE - Export Backup Workflow

```bash
#!/bin/bash
# Safe UPDATE workflow with mandatory backup
TABLE="Users"
CONNECTION="Data Source=app.db"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="${TABLE}_backup_${TIMESTAMP}.sql"

# STEP 1: MANDATORY BACKUP
echo "Creating mandatory backup before UPDATE..."
export DBCLI_CONNECTION="$CONNECTION"
dbcli export "$TABLE" > "$BACKUP_FILE"

# Verify the export succeeded and produced data
# (checking only that the file exists is not enough: the redirect creates it even on failure)
if [ $? -ne 0 ] || [ ! -s "$BACKUP_FILE" ]; then
    echo "ERROR: Backup failed! Aborting UPDATE."
    exit 1
fi
BACKUP_SIZE=$(wc -l < "$BACKUP_FILE")
echo "Backup created: $BACKUP_FILE ($BACKUP_SIZE lines)"

# STEP 2: Confirm with user
read -p "Backup complete. Proceed with UPDATE? (yes/no): " confirm
if [ "$confirm" != "yes" ]; then
    echo "UPDATE cancelled by user"
    exit 0
fi

# STEP 3: Execute UPDATE
echo "Executing UPDATE..."
dbcli exec "UPDATE Users SET status = 'verified' WHERE email_confirmed = 1"
echo "UPDATE complete. Backup saved: $BACKUP_FILE"
```
### Before DELETE - Export Affected Records

```bash
#!/bin/bash
# Safe DELETE workflow with selective backup
TABLE="Users"
CONNECTION="Data Source=app.db"
WHERE_CLAUSE="last_login < date('now', '-365 days')"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
export DBCLI_CONNECTION="$CONNECTION"

# STEP 1: Preview what will be deleted
echo "Records to be deleted:"
dbcli -f table query "SELECT * FROM $TABLE WHERE $WHERE_CLAUSE"

# Count affected records
COUNT=$(dbcli query "SELECT COUNT(*) as count FROM $TABLE WHERE $WHERE_CLAUSE" | jq -r '.[0].count')
echo "Total records to delete: $COUNT"

# STEP 2: MANDATORY BACKUP of affected records
if [ "$COUNT" -gt 0 ]; then
    BACKUP_FILE="${TABLE}_deleted_${TIMESTAMP}.sql"
    echo "Creating backup of records to be deleted..."
    # Export the full table (safest approach)
    dbcli export "$TABLE" > "$BACKUP_FILE"
    echo "Backup created: $BACKUP_FILE"
fi

# STEP 3: Confirm deletion
read -p "Delete $COUNT records? (yes/no): " confirm
if [ "$confirm" != "yes" ]; then
    echo "DELETE cancelled"
    exit 0
fi

# STEP 4: Execute DELETE
dbcli exec "DELETE FROM $TABLE WHERE $WHERE_CLAUSE"
echo "Deleted $COUNT records. Backup: $BACKUP_FILE"
```
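If the DELETE turns out to have removed the wrong rows, the exported INSERT statements can simply be replayed. A small guard sketch; `restore_from_backup` is a hypothetical helper name, and it relies on `dbcli exec -F` accepting a SQL file as shown elsewhere in this skill:

```shell
# Replay a backup file, refusing to run if it is missing or empty.
restore_from_backup() {
  backup="$1"
  if [ ! -s "$backup" ]; then
    echo "Backup file missing or empty: $backup" >&2
    return 1
  fi
  dbcli exec -F "$backup"
}
```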
### Before DROP TABLE - Full Export

```bash
#!/bin/bash
# Safe DROP TABLE workflow with complete backup
TABLE="OldTable"
CONNECTION="Data Source=app.db"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="backups"
mkdir -p "$BACKUP_DIR"
export DBCLI_CONNECTION="$CONNECTION"

# STEP 1: Export table schema
echo "Exporting table schema..."
dbcli -f table columns "$TABLE" > "${BACKUP_DIR}/${TABLE}_schema_${TIMESTAMP}.txt"

# STEP 2: MANDATORY data export
echo "Exporting table data..."
dbcli export "$TABLE" > "${BACKUP_DIR}/${TABLE}_data_${TIMESTAMP}.sql"

# STEP 3: Create table copy (fastest recovery option)
echo "Creating table copy..."
dbcli ddl "CREATE TABLE ${TABLE}_copy_${TIMESTAMP} AS SELECT * FROM $TABLE"

# Verify backups
DATA_LINES=$(wc -l < "${BACKUP_DIR}/${TABLE}_data_${TIMESTAMP}.sql")
COPY_COUNT=$(dbcli query "SELECT COUNT(*) as count FROM ${TABLE}_copy_${TIMESTAMP}" | jq -r '.[0].count')
echo "Backups created:"
echo "  - Schema: ${BACKUP_DIR}/${TABLE}_schema_${TIMESTAMP}.txt"
echo "  - Data: ${BACKUP_DIR}/${TABLE}_data_${TIMESTAMP}.sql ($DATA_LINES lines)"
echo "  - Table copy: ${TABLE}_copy_${TIMESTAMP} ($COPY_COUNT rows)"

# STEP 4: Confirm DROP
read -p "All backups created. DROP TABLE $TABLE? (type 'DROP' to confirm): " confirm
if [ "$confirm" != "DROP" ]; then
    echo "DROP TABLE cancelled"
    exit 0
fi

# STEP 5: Execute DROP
echo "Dropping table..."
dbcli ddl "DROP TABLE $TABLE"
echo "Table dropped. Recovery files available in $BACKUP_DIR/"
```
## Export All Tables (Database Backup)

```bash
#!/bin/bash
# Export all tables in the database
CONNECTION="Data Source=app.db"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="backup_${TIMESTAMP}"
mkdir -p "$BACKUP_DIR"
export DBCLI_CONNECTION="$CONNECTION"
echo "Exporting all tables..."

# Get the list of tables
dbcli tables | jq -r '.[].TableName' | while read -r table; do
    echo "  Exporting $table..."
    dbcli export "$table" > "${BACKUP_DIR}/${table}.sql"
done

# Create archive
tar -czf "backup_${TIMESTAMP}.tar.gz" "$BACKUP_DIR"
echo "Backup complete: backup_${TIMESTAMP}.tar.gz"
```
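The reverse direction can be sketched the same way, assuming the archive has been extracted back into a directory of per-table `.sql` files; `restore_all` is a hypothetical helper name, not a dbcli command:

```shell
# Replay every per-table export in a backup directory.
restore_all() {
  dir="$1"
  found=0
  for f in "$dir"/*.sql; do
    [ -e "$f" ] || continue
    found=1
    echo "Restoring $(basename "$f" .sql)..."
    dbcli exec -F "$f" || return 1
  done
  [ "$found" -eq 1 ] || { echo "No .sql files in $dir" >&2; return 1; }
}
```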
## Restore from Export

### Restore Single Table

```bash
# Drop and recreate the table, then import the backup
export DBCLI_CONNECTION="Data Source=app.db"
dbcli ddl "DROP TABLE IF EXISTS Users"
dbcli ddl -F Users_schema.sql   # Create table structure
dbcli exec -F Users_backup_20250127_143022.sql
echo "Table restored from backup"
```
### Restore Specific Records

```bash
export DBCLI_CONNECTION="Data Source=app.db"

# Restore only specific records: exported INSERT statements have no WHERE
# clause, so filter on the Id values in the VALUES list instead
grep -E "VALUES \((1|2|3)," Users_backup_20250127_143022.sql | \
    dbcli exec -F -

# Or manually edit the SQL file to keep only the records to restore
```
## Cross-Database Migration

```bash
# Export from MySQL
export DBCLI_CONNECTION="Server=source;Database=mydb;Uid=root;Pwd=xxxxxxxxxx"
export DBCLI_DBTYPE="mysql"
dbcli export Users > Users_export.sql

# Import to PostgreSQL (may need SQL syntax adjustments)
export DBCLI_CONNECTION="Host=target;Database=mydb;Username=postgres;Password=xxxxxxxxxx"
export DBCLI_DBTYPE="postgresql"
dbcli exec -F Users_export.sql
```
## DaMeng

```bash
export DBCLI_CONNECTION="Server=localhost;User Id=SYSDBA;PWD=xxxxxxxxxx;DATABASE=mydb"
export DBCLI_DBTYPE="dm"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

# Export table
dbcli export dm_test > "dm_test_backup_${TIMESTAMP}.sql"

# Before UPDATE
echo "Creating backup before UPDATE..."
dbcli export dm_test > "dm_test_backup_${TIMESTAMP}.sql"
read -p "Backup complete. Continue with UPDATE? (yes/no): " confirm
if [ "$confirm" = "yes" ]; then
    dbcli exec "UPDATE dm_test SET status = 1"
fi
```
## GaussDB

```bash
export DBCLI_CONNECTION="Host=localhost;Port=8000;Database=mydb;Username=gaussdb;Password=xxxxxxxxxx"
export DBCLI_DBTYPE="gaussdb"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

# Export with timestamp
dbcli export gauss_test > "gauss_test_backup_${TIMESTAMP}.sql"

# Safe DELETE workflow
echo "Creating backup before DELETE..."
dbcli export gauss_test > "gauss_test_backup_${TIMESTAMP}.sql"
echo "Backup: gauss_test_backup_${TIMESTAMP}.sql"
read -p "Proceed with DELETE? (yes/no): " confirm
[ "$confirm" = "yes" ] && dbcli exec "DELETE FROM gauss_test WHERE inactive = 1"
```
## Programmatic Export with Safety

### Python - Safe Modification Function

```python
import subprocess
import json
import os
from datetime import datetime

def safe_modify_table(connection, table, modify_sql, db_type='sqlite'):
    """Execute a modification with an automatic backup first."""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_dir = 'backups'
    os.makedirs(backup_dir, exist_ok=True)
    backup_file = f"{backup_dir}/{table}_backup_{timestamp}.sql"

    # STEP 1: MANDATORY BACKUP
    print(f"Creating backup: {backup_file}")
    # Set environment variables
    os.environ['DBCLI_CONNECTION'] = connection
    os.environ['DBCLI_DBTYPE'] = db_type
    export_cmd = ['dbcli', 'export', table]
    with open(backup_file, 'w', encoding='utf-8') as f:
        result = subprocess.run(export_cmd, stdout=f, text=True)
    if result.returncode != 0:
        raise Exception("Backup failed! Aborting modification.")

    # Verify backup file created
    if not os.path.exists(backup_file):
        raise Exception("Backup file not created!")
    backup_size = os.path.getsize(backup_file)
    print(f"Backup created: {backup_size} bytes")

    # STEP 2: Prompt user
    confirm = input("Backup complete. Execute modification? (yes/no): ")
    if confirm.lower() != 'yes':
        print("Modification cancelled by user")
        return None

    # STEP 3: Execute modification
    print("Executing modification...")
    exec_cmd = ['dbcli', 'exec', modify_sql]
    result = subprocess.run(exec_cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"Modification failed: {result.stderr}")
        print(f"Backup available: {backup_file}")
        return None
    data = json.loads(result.stdout)
    print(f"Modified {data['AffectedRows']} rows")
    print(f"Backup saved: {backup_file}")
    return backup_file

# Usage
backup = safe_modify_table(
    connection='Data Source=app.db',
    table='Users',
    modify_sql="UPDATE Users SET verified = 1 WHERE email_confirmed = 1"
)
```
### PowerShell - Backup Before Delete

```powershell
function Remove-TableDataSafely {
    param(
        [string]$Connection,
        [string]$Table,
        [string]$WhereClause,
        [string]$DbType = 'sqlite'
    )

    $timestamp = Get-Date -Format "yyyyMMdd_HHmmss"
    $backupFile = "${Table}_backup_${timestamp}.sql"

    # Set environment variables
    $env:DBCLI_CONNECTION = $Connection
    $env:DBCLI_DBTYPE = $DbType

    # Preview deletion
    $previewSql = "SELECT * FROM $Table WHERE $WhereClause"
    Write-Host "Records to be deleted:"
    dbcli -f table query $previewSql

    $countSql = "SELECT COUNT(*) as count FROM $Table WHERE $WhereClause"
    $count = (dbcli query $countSql | ConvertFrom-Json)[0].count
    Write-Host "Total records to delete: $count"

    # MANDATORY BACKUP
    Write-Host "Creating backup..."
    dbcli export $Table > $backupFile
    if (-not (Test-Path $backupFile)) {
        Write-Error "Backup failed! Aborting deletion."
        return
    }
    Write-Host "Backup created: $backupFile"

    # Confirm deletion
    $confirm = Read-Host "Delete $count records? (yes/no)"
    if ($confirm -ne 'yes') {
        Write-Host "Deletion cancelled"
        return
    }

    # Execute DELETE
    $deleteSql = "DELETE FROM $Table WHERE $WhereClause"
    $result = dbcli exec $deleteSql | ConvertFrom-Json
    Write-Host "Deleted $($result.AffectedRows) rows"
    Write-Host "Backup: $backupFile"
}

# Usage
Remove-TableDataSafely -Connection "Data Source=app.db" `
    -Table "Users" `
    -WhereClause "active = 0"
```
## Backup Verification

```bash
# Verify backup completeness
TABLE="Users"
BACKUP="Users_backup_20250127_143022.sql"
export DBCLI_CONNECTION="Data Source=app.db"

# Count records in the original table
ORIGINAL_COUNT=$(dbcli query "SELECT COUNT(*) as count FROM $TABLE" | jq -r '.[0].count')

# Count INSERT statements in the backup
BACKUP_COUNT=$(grep -c "^INSERT INTO" "$BACKUP")

echo "Original table: $ORIGINAL_COUNT records"
echo "Backup file: $BACKUP_COUNT INSERT statements"
if [ "$ORIGINAL_COUNT" -eq "$BACKUP_COUNT" ]; then
    echo "Backup verified - counts match"
else
    echo "WARNING: Backup incomplete! Counts don't match!"
fi
```
## Automated Backup Schedule

```bash
#!/bin/bash
# daily_backup.sh - schedule with cron
CONNECTION="Data Source=production.db"
BACKUP_DIR="/backups/database"
RETENTION_DAYS=30
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
mkdir -p "$BACKUP_DIR"
export DBCLI_CONNECTION="$CONNECTION"

# Export all tables
dbcli tables | jq -r '.[].TableName' | while read -r table; do
    dbcli export "$table" > "${BACKUP_DIR}/${table}_${TIMESTAMP}.sql"
done

# Compress backups
tar -czf "${BACKUP_DIR}/full_backup_${TIMESTAMP}.tar.gz" ${BACKUP_DIR}/*_${TIMESTAMP}.sql
rm ${BACKUP_DIR}/*_${TIMESTAMP}.sql

# Delete old backups
find "$BACKUP_DIR" -name "full_backup_*.tar.gz" -mtime +$RETENTION_DAYS -delete
echo "Backup complete: full_backup_${TIMESTAMP}.tar.gz"
```
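Before trusting the nightly archive, a quick integrity check helps. A minimal sketch using standard `tar`; `archive_ok` is a hypothetical helper name, and it only checks that the tarball lists cleanly and contains at least one `.sql` entry:

```shell
# Return success only if the tarball is readable and holds .sql entries
archive_ok() {
  tar -tzf "$1" 2>/dev/null | grep -q '\.sql$'
}
```

A full restore test into a scratch database remains the only real proof that a backup is usable.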
## Best Practices
- ALWAYS export before dangerous operations - UPDATE, DELETE, DROP
- Use timestamps in backup filenames for version control
- Verify backups immediately after creation
- Compress large backups to save disk space
- Store backups off-server for disaster recovery
- Test restore procedures regularly
- Document backup locations for team members
- Automate regular backups with cron/scheduled tasks
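The last point can be wired up with cron. A sketch of a crontab entry, assuming the `daily_backup.sh` script from the previous section is saved somewhere stable; `/opt/scripts` and the log path are example locations, not requirements:

```cron
# m h dom mon dow  command - run the backup nightly at 02:30
30 2 * * * /opt/scripts/daily_backup.sh >> /var/log/db_backup.log 2>&1
```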
## Common Patterns

### Pre-Modification Checklist

```bash
#!/bin/bash
# pre_modify_checklist.sh
TABLE="$1"
CONNECTION="Data Source=app.db"
export DBCLI_CONNECTION="$CONNECTION"

echo "=== Pre-Modification Safety Checklist ==="
echo

# 1. Export current data
echo "[1/4] Creating backup..."
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
dbcli export "$TABLE" > "${TABLE}_backup_${TIMESTAMP}.sql"
echo "      Backup: ${TABLE}_backup_${TIMESTAMP}.sql"

# 2. Count records (alias the count so jq can pick it out)
COUNT=$(dbcli query "SELECT COUNT(*) as count FROM $TABLE" | jq -r '.[0].count')
echo "[2/4] Record count: $COUNT"

# 3. Check table structure
echo "[3/4] Table structure:"
dbcli -f table columns "$TABLE"

# 4. Create table copy
COPY_TABLE="${TABLE}_copy_${TIMESTAMP}"
dbcli ddl "CREATE TABLE $COPY_TABLE AS SELECT * FROM $TABLE"
echo "[4/4] Table copy created: $COPY_TABLE"

echo
echo "=== Checklist Complete ==="
echo "Safe to proceed with modifications"
```
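If a modification goes wrong after the checklist, the table copy gives the fastest rollback. A hedged sketch; `rollback_from_copy` is a hypothetical helper, and it assumes the copy was created as above and that `dbcli exec` accepts these statements:

```shell
# Roll back by replacing the current rows with the pre-modification copy
rollback_from_copy() {
  table="$1"
  copy="$2"
  dbcli exec "DELETE FROM $table" || return 1
  dbcli exec "INSERT INTO $table SELECT * FROM $copy"
}
```

Usage: `rollback_from_copy Users Users_copy_20250127_143022` (keep the copy until the modification is verified).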