CLI Advanced Usage
Patterns for scripting, CI/CD pipelines, and bulk operations with the CLI.
Scripting best practices
Start every automation script with strict error handling:
#!/bin/bash
set -euo pipefail
LOG_FILE="cakemail-automation.log"
log() { echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*" | tee -a "$LOG_FILE"; }
Retry with backoff
retry() {
local max=3 attempt=1 delay=2
while [ $attempt -le $max ]; do
"$@" && return 0
[ $attempt -lt $max ] && { log "Retry $attempt..."; sleep $delay; delay=$((delay * 2)); }
((attempt++))
done
return 1
}
retry cakemail campaigns schedule 123
Batch processing with rate limiting
# Fetch contacts one at a time with a short delay to stay under rate limits
cakemail contacts list --list-id 123 --format json --batch \
| jq -r '.data[].id' \
| while read -r id; do
cakemail contacts get --list-id 123 "$id" --format json >> results.json
sleep 0.1
done
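For larger lists, a bounded-parallel variant can cut wall-clock time while still capping concurrent requests. This is a sketch using xargs -P with a stand-in function in place of the real cakemail contacts get call:

```shell
#!/bin/bash
set -euo pipefail

# Stand-in for the real API call; swap the echo for:
#   cakemail contacts get --list-id 123 "$1" --format json
fetch_one() {
  echo "fetched $1"
}
export -f fetch_one

# At most 4 concurrent fetches, one contact id per invocation.
printf '%s\n' 101 102 103 104 105 \
  | xargs -P 4 -n 1 bash -c 'fetch_one "$1"' _
```

Output order is not guaranteed under -P, so collect results into per-id files if ordering matters.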
Capture and check results
RESPONSE=$(cakemail campaigns get 123 --format json 2>&1)
if echo "$RESPONSE" | jq -e . > /dev/null 2>&1; then
STATUS=$(echo "$RESPONSE" | jq -r '.data.status')
echo "Campaign status: $STATUS"
else
echo "Error: $RESPONSE"
exit 1
fi
Contact import and export
CSV requirements
- Header row required, with an email column (mandatory)
- UTF-8 encoding
- Dates in ISO format: YYYY-MM-DD
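A minimal file meeting these requirements, with a quick header check before importing (column names other than email are illustrative):

```shell
# Write a sample CSV and sanity-check it before handing it to the CLI.
cat > contacts.csv <<'EOF'
email,first_name,signup_date
alice@example.com,Alice,2024-03-01
bob@example.com,Bob,2024-03-15
EOF

# The import requires an email column, so fail fast if it is missing.
head -1 contacts.csv | grep -q 'email' && echo "header OK"
```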
# Import contacts
cakemail contacts import --list-id 123 --file contacts.csv
# Check import progress
cakemail contacts import-status imp_abc123
# Export contacts
cakemail contacts export --list-id 123
cakemail contacts export-download exp_xyz789 > backup.csv
# Export only subscribed contacts
cakemail contacts export --list-id 123 --filter "status==subscribed"
Data migration between lists
# Export from source list
cakemail contacts export --list-id 100
# Wait for export, then download
cakemail contacts export-download exp_xyz789 > migrated.csv
# Import to target list
cakemail contacts import --list-id 200 --file migrated.csv
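The steps above can be wrapped in one function. Two assumptions to verify against your CLI version: that the export command's --format json output exposes the job id at .data.id (mirroring the campaign-create example later in this page), and that a fixed sleep is long enough for the export to finish — a production script should poll instead:

```shell
# Sketch of a source-to-target migration; see assumptions in the text above.
migrate_list() {
  local src=$1 dst=$2 job
  job=$(cakemail contacts export --list-id "$src" --format json | jq -r '.data.id')
  sleep 30   # crude placeholder for polling the export until it is ready
  cakemail contacts export-download "$job" > migrated.csv
  cakemail contacts import --list-id "$dst" --file migrated.csv
}

# Usage: migrate_list 100 200
```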
Deduplication before import
# Get existing emails
cakemail contacts list --list-id 123 --format json --batch \
| jq -r '.data[].email' | sort > existing.txt
# Filter new contacts only
head -1 new_contacts.csv > unique.csv
tail -n +2 new_contacts.csv | while IFS=, read -r email rest; do
grep -qixF "$email" existing.txt || echo "$email,$rest" >> unique.csv
done
cakemail contacts import --list-id 123 --file unique.csv
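The same dedup can run as a single awk pass, which avoids spawning one grep per row and compares emails case-insensitively. A sketch assuming the email is the first CSV field; the sample inputs are only there so the snippet runs standalone:

```shell
# Sample inputs standing in for existing.txt and new_contacts.csv above.
printf '%s\n' alice@example.com > existing.txt
cat > new_contacts.csv <<'EOF'
email,first_name
alice@example.com,Alice
carol@example.com,Carol
EOF

# Pass 1 (NR==FNR) loads existing emails into a set; pass 2 keeps the
# header plus any row whose first field has not been seen.
awk -F, 'NR==FNR { seen[tolower($0)] = 1; next }
         FNR==1 || !(tolower($1) in seen)' existing.txt new_contacts.csv > unique.csv
```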
CI/CD integration
GitHub Actions
name: Send weekly newsletter
on:
schedule:
- cron: '0 8 * * 1' # Monday 8 AM UTC
jobs:
newsletter:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install CLI
run: npm install -g @cakemail-org/cakemail-cli
- name: Create and schedule campaign
env:
CAKEMAIL_ACCESS_TOKEN: ${{ secrets.CAKEMAIL_TOKEN }}
run: |
CAMPAIGN_ID=$(cakemail campaigns create \
--name "Newsletter $(date +%Y-%m-%d)" \
--format json | jq -r '.data.id')
cakemail campaigns schedule "$CAMPAIGN_ID"
Docker
FROM node:20-alpine
RUN npm install -g @cakemail-org/cakemail-cli
ENTRYPOINT ["cakemail"]
docker build -t cakemail-cli .
docker run -e CAKEMAIL_ACCESS_TOKEN=xxx cakemail-cli campaigns list
Multi-environment setup
Use different credentials per environment:
# In GitHub Actions
env:
CAKEMAIL_ACCESS_TOKEN: ${{ github.ref == 'refs/heads/main'
&& secrets.CAKEMAIL_PROD_TOKEN
|| secrets.CAKEMAIL_STAGING_TOKEN }}
Campaign lifecycle automation
State diagram
draft → scheduled → sending → sent → archived
  ↓         ↓          ↓
delete  unschedule  suspend
Emergency stop
STATUS=$(cakemail campaigns get 123 --format json | jq -r '.data.status')
case $STATUS in
sending)
cakemail campaigns suspend 123
echo "Campaign suspended"
;;
scheduled)
cakemail campaigns unschedule 123
echo "Campaign unscheduled (back to draft)"
;;
draft)
echo "Campaign is still in draft"
;;
esac
Health check: detect stale campaigns
# Find drafts older than 30 days
cakemail campaigns list --filter "status==draft" --format json --batch \
| jq -r '.data[] | select(
(.created_on | fromdateiso8601) < (now - 2592000)
) | "\(.id) \(.name) \(.created_on)"'
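The drafts it finds can be fed into a guarded cleanup. Two caveats: a campaigns delete subcommand is inferred here from the state diagram above, so confirm the exact name first, and the DRY_RUN default makes the script print rather than delete. The heredoc stands in for the jq output so the snippet runs standalone:

```shell
DRY_RUN=${DRY_RUN:-1}

cleanup_stale() {
  while read -r id name created; do
    if [ "$DRY_RUN" = 1 ]; then
      echo "would delete campaign $id ($name, created $created)"
    else
      cakemail campaigns delete "$id"
    fi
  done
}

# In real use, pipe the health-check query's "\(.id) \(.name) \(.created_on)"
# lines into cleanup_stale; sample input shown here instead.
out=$(cleanup_stale <<'EOF'
456 old-draft 2024-01-05T10:00:00Z
EOF
)
echo "$out"
```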