Backing up WordPress to an S3 bucket with cron

After writing several scripts for backing up WordPress to various places, primarily S3 buckets, I figured I'd better put the script here so it is easier for me to find next time – and maybe it will be useful for you too 🙂

“Before you act for the greater good, you should know what is great and what is good.”
Jeffrey Fry

So we start with some general niceties

I use cron to run this script. It copies all the files under /var/www/html, dumps the database, zips everything up and pushes it to S3.
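For reference, a minimal sketch of the crontab entry (crontab -e) that could drive it – the script path, schedule and log file are my assumptions, adjust to your own setup:

# Run the backup script nightly at 03:30 and keep a log
# Path and schedule are assumptions – adjust to taste
30 3 * * * /home/ec2-user/backup_wordpress.sh >> /home/ec2-user/backup.log 2>&1

With that in place, on to the script itself: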

# Set the date format, filename and the directories where your backup files will be placed and which directory will be archived
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="my_website_backup.$NOW.tar"
BACKUP_DIR="/home/ec2-user/backups"
WWW_DIR="/var/www/html/"

[If you wish to make it faster and smarter – filter the files out of the html folder; some plugins store their entire cache in there, urrrgghhh. A sketch follows.]
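For instance, a hedged sketch using tar's --exclude flag – the cache paths are pure guesses on my part, so inspect your wp-content to see what your plugins actually hoard:

# Hypothetical variant of the tar command used further down:
# skip plugin cache folders (the excluded paths are assumptions)
tar -cvf "$BACKUP_DIR/$FILE" \
    --exclude='wp-content/cache' \
    --exclude='wp-content/uploads/cache' \
    "$WWW_DIR"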

Now for some DB credentials – again, these are mine; put in your own ingenious passwords etc.

# MySQL database credentials
DB_USER="admin"
DB_PASS="adminsecret"
DB_NAME="my_website_db"
DB_FILE="my_website_db_dump.$NOW.sql"
DB_HOST="xxyyzz.us-west-1.rds.amazonaws.com"
# or for local db
# DB_HOST="localhost"

Now the S3 settings

#S3 BUCKET
S3_BUCKET="my_website_backup"
BUCKET_DIR="backups"

And some convenience settings – stripping the full paths from the backup files and replacing them with convenient folder names

# Tar transforms for better archive structure
WWW_TRANSFORM='s,^var/www/html,www,'
DB_TRANSFORM='s,^home/ec2-user/backups,database,'

And now for the real stuff, let’s start copying

# Create the archive and the MySQL dump
tar -cvf "$BACKUP_DIR/$FILE" --transform "$WWW_TRANSFORM" "$WWW_DIR"
mysqldump -h "$DB_HOST" -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_DIR/$DB_FILE"

Now we have two files; let's make just one by appending the DB dump into the tar. Why? Because a single file is easier to manage.

# Append the dump to the archive, remove the dump and compress the whole archive
tar --append --file="$BACKUP_DIR/$FILE" --transform "$DB_TRANSFORM" "$BACKUP_DIR/$DB_FILE"
rm "$BACKUP_DIR/$DB_FILE"
gzip -9 "$BACKUP_DIR/$FILE"
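If you want to sanity-check the archive before shipping it off, listing it is cheap – an optional sketch:

# Optional sanity check: the listing should show paths under www/ and database/
tar -tzf "$BACKUP_DIR/$FILE.gz" | head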

Now we just push to an S3 bucket.

A nice feature of S3 buckets, by the way, is that you can set an expiration on files so they get deleted automatically after a certain amount of time. Different paths in the bucket can even have different expiration rules. Ponder that for a sec…
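Setting that up is a one-off job you can do from the console, or from the CLI with something like the sketch below – the rule ID and the 30-day window are my assumptions, pick your own retention:

# Hypothetical one-off setup: expire objects under backups/ after 30 days
# The rule ID and retention period are assumptions – pick what suits you
aws s3api put-bucket-lifecycle-configuration \
    --bucket my_website_backup \
    --lifecycle-configuration '{
        "Rules": [{
            "ID": "expire-old-backups",
            "Filter": {"Prefix": "backups/"},
            "Status": "Enabled",
            "Expiration": {"Days": 30}
        }]
    }'

Anyway, back to the script: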

# Copy file to S3 bucket and remove locally
aws s3 cp "$BACKUP_DIR/$FILE.gz" "s3://$S3_BUCKET/$BUCKET_DIR/$FILE.gz"
rm "$BACKUP_DIR/$FILE.gz"
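And for completeness, the reverse direction – a restore sketch; the date-stamped filename is just an example of what $NOW produces:

# Hypothetical restore: pull a backup down and unpack it
# Thanks to the transforms it extracts into www/ and database/ folders
aws s3 cp "s3://$S3_BUCKET/$BUCKET_DIR/my_website_backup.2018-04-01-0330.tar.gz" .
tar -xzf my_website_backup.2018-04-01-0330.tar.gz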

Now for the whole thing. One note first: since I run it all from cron, and cron needs the same PATH and access to the aws cli and config as my login shell, I inserted a few lines at the beginning of the script – though next time I might set these in the crontab itself instead. Maybe.
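Something like this, with the environment variables at the top of the crontab so they apply to every job below them – a sketch, the script path being an assumption:

# Hypothetical crontab (crontab -e) carrying its own environment
PATH=/usr/local/bin:/bin:/usr/bin:/opt/aws/bin:/home/ec2-user/bin
AWS_CONFIG_FILE=/home/ec2-user/.aws/config
30 3 * * * /home/ec2-user/backup_wordpress.sh >> /home/ec2-user/backup.log 2>&1

Anyway, here is the whole script: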

#!/bin/bash
export PATH=/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/aws/bin:/home/ec2-user/.local/bin:/home/ec2-user/bin
export AWS_CLOUDWATCH_HOME=/opt/aws/apitools/mon
export AWS_PATH=/opt/aws
export AWS_AUTO_SCALING_HOME=/opt/aws/apitools/as
export AWS_ELB_HOME=/opt/aws/apitools/elb
export AWS_CONFIG_FILE="/home/ec2-user/.aws/config"

# This script creates a compressed backup archive of WordPress and MySQL
# Feel free to use this script wherever you want, however you want
# Author: Ellie Portugali <ellieportugali@gmail.com>  April 2018

# Set the date format, filename and the directories where your backup files will be placed and which directory will be archived
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="my_website_backup.$NOW.tar"
BACKUP_DIR="/home/ec2-user/backups"
WWW_DIR="/var/www/html/"

# MySQL database credentials
DB_USER="admin"
DB_PASS="adminsecret"
DB_NAME="my_website_db"
DB_FILE="my_website_db_dump.$NOW.sql"
DB_HOST="xxyyzz.us-west-1.rds.amazonaws.com"
# or for local db
# DB_HOST="localhost"

#S3 BUCKET
S3_BUCKET="my_website_backup"
BUCKET_DIR="backups"

# Tar transforms for better archive structure
WWW_TRANSFORM='s,^var/www/html,www,'
DB_TRANSFORM='s,^home/ec2-user/backups,database,'

# Create the archive and the MySQL dump
tar -cvf "$BACKUP_DIR/$FILE" --transform "$WWW_TRANSFORM" "$WWW_DIR"
mysqldump -h "$DB_HOST" -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_DIR/$DB_FILE"

# Append the dump to the archive, remove the dump and compress the whole archive
tar --append --file="$BACKUP_DIR/$FILE" --transform "$DB_TRANSFORM" "$BACKUP_DIR/$DB_FILE"
rm "$BACKUP_DIR/$DB_FILE"
gzip -9 "$BACKUP_DIR/$FILE"

# Copy file to S3 bucket and remove locally
aws s3 cp "$BACKUP_DIR/$FILE.gz" "s3://$S3_BUCKET/$BUCKET_DIR/$FILE.gz"
rm "$BACKUP_DIR/$FILE.gz"

Did this help?

Let me know.
