
Requirements

The user's ~/.my.cnf file is set up so that the MySQL CLI tools can connect to the database server without further arguments.
If this is not the case, then all commands below need to be adjusted to provide the required connection information (host, user, password).
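For reference, a minimal ~/.my.cnf could look like the following; the user name and password are placeholders for your own credentials:

```
[client]
user     = backup
password = your-password-here
```

Since the file contains a plain-text password, it should be readable only by its owner (chmod 600 ~/.my.cnf).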

Identify the databases to back up

echo "show databases;" | mysql

Result with Squirro 3.5.5

information_schema
configuration
datasource
emailsender
filtering
fingerprint
machinelearning
mysql
performance_schema
plumber
scheduler
topic
trends
user

Out of these databases, information_schema, performance_schema and mysql are system databases.

You may or may not be able to back those up. Especially with managed services, e.g. AWS RDS, you might not get access to them.

If you can't back them up, you should create a SQL script that recreates the required users and permissions as part of the recovery procedure.

Backup

Single File, all databases

Create a time-stamped, compressed backup of all databases into a single file.

TIMESTAMP=$(date +"%Y_%m_%d_%H_%M_%S")
mysqldump --single-transaction --all-databases | gzip > "squirro_mysql_${TIMESTAMP}.sql.gz"
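One caveat with this pipeline: if mysqldump fails halfway through, gzip still exits successfully and you are left with a truncated archive. A defensive sketch (the function name and messages are our own, the file naming follows the convention above):

```shell
set -o pipefail   # propagate mysqldump's exit status through the pipe

# Sketch: dump everything, then verify the archive before declaring success.
backup_all() {
  local timestamp file
  timestamp=$(date +"%Y_%m_%d_%H_%M_%S")
  file="squirro_mysql_${timestamp}.sql.gz"

  # gzip -t catches truncated or corrupt archives
  mysqldump --single-transaction --all-databases | gzip > "$file" \
    && gzip -t "$file" \
    && echo "backup OK: $file" \
    || { echo "backup FAILED: $file" >&2; return 1; }
}
```

Without `pipefail`, the pipeline's exit status is that of gzip alone, so a failed dump would go unnoticed.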

Single File, all Squirro databases

Create a time-stamped, compressed backup of all Squirro databases into a single file.
See the section above to make sure you include all databases, as the list can differ across releases.

TIMESTAMP=$(date +"%Y_%m_%d_%H_%M_%S")
mysqldump --single-transaction --databases configuration datasource emailsender filtering fingerprint machinelearning plumber scheduler topic trends user | gzip > "squirro_mysql_${TIMESTAMP}.sql.gz"

Individual Files, all databases

TIMESTAMP=$(date +"%Y_%m_%d_%H_%M_%S")
DATABASES=$(echo 'SHOW DATABASES;' | mysql -N)

for DATABASE in $DATABASES
do
  echo "Backup ${DATABASE}"
  mysqldump --single-transaction --databases "${DATABASE}" | gzip > "squirro_mysql_${DATABASE}_${TIMESTAMP}.sql.gz"
done

The result is then a list of files like so:

squirro_mysql_configuration_2022_06_24_16_09_13.sql.gz
squirro_mysql_datasource_2022_06_24_16_09_13.sql.gz
squirro_mysql_emailsender_2022_06_24_16_09_13.sql.gz
...

Individual Files, all Squirro databases

Same as above, avoiding all system databases:

TIMESTAMP=$(date +"%Y_%m_%d_%H_%M_%S")
DATABASES=$(echo 'SHOW DATABASES WHERE `Database` NOT IN ("information_schema", "mysql", "performance_schema", "sys")' | mysql -N)

for DATABASE in $DATABASES
do
  echo "Backup ${DATABASE}"
  mysqldump --single-transaction --databases "${DATABASE}" | gzip > "squirro_mysql_${DATABASE}_${TIMESTAMP}.sql.gz"
done

Cleaning up old files

If you don't purge old files, the disk will eventually fill up. You can use this command to keep, for example, only the backups of the last 7 days:

find . -type f -mtime +7 -iname squirro_mysql\*.sql.gz -delete
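Backup and cleanup are usually run unattended via cron. A sketch of crontab entries; the script path and backup directory are assumptions, adjust them to your setup:

```
# m h dom mon dow  command
# Nightly backup at 02:30 (wrap the mysqldump commands above in this script):
30 2 * * * /usr/local/bin/squirro_mysql_backup.sh
# Purge backups older than 7 days at 03:30:
30 3 * * * find /var/backups/mysql -type f -mtime +7 -iname 'squirro_mysql*.sql.gz' -delete
```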

Restore

Before a restore can happen, all corresponding Squirro services must be stopped, as otherwise table locks will prevent the process from succeeding.

The best course of action is to use:

squirro_stop

Restoring any of the above examples works like so:

gunzip -c <file> | mysql

Note that the restore will first drop and re-create the existing tables before loading the data.
Hence it is advisable to take another backup before performing a restore.
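If only a single database needs to be recovered from an all-databases dump, the mysql client's --one-database option can restrict the replay to that database. A sketch (the wrapper function is our own; the file name is an example from the listing above):

```shell
# Sketch: recover a single database from an all-databases dump.
restore_one() {
  local dumpfile="$1" db="$2"
  # --one-database makes mysql ignore statements for all other databases
  gunzip -c "$dumpfile" | mysql --one-database "$db"
}

# Usage:
#   restore_one squirro_mysql_2022_06_24_16_09_13.sql.gz topic
```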

Storing the Backups safely

A backup is no good if it is lost together with the server or disk it resides on.
Hence we highly recommend moving the files off your Squirro server.

In cloud environments, moving the files to S3 or S3-equivalent storage with versioning enabled gives you a lot of additional safety (e.g. using the aws s3 mv command).
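A sketch of such an upload; the bucket name and key prefix are placeholders, use your own bucket with versioning enabled:

```shell
# Sketch: move one backup file to S3 and delete the local copy.
# "my-backup-bucket" is a placeholder for your own bucket.
upload_backup() {
  local file="$1"
  aws s3 mv "$file" "s3://my-backup-bucket/mysql-backups/$file"
}

# Usage (after creating a backup as shown earlier):
#   upload_backup "squirro_mysql_2022_06_24_16_09_13.sql.gz"
```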

We also recommend not leaving the files sitting on the local disk, as the ability to inspect them could give a bad actor a lot of information.
