Looking for a backup strategy for a write-intensive MySQL database

June 13, 2017
MySQL Backups Ubuntu

Right now the setup consists of a $10 droplet with an attached volume, with the MySQL datadir on the volume (following this guide). The reason for this is to be able to scale storage and CPU/RAM separately with little downtime (using the droplet's flexible resizing). All tables use InnoDB.

That said, I'm looking for how to back this up properly. I have set up mysqldump and automysqlbackup (which uses mysqldump, if I remember correctly) before, but as far as I recall both of them lock the database while the backup runs, which isn't acceptable for a write-intensive scenario (a monitoring-type workload).

After some googling I found Percona's hot backup tool, but I didn't really understand how to automate it (I'm looking to save backups to S3). I also wasn't sure whether it's what I'm looking for.

Googling "write-intensive mysql backups" unfortunately doesn't really turn up anything. I'm looking for any ideas on how to do this so I can rely on it if/when something happens.

  • Are your tables MyISAM or InnoDB?

    If they're InnoDB, can you use the --single-transaction or --skip-lock-tables flag in mysqldump to prevent the tables from being locked?

  • @DigitalCarrot tables are InnoDB. I didn't know about the skip-lock-tables option; I will definitely check it out. Thanks!
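For InnoDB tables, the flag mentioned above can be used along these lines (a sketch only — the database name, backup user, and output path below are placeholders):

```shell
# Sketch, assuming InnoDB tables and a reachable MySQL server; the
# database name, user, and backup path are placeholders.
dump_without_locks() {
  local db="$1" outdir="${2:-/mnt/volume/backups}"
  # --single-transaction takes a consistent InnoDB snapshot without
  # blocking writes; --quick streams rows instead of buffering them.
  mysqldump --single-transaction --quick --user=backup "$db" \
    | gzip > "$outdir/$db-$(date +%F).sql.gz"
}

# Usage (once a MySQL server is reachable):
# dump_without_locks mydb
```

Note that --single-transaction only gives a consistent, lock-free dump for transactional engines like InnoDB; it does not help with MyISAM tables.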

1 Answer

Hi @xymanek

Just so everyone else might learn a little - here's how to set up a hot backup:
In case you're using MariaDB 10.1+: https://mariadb.com/kb/en/mariadb-backup/

So you're looking for an S3 backup tool? There are several tools out there, but have a look at these:

You would then create a small script that runs the backup and then the S3 upload.
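Such a script could look roughly like this (a sketch, assuming Percona XtraBackup and the AWS CLI are installed and configured; the bucket name and paths are placeholders):

```shell
#!/usr/bin/env bash
# Sketch only: assumes Percona XtraBackup and the AWS CLI are installed
# and configured. The bucket name and paths are placeholders.
set -euo pipefail

backup_to_s3() {
  local target="/mnt/volume/backups/$(date +%F-%H%M)"

  # Take a hot (non-blocking) physical backup of the InnoDB datadir.
  xtrabackup --backup --target-dir="$target"

  # Apply the redo log so the copy is consistent and restorable.
  xtrabackup --prepare --target-dir="$target"

  # Pack it up and ship it to S3.
  tar czf "$target.tar.gz" -C "$target" .
  aws s3 cp "$target.tar.gz" \
    "s3://my-db-backups/$(basename "$target").tar.gz"
}

# Schedule via cron, e.g. nightly at 03:00:
# 0 3 * * * /usr/local/bin/mysql-backup-to-s3.sh
```

The prepare step matters: a raw XtraBackup copy is not restorable until the redo log has been applied.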

Out of interest, when you say write-intensive:
How many queries per second? How many rows added per day? How much storage are you currently using?

This tutorial will show you how to perform a full hot backup of your MySQL or MariaDB databases using Percona XtraBackup on Ubuntu 14.04. The resulting database backup can then be copied to a remote server with a traditional file backup system, like Bacula. The process of restoring the database from a backup is also covered.
  • Thanks for the reply! We are still in the development and testing stage, so I don't really know what values we will see in production, but based on what we've seen in testing so far I can predict the following:

    • 5 - 10 inserts/second with some updates flying around
    • About 50-100 MiB added every 5 days

    Of course these values will grow over time, but for starters this should be more or less accurate. Also, when I said "write-intensive" I was thinking more about the fact that it's not acceptable to stop writes for a minute or two (which was OK for backups of blogs using mysqldump).

    About Percona hot backup: I read that guide, but since taking a backup involves multiple steps, I'm a bit confused about how to script it. Also, that guide ends with "After that, you will want to look into ... creating incremental backups (with XtraBackup)". Should I look into those, or is taking full backups better in my case?

    Also, since @DigitalCarrot mentioned that mysqldump can be used without locking tables, would you recommend Percona hot backup or mysqldump without table locks?

    • @xymanek

      I prefer mysqldump, since I get the data in a readable format - but it would be slow to restore.
      But I wouldn't say one solution excludes the other - I would probably run both.
      You can never have enough backup solutions.
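On the incremental-backup question above: an XtraBackup incremental cycle looks roughly like this (a sketch following the XtraBackup documentation; all paths are placeholders). Incrementals copy only the pages changed since the previous backup, so they stay small between fulls.

```shell
# Sketch of a weekly-full / daily-incremental XtraBackup cycle;
# all paths are placeholders.
BASE=/mnt/volume/backups

take_incremental_cycle() {
  # Day 1: full backup that the later incrementals build on.
  xtrabackup --backup --target-dir="$BASE/full"
  # Following days: base each incremental on the previous backup.
  xtrabackup --backup --target-dir="$BASE/inc1" --incremental-basedir="$BASE/full"
  xtrabackup --backup --target-dir="$BASE/inc2" --incremental-basedir="$BASE/inc1"
}

restore_cycle() {
  # Prepare the full with --apply-log-only, then replay each
  # incremental in order; the final prepare drops the flag.
  xtrabackup --prepare --apply-log-only --target-dir="$BASE/full"
  xtrabackup --prepare --apply-log-only --target-dir="$BASE/full" --incremental-dir="$BASE/inc1"
  xtrabackup --prepare --target-dir="$BASE/full" --incremental-dir="$BASE/inc2"
}
```

At 50-100 MiB of growth per five days, plain full backups are probably simplest to start with; incrementals become worthwhile once the full backup itself gets large or slow to ship.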
