Hi DO Community!
I’m currently running an auction script on one of my droplets, and the weekly backups that DO provides just aren’t frequent enough for bidding data; some auctions open and close within a few days. What’s the best way to back up the auction data in my database?
I’m running MySQL on a LAMP/Ubuntu stack.
I see the new article on Bacula on the community homepage. Is it a reliable application, and is it hard to set up? Is that my best option?
Thanks!
I put my script on GitHub, if you are interested…
I have a script that you can run as a cron job that will dump your databases and also back up your website files. The script will back up to wherever you want, but lately I have been backing up to Amazon S3 buckets. If your droplet ever crashes, it is nice to have complete, recent backups of your databases and website files in a separate location.
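The tutorial will have the full details, but a minimal sketch of the idea looks something like this. The bucket name, paths, and the credentials file are placeholders, and it assumes the AWS CLI is installed and configured with `aws configure`:

```
#!/bin/bash
# Minimal backup sketch: dump MySQL, archive the web root, push both to S3.
# /root/.backup.cnf, /var/www/html, and the bucket name are placeholders.
set -euo pipefail

TIMESTAMP=$(date +%Y-%m-%d_%H%M)
BACKUP_DIR=/tmp/backups
S3_BUCKET=s3://my-droplet-backups   # replace with your own bucket

mkdir -p "$BACKUP_DIR"

# Dump all databases with the dedicated backup user; credentials live in a
# --defaults-extra-file so the password never appears on the command line.
mysqldump --defaults-extra-file=/root/.backup.cnf --all-databases --single-transaction \
  | gzip > "$BACKUP_DIR/db_$TIMESTAMP.sql.gz"

# Archive the website files.
tar -czf "$BACKUP_DIR/www_$TIMESTAMP.tar.gz" /var/www/html

# Copy both archives to S3, then clean up the local copies.
aws s3 cp "$BACKUP_DIR/db_$TIMESTAMP.sql.gz"  "$S3_BUCKET/"
aws s3 cp "$BACKUP_DIR/www_$TIMESTAMP.tar.gz" "$S3_BUCKET/"
rm -f "$BACKUP_DIR/db_$TIMESTAMP.sql.gz" "$BACKUP_DIR/www_$TIMESTAMP.tar.gz"
```

A crontab entry like `0 */6 * * * /root/backup.sh` would then run it every six hours, which should be frequent enough for auctions that close within a few days.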
I will work on the tutorial today.
If you want to do prep, you can:
- create an AWS account: https://aws.amazon.com/
- create a database user that will only be used for backups. You can give that user the following global privileges: SELECT, FILE, SHOW VIEW, SUPER, PROCESS, RELOAD, SHOW DATABASES, LOCK TABLES, REPLICATION CLIENT (a quick sketch of that step is below)
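If you want to get a head start on that second step, creating the backup user from the shell could look something like this. `backupuser` and the password are just placeholders; run it as a MySQL admin user:

```
# Create a dedicated backup user with the global privileges listed above.
# 'backupuser' and the password are placeholders, use your own values.
mysql -u root -p <<'SQL'
CREATE USER 'backupuser'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
GRANT SELECT, FILE, SHOW VIEW, SUPER, PROCESS, RELOAD, SHOW DATABASES,
      LOCK TABLES, REPLICATION CLIENT ON *.* TO 'backupuser'@'localhost';
FLUSH PRIVILEGES;
SQL
```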
Right now, no. I’m just downloading a db backup manually via phpMyAdmin. I would like to set up auto-backups to a remote location. I was thinking of creating a droplet just to store the backups.
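I haven’t built anything yet, but roughly what I had in mind was a cron job that dumps the database and pushes it to that backup droplet over SSH, something like this (the host alias, credentials file, database name, and paths are just placeholders):

```
# Crontab entry (rough idea only): every night at 02:00, dump the auction DB,
# compress it, and stream it to a separate backup droplet over SSH.
# 'backup-droplet' would be an SSH alias with key-based auth already set up.
0 2 * * * mysqldump --defaults-extra-file=/root/.backup.cnf auctions | gzip | ssh backup-droplet "cat > /srv/backups/auctions_$(date +\%F).sql.gz"
```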
Do you have a location you are backing up to?
I ask because I am going to write a tutorial, and I am trying to get an idea of what people are doing.