By: Spythe

Backup droplet content to external location (e.g. NAS)

June 29, 2017 391 views
Backups

Hey guys,

I was hoping someone knew a way to make a backup of the complete contents of a droplet to an external location, such as Dropbox or a personal NAS. Whenever I destroy a droplet, I'm always afraid of taking down valuable data with it. I have backups and snapshots, but you never know... humans are stupid and make mistakes.

Therefore, I would like a way to have at least weekly backups to, in my case, my Synology NAS. Does anyone have an idea of how to accomplish this?

For the record, I'll always be using Ubuntu snapshots. I was thinking of setting up a simple script that runs as a cron job and rsyncs all the data, but knowing the names of those tools is about all I know so far.

2 Answers

@Spythe

The solution really depends on the size of the data set -- that's the first thing I generally look at.

If the data set is small, simply generating an archive of the data may prove to be the easiest solution. If the data set is large, rsync may be a better option; that way you're only syncing changes instead of repeatedly backing up data that hasn't changed.

You could use rsync if you went with the archive option as well, of course :).
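For example (just a sketch -- the paths, hostname, and key location below are placeholders, not anything specific to your setup), archiving a small data set and pushing the archive over to the NAS might look like:

tar -czf /tmp/droplet-backup-$(date +%F).tar.gz /droplet/path/to/files
rsync -avh -e "ssh -i /path/to/synology/privatekey" /tmp/droplet-backup-*.tar.gz user@synology:/path/to/backups/

The first command creates a dated, compressed archive; the second pushes it over SSH, so only the archive itself travels across the wire.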

I'm not too familiar with Synology or how it works, though if you want to set up a cronjob to rsync your files offsite, you'll need a way to connect to the NAS. Generally, that's going to be either a static IP or some sort of dynamic DNS solution (such as https://no-ip.com) on your end.
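As a quick sanity check (assuming you've already sorted out port forwarding / dynamic DNS on your end -- the hostname below is a placeholder), you can confirm the Droplet can actually reach the NAS over SSH before worrying about the backup itself:

ssh user@your-nas-hostname
# or, if your NAS listens on a non-standard SSH port:
ssh -p 2222 user@your-nas-hostname

If that logs you in (or at least prompts for a password), the rsync side will be straightforward.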

If you've got something like that setup, a basic guide to rsync can be found here:

https://www.tecmint.com/sync-new-changed-modified-files-rsync-linux/

You can set up a cronjob to run the same commands at a time/frequency of your choosing :-).

On Ubuntu, you can set up a cronjob using crontab -e from the CLI. From there, near the bottom of the file is where you'll add the command and set the frequency at which the job runs.
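For instance (nothing here is specific to your setup, it's just the standard workflow):

crontab -e    # opens your user's crontab in an editor
# ...add the job line at the bottom, save, and exit...
crontab -l    # prints the installed crontab so you can confirm the job was saved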

If you're not familiar with setting up a cronjob, I'd use https://crontab.guru/

..

So, for example, our cronjob might look something like this (on the Droplet):

0 2 * * 1 rsync -avzhe "ssh -i /path/to/synology/privatekey" /droplet/path/to/files user@synology:/source/for/backups >> /var/log/synology.log

What the above should do is:

1). Run the cronjob every Monday @ 2:00 AM (the 0 2 * * 1 fields are broken down below).

2). Recursively sync files from /droplet/path/to/files to /source/for/backups.

3). Store the verbose output of the command to /var/log/synology.log (not required).
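For reference, here's how the 0 2 * * 1 schedule breaks down (these are just the standard cron fields, nothing custom):

0 2 * * 1
| | | | +---- day of week (1 = Monday)
| | | +------ month (any)
| | +-------- day of month (any)
| +---------- hour (2 AM)
+------------ minute (0)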

If, when the cronjob runs again, no files have been changed, nothing will be synced. There's no need to re-sync files that haven't changed since the last backup :-).
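One small tip before you automate it (optional, just to catch mistakes early): run the same rsync by hand once with -n / --dry-run added, so it only reports what it would transfer without actually copying anything:

rsync -avzhn -e "ssh -i /path/to/synology/privatekey" /droplet/path/to/files user@synology:/source/for/backups

Once the output looks right, drop the -n and put the real command in cron.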

..

In the command you'll see ssh -i /path/to/synology/privatekey

That would be the private key used to log in to the Synology NAS -- like any setup that syncs across servers, you need to be able to log in, and you can't log in without a password or an SSH key. To avoid storing passwords in plain text, we'll use an SSH key to allow rsync to push the files to the NAS: the private key lives on the Droplet, and its matching public key goes into the authorized_keys file on the NAS.
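If you don't have a key pair for this yet, a minimal sketch of that part (run on the Droplet; the path and user are the same placeholders as in the example above) would be:

# generate a key pair with no passphrase so cron can use it unattended
ssh-keygen -t ed25519 -N "" -f /path/to/synology/privatekey
# copy the public half to the NAS so the Droplet can log in with the key
ssh-copy-id -i /path/to/synology/privatekey.pub user@synology

If ssh-copy-id isn't available (or password login to the NAS is disabled), you can paste the contents of the .pub file into ~/.ssh/authorized_keys on the NAS by hand instead.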

Something to keep in mind, though -- we're storing a private key on your public server, so if the Droplet is ever compromised, the attacker could technically sync a ton of junk to your NAS or delete your backups. So you need to take precautions here. The same key that allows rsync to get in would allow anyone else to do the same, if they were to gain root access to the Droplet.

So with that in mind, make sure your Droplet is secure so that's not an issue you're faced with :-).
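One optional way to limit the blast radius (this assumes the NAS runs a standard OpenSSH server that honors authorized_keys options -- I can't vouch for Synology specifically) is to restrict what that key is allowed to do in ~/.ssh/authorized_keys on the NAS, for example pinning it to the Droplet's IP and disabling interactive features:

from="203.0.113.10",no-pty,no-agent-forwarding,no-port-forwarding ssh-ed25519 AAAA...your-public-key... backup-key

That way, even if the key leaks, it only works from your Droplet's address and can't be used to open a proper interactive terminal or set up tunnels.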

Use a data recovery tool such as Disk Drill or Recuva to recover lost files.
