I run a few cloud servers at Linode and Digital Ocean which host stuff for clients. Although I have automated backups in place for my clients’ data, I always worry about some catastrophic situation where the entire server goes down and, God forbid, needs to be rebuilt from scratch.
While places like Digital Ocean advertise that you can spin up a new Linux server in a matter of seconds — and no doubt you can — what they don’t tell you is how long it takes to configure that server for real-world use. Depending on how many packages, libraries, and other software you need to install, and all the configuration that goes along with it, you could be looking at 2, 3, or even several hours to rebuild a server. For many of my clients that kind of downtime would be absolutely detrimental (and not too good for my business, either).
So, I’ve been looking for a way to make a complete server backup image. Ideally I’d like to be able to spin up a new Linux box using my most recent image, so I can skip the hours of configuration work and be up and running again much faster.
Why not use rsync, rdiff, etc.?
Yes, I could back up most stuff that way, but there are certain files rsync can’t access while they’re in use, so for making a complete server image it’s no good to me.
Why not just pay for your host’s backup solution?
Maybe it’s just me, but I’ve had bad luck with host-provided backups. Linode, for example, offers a backup service for $5/month. I paid for this on one critical server for more than a year before I got a message from them telling me my backups were failing… turns out they don’t support certain types of filesystems, and since my data was on an unsupported filesystem, the backups had actually been useless from day 1. Don’t know why it took them a year to notice or notify me, and they wouldn’t even credit me for the months of worthless “backups” I paid for. (This is one of several reasons I’ve been moving away from Linode, but that’s another story)
Digital Ocean (which has been way better than Linode, in my experience) has a feature where you can take snapshots of servers (or “Droplets”, as they call them) and save images. This works great, except you have to shut down the server in order to make the snapshot. I usually make snapshots whenever I set up a new server, but once it goes live with clients I don’t have the luxury of shutting it down again. Configurations change over time, so what I really need is a way to “hot copy” the server as it’s running.
Hot Copy to the rescue
After looking through lots of different backup options, I found (surprisingly) only one that did what I wanted: R1Soft Linux Hot Copy.
Installation was a nightmare. To get the software you have to register on their website, which I did. That gave me a link to download a zip archive of the software, and a message saying I’d receive an email with the instructions necessary to install it. Well, I never received any email. (And yes, I’m absolutely certain I typed my email address correctly — I double checked)
The documentation on their website is pretty skimpy, and the only other help I could find was a blog post from 2010. Armed with only these two sources of help, I set out to install Hot Copy 5.12.1 (the version I was provided when I registered) on a Debian 8 64-bit server. I’ll spare you all the steps and agony I went through because… it just didn’t work. Long story short, I’m pretty sure that one of the two .deb packages you have to install (r1soft-setup) is corrupted at the source, and was never going to work.
I found R1Soft’s repo site (http://repo.r1soft.com/release/) where they keep previous releases, so I started going backward through the versions until I found one that did work. That journey took me all the way back to version 4.2.1… but at the moment I don’t care, because this version DID install and works beautifully.
How to set up R1Soft Hot Copy
This is just a quick-and-dirty rundown of the steps that worked for me on Debian 8 64-bit… adjust as needed, your mileage may vary, don’t sue me, and all that…
There are two packages you need: r1soft-hotcopy and r1soft-setup. Once I got the right ones from the repo site, these are the commands I used to install (assuming running as root or sudo):
dpkg -i r1soft-hotcopy-amd64-4.2.1.deb
dpkg -i r1soft-setup-amd64-4.2.1.deb
hcp-setup --get-module
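If you want to double-check that both packages actually landed before going further, a plain Debian package query will list them (this is just standard dpkg usage, nothing R1Soft-specific):

dpkg -l | grep r1soft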
You’ll want to create a mount point for the hcp virtual device, like this:
mkdir /mnt/hotcopy
(change ‘hotcopy’ to whatever you want to call it)
Now you can start Hot Copy:
hcp -m /mnt/hotcopy /dev/vda1
(change ‘/dev/vda1’ to whatever source device you want to image… on Digital Ocean it’s /dev/vda1. You can check with df -h)
If all worked correctly, you should now have a virtual device /dev/hcp1 mounted on /mnt/hotcopy (or whatever you called it), which is your complete snapshot.
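To sanity-check that the snapshot is really there, you can look for the hcp device in the mount table and poke around the mounted copy; these are just ordinary commands, nothing Hot Copy-specific:

mount | grep hcp
ls /mnt/hotcopy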
Now what do I do with this snapshot?
You can do lots of things. For example, you could use rsync to copy the snapshot to another device you have mounted, doing something like this:
rsync -avzh /mnt/hotcopy/ /mnt/someotherdevice
(Always be careful with rsync NOT to get your source and destination locations reversed, or you will lose data!)
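If you’re nervous about getting the direction wrong, rsync’s dry-run flag (-n) will show what would be transferred without actually writing anything:

rsync -avzhn /mnt/hotcopy/ /mnt/someotherdevice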
However, in my case I wanted to sync the snapshot to a remote drive (my home computer) mounted via sshfs. Rsync can’t seem to chown anything on the sshfs filesystem, and losing all file ownership information in the backup doesn’t sound like a good plan for being able to quickly restore things later. So instead I just threw everything into a tarball:
tar cvpzf /mnt/sshfs-remote-drive/backup.tar.gz /mnt/hotcopy/.
…which worked perfectly! So now I have a complete backup of my entire server, with all the permissions, symlinks, and other goodness intact.
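As for restoring later, here’s a rough sketch of how it might go (the /restore staging path and /target/ destination are just placeholders). Because the tarball was made from /mnt/hotcopy/. and tar strips the leading slash, the paths inside it come out prefixed with mnt/hotcopy/, so the safest route is to extract into a staging directory and then sync into place:

mkdir -p /restore
tar xvpzf /mnt/sshfs-remote-drive/backup.tar.gz -C /restore
rsync -aH /restore/mnt/hotcopy/ /target/

(the p flag on tar keeps permissions, and rsync -aH preserves ownership, symlinks, and hard links when run as root)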
Closing thoughts
R1Soft’s Hot Copy is free and works amazingly well at what I wanted to do… once I got it running. I wish R1Soft would invest some time into their documentation and, especially, fix their deb packages for the latest version. But considering how cool it is to hot copy a live server without shutting it down, I’m surprised there apparently aren’t any competing products out there.
By the way, here’s a shameless plug: if you’re looking to set up a cloud-based server/VPS, I highly recommend Digital Ocean. I’ve been using them for a lot of stuff and they’ve been completely awesome. And if you use my referral link: https://m.do.co/c/4e7ed58e4679, they’ll give you a $10 credit to get started. (I would recommend them anyway even if I didn’t have a referral link, but what the hell…)
Thanks for taking the time to write about this. I myself have encountered useful little bits of software on the net that have little to no documentation available, or run into errors that turn up little to nothing on Google… Many times I’ve planned to write tutorials or document what I’ve done, but more often than not it doesn’t get done.
I’m trying to find a way to back up my production server (it hosts one website) to a dedicated backup server, likely just /var/www/ and the MariaDB/SQL databases.
At the beginning of this article you mentioned you currently use automated backups. Can you elaborate on this?
I use VMware Converter for similar outcomes. It’s free, but can be a bit slow and sometimes fails at 98% for reasons I haven’t worked out.
It will do Win/Linux and local/network while live.
Haha, your words are funny.
I have encountered the same situation with Linode too.
Thanks for this useful article!
Great! You have a backup… So how do you restore it? Can you please explain how you use it to restore your server?