Server backup solutions

This thread is kind of an extension to “What Cloud Storage Provider Do You Use?”.

This is about backups as opposed to just storage.

What’s the general consensus for off-site server backups?

I have a VPS on which I manage all of my clients’ websites. I’m responsible for backups etc., which I do nightly. Currently I’m using Backblaze, but I’m finding it too unreliable. I back up each site individually, but some sites take up >5GB of compressed disk space, and I find Backblaze starts to become unreliable at that point, with backups failing often.

So, just wondering. What do others do for backups?

PS - I’m not interested in integrated WHM/cPanel solutions such as R1Soft (yuk!) or JetBackup.


I generally use SSH and WP-CLI (and other Linux/Unix command-line tools) for this.

The exact process depends on the site and how it is configured (for example, if the site’s code is stored in git, then you really only need to back up the database and the uploads directory apart from that).

Here is one way to do it though:

```shell
# Change to the site's directory
cd ~/public_html # or wherever the site lives
# Archive and compress the site files
# Note you should not make your backups publicly accessible!
tar czvf ~/backups/files-20200227.tar.gz .
# Archive the site database
wp db export - | gzip -c > ~/backups/db-20200227.sql.gz
```

You can then copy the files over to another server using scp.

Or, you can make the entire process run on another server (using rsync for files, and ssh wp db export ... for the database dump), and then add incremental backups on top of that.
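
For anyone wanting a concrete starting point, here is a rough sketch of that pull-based approach as a shell function. The hostname, paths, and function name are placeholders, not a finished script:

```shell
# Pull-based nightly backup, run from the backup server (e.g. via cron).
# All names below are placeholders - adjust for your own setup.
backup_site() {
    site_host="$1"                 # e.g. user@example.com
    site_dir="$2"                  # e.g. public_html
    dest="$HOME/backups/$site_host"
    stamp=$(date +%Y%m%d)

    mkdir -p "$dest"

    # Incremental file sync: only changed files cross the wire.
    rsync -az --delete "$site_host:$site_dir/" "$dest/files/"

    # Dump the database remotely with WP-CLI, compressing before transfer.
    ssh "$site_host" "cd $site_dir && wp db export - | gzip -c" \
        > "$dest/db-$stamp.sql.gz"
}

# Usage: backup_site user@example.com public_html
```

From there you can layer incremental snapshots on top of the `files/` mirror.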

Setting up and maintaining such a backup system using the free, open-source Linux/Unix tools is one of the services I provide for my clients. The “right” solution usually looks a bit different depending on their needs and their server setup.


I currently use rclone together with Microsoft OneDrive; it’s like rsync for cloud storage. I mount OneDrive as a file system using rclone, so I can drop files in and out easily.
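
For reference, a minimal sketch of how that can look, assuming a remote named `onedrive:` has already been set up with `rclone config` (the remote and directory names here are placeholders):

```shell
# Assumes `rclone config` has already created a remote called "onedrive:".
push_backups() {
    # Mirror the local backup directory to OneDrive; like rsync,
    # only changed files are transferred.
    rclone sync "$HOME/backups" onedrive:server-backups
}

# Or mount the remote as an ordinary directory and drop files in and out:
# rclone mount onedrive: "$HOME/onedrive" --daemon
```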


Storing backups offsite is a must.

I’ve been using Updraft + remote storage for daily backups.


I use shared hosting, so my options are more limited. I have a PHP script in every account that runs nightly and sends a full cPanel backup to a 500GB storage VPS I use somewhere in Eastern Europe.

I keep at least 2 months’ worth of daily backups on there, and on the 1st of every month I download that day’s backup and upload it to Backblaze for long-term storage.

My hosting provider also offers hourly Acronis backups.


What do they have on there that is >5GB compressed? I was occasionally getting backup failures, which was a pain, as the failed runs left the backup file behind on the server. So I have another script that runs as a cleanup to delete any leftover backup files.


I always look for SSH access, even with shared hosting. As long as you have that (and some command-line knowledge) then you can set up any desired workflow using standard tools.


I actually have a number of backups on the go. I wrote a bash script using rsync and ssh to download changes from each account (40+ of them) to a server I have in my office. I also back up that server to a site elsewhere. But I also use Backblaze as my main backup.
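
A sketch of what such an rsync-over-ssh loop can look like (the server address and account list file are invented placeholders):

```shell
# Pull changed files for every hosting account down to a local server.
# Server address and account list file are placeholders.
pull_all_accounts() {
    server="backup@vps.example.com"
    while read -r account; do
        mkdir -p "$HOME/mirror/$account"
        # -a preserves permissions/timestamps, -z compresses in transit,
        # --delete keeps the mirror an exact copy of the remote.
        rsync -az --delete \
            "$server:/home/$account/" "$HOME/mirror/$account/"
    done < "$HOME/accounts.txt"    # one account name per line
}
```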

As for why >5GB compressed: some sites are massively image-heavy (e.g. photographers), but I also manage email for clients, and some have over 20GB of mail, which also gets backed up. And I back up server config files too.

But it’s this problem I have with Backblaze when files get to around 5GB+, and as this is my main backup, I was hoping someone would have a magic solution. Part of the problem is that the Backblaze client is not multi-threaded, so that isn’t going to help.

We also had problems with Backblaze, where our cPanel backups were failing for no reason. We have used AWS for our cPanel backups (over 105 GB per day) without issue, and for all our UpdraftPlus backups as well. So we’re probably sending 150 GB per day to AWS and rarely have an issue.

I find Backblaze easier to use, but AWS is extremely reliable. So my vote is AWS.



Yes, I was thinking it must be either photographers or mail. :grin:

I have been making a concerted effort over the years to get my clients to use a separate email solution (Fastmail or G Suite, usually). I still have a few sticklers, but they can only use POP or short-term IMAP. And I certainly wouldn’t offer to back up their mail for them.


That’s a good point. I do have SSH access but had forgotten about that option. Might be worth investigating.


What exactly are you using to create backups that are uploaded to Backblaze?

We have multiple servers and multiple websites, some several GB in size, and rarely have any issues uploading to Backblaze.

HashBackup is a reliable tool that integrates with many storage providers. We use custom bash scripts with the B2 CLI to upload backups.
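
For anyone curious, a hedged sketch of what a B2 CLI upload step can look like. The bucket name and paths are placeholders, and `authorize-account` needs your own credentials:

```shell
# Upload one nightly archive with the B2 command-line tool.
# Bucket name and paths are placeholders.
upload_to_b2() {
    archive="$1"    # e.g. ~/backups/files-20200227.tar.gz
    # One-time (per session) authorization with your own credentials:
    # b2 authorize-account <applicationKeyId> <applicationKey>
    b2 upload-file my-backup-bucket "$archive" "$(basename "$archive")"
}

# Usage: upload_to_b2 ~/backups/files-20200227.tar.gz
```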


I don’t think it’s that uncommon, tbh. I don’t think WHM and Backblaze work well with each other. Other people have reported similar problems (not solved).

I’ve used the WHM backup and also custom bash scripts with the B2 Python script. I have more success with the B2 script but still get failures on files over 5GB. As mentioned above, other people have had similar problems.

Thanks for mentioning Hashbackup. I’ll take a look at that.

The photographers I don’t mind. But email is a real pain. No one ever deletes anything. Ever. I have been thinking about getting people to use separate email systems but, with all due respect to (some of) my clients, they wouldn’t know where to begin. And I’d still end up managing it for them.

This would have the benefit of not having to back up years of junk, but the very thought of going through the process of moving people to new email systems is just too much to cope with. :grimacing:


Yes, it can be hard work to convince them and get them moved. But it’s worth it!


I have an in-house backup server.
Remote servers have cron jobs to do things like database dumps.
Then the in-house server fetches the remote content using rsync over ssh and keeps rotated copies where needed.
Then the backup server does a weekly backup onto 2 external drives (which are rotated).
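
One common way to keep those rotated copies cheaply is rsync’s `--link-dest`: unchanged files are hard-linked against the previous snapshot, so each extra day only costs the changed bytes. A sketch, with placeholder paths:

```shell
# Create a dated snapshot, hard-linking unchanged files against the
# most recent previous snapshot. Paths are placeholders.
rotate_snapshot() {
    src="$1"         # content already fetched via rsync over ssh
    snapdir="$2"     # directory holding dated snapshots
    today=$(date +%Y%m%d)
    last=$(ls -1 "$snapdir" 2>/dev/null | tail -n 1)

    mkdir -p "$snapdir"
    if [ -n "$last" ]; then
        # Unchanged files become hard links to yesterday's copy.
        rsync -a --link-dest="$snapdir/$last" "$src/" "$snapdir/$today/"
    else
        # First run: plain full copy.
        rsync -a "$src/" "$snapdir/$today/"
    fi
}

# Usage: rotate_snapshot "$HOME/mirror/site" "$HOME/snapshots/site"
```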


My backup method depends on the available options. Normally I use SSH + Midnight Commander (mc), plus a secondary server for storing the backup data. If available, I use the mysql client for DB dumps, and if not, I use Adminer instead.

Usually I connect to the secondary server and use mc to open up a Shell / FISH / (S)FTP connection to the primary server, where the WP install resides. Then, in the other panel in mc, I navigate to the directory where I want to store the backup files (or create one if it doesn’t exist yet), and simply copy everything over. Depending on the size, I may background the copying task.

Depending on the circumstances, e.g. when I’m not able to store the data of the WP install in an archive, I might run a direct download of the files using mc instead. Once that has gone through, I compress the file backup using tar + gzip, so it ends up as the same kind of bundle one would normally want in the first place.

Last, but not least, I copy the backup bundle to my local system, preferably using SSH / FISH (because it’s much faster than SFTP). That way there are automatically two backups of the system available: if my own system crashes, the server still has a copy, and vice versa.

It’s a technique I also use for installing new WP + CP sites (and just about any other freely available CMS), because server-to-server connections can use the full available bandwidth (e.g. 90 - 100 Mbit), while here at home it may be just some measly 12 Mbit :slight_smile:

… indeed, this method stems from my days of meagre DSL speed, i.e. just a small DSL 2000 line, thanks to the total disinterest of Germany’s main (monopoly) telecommunications provider, Deutsche Telekom, in installing decent data connections in the countryside. In big cities one may have nice upstreams, cable connections and even real fibre uplinks, but when you live a bit further out, you quickly run into issues like this.

cu, w0lf.


Thanks everyone. Plenty of food for thought. It also seems that most people use home grown solutions of some sort. My own backup solutions seem very similar.

But as for the problems with Backblaze, it seems I’m going to have to get each individual backup below 5GB by hook or by crook. A couple of thoughts spring to mind here: 1) I could back up only email that is less than 2 years old, and 2) I could avoid backing up image thumbnails, as these can easily be regenerated.
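
On the thumbnails point: tar’s `--exclude` can skip the regeneratable WordPress sizes (files named like `image-300x200.jpg`). A self-contained sketch; the `-WIDTHxHEIGHT` pattern is an assumption about the thumbnail naming and may need tuning per site:

```shell
# Demo directory standing in for a real uploads folder:
site_dir=$(mktemp -d)
touch "$site_dir/photo.jpg" "$site_dir/photo-150x150.jpg"

# Archive everything except thumbnail-style names.
# The "-WIDTHxHEIGHT" pattern is a guess at WP's naming convention.
archive=$(mktemp -u).tar.gz
tar czf "$archive" -C "$site_dir" --exclude='*-[0-9]*x[0-9]*.jpg' .

tar tzf "$archive"   # lists ./photo.jpg, but not the 150x150 thumbnail
```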

So thanks again y’all.

Have you tried contacting Backblaze support? Or posting an issue on GitHub for the B2 CLI? They’ve helped me before on GitHub; pretty responsive.

HashBackup, which I mentioned before, can do incremental backups: “HashBackup is designed for ‘incremental forever’ backups to minimize backup time, transmission costs, and storage costs, while providing traditional backup features such as multiple retention periods and fast restore times. Unlike traditional incremental backups where a full backup followed by many incrementals is restored, HashBackup uses a block-level incremental strategy that can restore any version directly and is designed to efficiently handle backups with thousands of incremental versions.”

So that can be a good way to keep files under 5GB.
