My backup approach depends on the available options. Normally I use SSH + Midnight Commander (mc), plus a secondary server for storing the backup data. If available, I use the mysql client for DB dumps; if not, I use Adminer instead.
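When the mysql client tools are available on the primary server, the dump itself is a one-liner. A sketch, assuming a hypothetical database name and user (take the real values from wp-config.php):

```shell
# Hypothetical DB user/name -- substitute the values from wp-config.php.
# --single-transaction keeps InnoDB tables readable while the dump runs;
# piping through gzip compresses the dump on the fly.
mysqldump --single-transaction -u wp_user -p wordpress \
  | gzip > wordpress-$(date +%F).sql.gz
```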
Usually I connect to the secondary server and use mc to open a Shell / FISH / (S)FTP connection to the primary server, where the WP install resides. Then, in the other mc panel, I navigate to the directory where I want to store the backup files (or create one if it doesn’t exist yet), and simply copy everything over. Depending on the size, I may background the copying task.
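mc’s panel copy is interactive; if you ever want to script that same server-to-server step, a rough non-interactive equivalent is rsync over SSH (hypothetical host and paths):

```shell
# Pull the WP install from the primary server onto the secondary one.
# -a preserves permissions/timestamps, -z compresses in transit,
# --partial lets an interrupted transfer resume where it left off.
rsync -az --partial user@primary.example.com:/var/www/wordpress/ \
  ~/backups/wordpress-$(date +%F)/
```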
Depending on the circumstances, e.g. when I’m not able to pack the WP install into an archive on the server itself, I might run a direct download of the files using mc instead. Once that has gone through, I compress the file backup using tar + gzip, so it turns into the same kind of bundle one would normally want in the first place.
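The compression step is plain tar + gzip. A self-contained sketch, with a stub directory standing in for the downloaded WP files:

```shell
# Stub directory standing in for the downloaded WP install (hypothetical).
mkdir -p wp-backup/wp-content
echo "<?php // stub" > wp-backup/wp-config.php

# -c create an archive, -z gzip the stream, -f write to this file.
tar -czf wp-backup.tar.gz wp-backup

# Quick sanity check: list the archive's contents.
tar -tzf wp-backup.tar.gz
```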
Last, but not least, I copy the backup bundle to my local system, preferably via SSH / FISH (because it’s much faster than SFTP). That way there are automatically two backups available: if the secondary server dies, I still have the local copy, and vice versa.
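For that final copy to the local machine, scp over the same SSH connection works when mc/FISH isn’t handy (hypothetical host and paths):

```shell
# Fetch the finished bundle from the secondary server to the local disk.
scp user@secondary.example.com:backups/wp-backup.tar.gz ~/backups/
```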
It’s a technique I also use for installing new WP + CP sites (and just about any other freely available CMS), because server-to-server connections may utilize the full available bandwidth (e.g. 90-100 Mbit/s), while my own uplink may be just some measly 12 Mbit/s.
… indeed, this method stems from my days of meager DSL speed, i.e. just a small DSL 2000 line, thanks to the total disinterest of Germany’s main (monopoly) telecommunications provider, Deutsche Telekom, in installing decent data connections in the countryside. In big cities one may have nice upstreams, cable connections and even real fibre-optic uplinks, but when you live a bit further out, you quickly run into issues like this.