Backup and Replication

Some scripts to help with Backup and Replication.

Introduction

Many years ago I slowly and painfully learned to put all my user data on separate drives, away from the C: drive, so that when Windows corrupted the C: drive my user data would remain intact.

25 years ago I started dual-booting my PC, as Linux gave me many more options than Windows.

15 years ago I started my own websites based on Linux.

10 years ago I got so fed up with Windows failures that I migrated all my systems to Linux. As of 2023 everything runs Ubuntu except my desktop. As part of my keep-data-separate-from-the-system policy, my /home directory is a mount point. Unfortunately, recent Ubuntu releases use Snap, and Snap cannot cope with a mounted /home, so my desktop runs Debian, which doesn't use Snap.

Backup systems under Linux are extensive, highly functional, and free.

Desktop Backup

The first set of scripts backs up my desktop. My normal desktop has eight disks, both magnetic and SSD. Operating systems are kept on the SSDs for easy replacement; all my user data is on the magnetic drives. The six magnetic disks sit behind a HighPoint RAID controller: one group of three 2TB disks and one group of three 1TB disks, each group arranged as a mirrored pair plus a hot standby. That presents as one 2TB drive, one 1TB drive, and two 250GB SSDs. Yes, I'm paranoid, but I've worked in IT for 50 years.

They are here:

Website Backup

The second set of scripts backs up my websites. I have a web server in the garage based on an old Dell 2850. I also rent a server from Fasthosts, originally used as a mail-system backup.

For many years the Dell machine hosted just a simple read-only website of static HTML pages, which were trivial to back up.

But then I added a WordPress site and the world went crazy. You cannot just take a copy of a WordPress site, as most of the data is held in a MySQL database. So a while ago I wrote a backup script to keep a full copy of each WordPress website, including its database, on a NAS server.
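The core of such a backup looks roughly like this: dump the database with mysqldump, then archive the dump together with the site files onto the NAS. This is only a sketch of the idea, not the script itself; the paths, site name, and NAS mount point are assumptions you would adjust for your own setup.

```shell
#!/bin/bash
# Minimal WordPress backup sketch. All paths below are assumptions:
# the WordPress root, the NAS mount point, and the /tmp staging area.
set -euo pipefail

# Pull a define()'d value such as DB_NAME out of wp-config.php.
wp_config_value() {
    local key="$1" config="$2"
    grep "'${key}'" "$config" | cut -d "'" -f 4
}

# Dump the database, then archive it together with the site files.
backup_wordpress() {
    local wp_root="$1"              # e.g. /var/www/example.com (assumed)
    local nas="$2"                  # e.g. /mnt/nas/backups (assumed)
    local site stamp db user pass
    site="$(basename "$wp_root")"
    stamp="$(date +%Y%m%d-%H%M%S)"
    db="$(wp_config_value DB_NAME "${wp_root}/wp-config.php")"
    user="$(wp_config_value DB_USER "${wp_root}/wp-config.php")"
    pass="$(wp_config_value DB_PASSWORD "${wp_root}/wp-config.php")"

    # Database dump goes to a temporary file, then into the archive.
    mysqldump -u "$user" -p"$pass" "$db" > "/tmp/${site}-${stamp}.sql"
    tar -czf "${nas}/${site}-${stamp}.tar.gz" \
        -C "$(dirname "$wp_root")" "$(basename "$wp_root")" \
        -C /tmp "${site}-${stamp}.sql"
    rm "/tmp/${site}-${stamp}.sql"
}
```

Called as, say, `backup_wordpress /var/www/example.com /mnt/nas/backups`, typically from cron. Reading the credentials out of wp-config.php means the script needs no configuration of its own per site.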

Around that time Firefox started moaning about insecure sites, meaning those not using HTTPS. But to use HTTPS you need certificates. These are easy to get with Certbot, but as they are renewed on a regular basis they also need backing up.
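Since Certbot keeps certificates, private keys, and renewal configuration together under /etc/letsencrypt, backing them up can be as simple as archiving that whole tree. A minimal sketch, assuming the NAS is mounted locally:

```shell
#!/bin/bash
# Archive a Certbot tree (normally /etc/letsencrypt) to a backup directory.
# The -p flag preserves permissions, since the private keys must stay
# readable by root only. Destination path is an assumption.
set -euo pipefail

backup_certs() {
    local src="$1"              # normally /etc/letsencrypt
    local dest="$2"             # e.g. /mnt/nas/backups (assumed path)
    local stamp
    stamp="$(date +%Y%m%d)"
    tar -czpf "${dest}/letsencrypt-${stamp}.tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}
```

Restoring the tree onto another machine and re-running the web server is enough for Certbot to pick up where it left off, because the renewal state lives in the same directory.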

Of course I couldn't leave it at that, and figured that if I had a backup, I should be able to restore it onto a separate server to produce a replica of the original site. Then, if the original site goes down, I just switch the DNS records to point the domain at the replica on the other server. A poor man's redundancy system!
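The replication step itself boils down to shipping the latest backup archive to the standby server and unpacking it there. The sketch below is an illustration only: the host name, paths, and the RSYNC/SSH variables (overridable for a dry run) are all assumptions, and reloading the SQL dump and re-pointing the DNS remain manual steps.

```shell
#!/bin/bash
# Replication sketch: copy a backup archive to a standby server and
# unpack it into the web root there. Hosts and paths are illustrative.
set -euo pipefail
: "${RSYNC:=rsync}"     # override with a no-op command for a dry run
: "${SSH:=ssh}"

replicate_site() {
    local archive="$1"  # local backup, e.g. /mnt/nas/backups/site.tar.gz
    local remote="$2"   # standby server, e.g. backup@standby.example.com
    local webroot="$3"  # document root on the standby, e.g. /var/www

    # Copy the archive over, then unpack the site files on the standby.
    "$RSYNC" -az "$archive" "${remote}:/tmp/"
    "$SSH" "$remote" "tar -xzf '/tmp/$(basename "$archive")' -C '$webroot'"
}
```

After that, switching the domain's DNS A record to the standby's address is all it takes to fail over, at the cost of waiting out the record's TTL.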

Back in 2021 I moved house, so the Dell server in the garage would be offline for a while. I used an early version of this script to replicate all the websites from the Dell server onto the rented Fasthosts server, so on moving day I just re-pointed the DNS settings and took the Dell server down.

The details are here: