There have been a lot of changes in my personal "chaos manor" (perhaps the reader will remember the column of that name by the late Jerry Pournelle in Byte). Moving my office so that it is now adjacent to the workshop took a lot of work; indeed, I have yet to finish rearranging things. However, one of the first things I wanted to set up was a reasonable backup strategy... once again. In this post, I discuss an important part of that endeavour: backing up the home automation server.
Table of Contents
- Local Backup Network
- Pushing a Domoticz Database Backup
- Pulling a Domoticz Database Backup
- Bells and Whistles
- Conclusion
Local Backup Network
The following image shows a part of the hybrid home network which will be called the local backup network.

A Raspberry Pi, running the last version of Raspbian before Raspberry Pi OS was introduced, is the host system for Domoticz, the home automation system overseeing numerous IoT devices about the house. The home automation system is mostly wireless, running on the 2.4 GHz Wi-Fi band. As can be expected of a home automation controller, this Raspberry Pi is always running.
The backup server and my desktop are connected to the local area network over 1 Gb/s Ethernet. The backup server is not a very powerful system: it is a 3.2 GHz dual-core Pentium 4 surplus computer bought from my employer just before I retired, with 3 GB of memory and a 1 TB hard drive. It runs as a headless server with a fresh installation of Debian (kernel 4.19.118-2, 2020-04-29, x86_64). I was a bit surprised to find that the distribution was even more sparse than the "lite" versions of Raspbian and Armbian meant for small single-board computers like the Raspberry Pi, the La Frite and so on. The server is connected to mains power through a Wi-Fi switch, which means that it can be remotely turned on or off as desired. Most of the time it is off.
The desktop is an older consumer-grade i7-based machine from a major player. I have just replaced its two 1 TB hard drives with two 0.5 TB solid-state drives. There is an additional hard drive with three 1 TB partitions: one is dedicated to Timeshift system snapshots, the second is used for storing photographs, and the third holds a backup of important directories copied from the original two hard drives and the backup server that I have yet to install back on the SSDs. Like the backup server, that machine can be off at any particular moment.
The hardware modifications of the desktop were the impetus for rethinking my backup strategy. Here is what is planned at the moment.
- Already mentioned, there is Timeshift, which backs up the desktop system to the hard drive on the same machine. Currently, only the default settings are in place, and I am not certain that is appropriate. Timeshift was in place in the previous Ubuntu build for at least a year, but since I never used it to restore the system, I never verified whether the tweaks I had made to the settings were useful.
- A version control system, Mercurial, is in place on the desktop for source code management (including the source for this site). Using the VCS, I manually push the working repositories to bare repositories on the second SSD and on the backup server whenever a significant change is made.
- Syncthing is installed on all three systems.
- The complete source directory for the site (except for the version control directory) is pushed onto the backup server on a continuous basis. Only the source code and images needed to construct the Web site are under version control. There remain a considerable number of files containing reference material, posts in various states of preparation, extra images, and so on that are useful enough to warrant being backed up.
- Three directories on the Raspberry Pi are continuously synchronized with directories on the other two computers: the two system script directories and the web server root directory. This means that I can create a new version of the firmware for an IoT device in the Arduino IDE or PlatformIO on the desktop, save it to the synchronized directory on the desktop, and within a few minutes the firmware can be uploaded to the device from the Raspberry Pi.
- The directory containing my password database on the desktop is synchronized with directories on the other two devices on the backup network. At the same time, the directory is synchronized with versions on the tablets and the portable computer, so that I can add or change a password on any of the devices I use to connect to outside resources and the change is automatically propagated to all the other machines.
The Domoticz database used to be backed up to the other systems with Syncthing. However, I needed to rethink that approach because the database is updated very frequently, which resulted in frequent transfers of the complete database. While I never noticed a bandwidth problem, I would prefer something less obtrusive. Hence this post about sending compressed backup copies of the database at less frequent intervals from the Raspberry Pi to the other machines on the network.
Pushing a Domoticz Database Backup
It was not at all difficult to find a script to take care of what I wanted. The Domoticz wiki has a page about this very topic, Script to backup to FTP-server (only Domoticz database). Here is a slightly modified version. Besides using the SFTP protocol (file transfer over SSH) instead of the FTP protocol, I have added some error reporting to the system log.
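In outline, it looks something like this; the addresses, credentials and directory layout below are placeholders standing in for my actual values, and 8080 is assumed to be the Domoticz port.

```bash
#!/bin/bash
# ~/.local/bin/upload -- sketch of the modified wiki script;
# addresses, credentials and paths are placeholders

DOMO_IP="192.168.1.22"     # the Pi's LAN address (not 127.0.0.1, see below)
DOMO_PORT="8080"           # default Domoticz web interface port
SERVER="backserver.local"  # the backup server
USER="backup"              # account on the backup server
PASSWORD="secret"
BACKUPFILE="domoticz_$(date +%Y%m%d%H%M).db"

# ask the Domoticz web server for a copy of its database
curl -s "http://$DOMO_IP:$DOMO_PORT/backupdatabase.php" -o "/tmp/$BACKUPFILE"
gzip -9 "/tmp/$BACKUPFILE"

# push the compressed copy to the backup server over SFTP
curl -s -T "/tmp/$BACKUPFILE.gz" -u "$USER:$PASSWORD" \
  "sftp://$SERVER/home/$USER/backups/domoticz/"
status=$?

# report any failure to the system log
if [ $status -ne 0 ]; then
  logger "upload: Domoticz backup transfer failed, curl exit code $status"
fi

rm "/tmp/$BACKUPFILE.gz"
```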
The script, imaginatively named upload, is saved as ~/.local/bin/upload and made executable. Since ~/.local/bin is included in the PATH, it can be launched very simply.
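Setting the executable bit and running the script looks like this:

```bash
chmod +x ~/.local/bin/upload
upload
```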
While the database was slightly more than 1 MB, only 243 KB needed to be transferred to the backup server after compression.
When the backup server is not on line, curl fails with a connection error and the error code is logged.
The exit codes are listed at the end of the curl man page.
Look carefully at the transferred size. Initially, I was getting ridiculously small transfers, on the order of 111 bytes.
When this happened, I ran the actual curl command without redirecting the output to a file to see what was happening.
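In my case that meant running something like the following on the Raspberry Pi (again assuming the default port):

```bash
# query Domoticz by hand, letting the reply go to the terminal
curl "http://127.0.0.1:8080/backupdatabase.php"
```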
I could have looked at the downloaded file on backserver to find the same information. The point is that the Domoticz web server is reporting an authorization error. As it happens, password protection is enabled, and 127.0.0.1 is not in the list of local (no authorization required) networks. Consequently, Domoticz would not accept any request from 127.0.0.1 without the password. Similarly localhost, which gets resolved to 127.0.0.1, would not work.
Pulling a Domoticz Database Backup
There is a problem with the previous script. It will obviously fail if the backup server is not on line, which is often true in my home system. The solution is to have the backup server "pull" the Domoticz database. This is actually quite easy to do.
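A minimal sketch of such a pull script, run on the backup server, could look like this; the Pi's host name (raspi.local) and the directory layout are placeholders, and ~/backups/domoticz is assumed to exist.

```bash
#!/bin/sh
# pull the raw database from the Raspberry Pi, then compress it locally
BACKUPFILE=~/backups/domoticz/domoticz_$(date +%Y%m%d%H%M).db
scp woopi@raspi.local:domoticz/domoticz.db "$BACKUPFILE"
gzip -9 "$BACKUPFILE"
```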
There are two "problems" associated with this approach. The first is rather obvious: no backup will be done if the backup server is not turned on. I have some ideas about that, but let's kick that down the road. The other problem is that the rather big uncompressed database is sent over the LAN to the local backup server, and it is the latter that compresses the file. That's easy enough to fix: let the download script merely invoke the upload script on the Domoticz host machine.
The versatile SSH protocol makes it easy to execute a command on a remote machine. Here is an example where the uname command will be run on the Raspberry Pi from the backup server.
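Something along these lines, where raspi.local again stands in for the Pi's actual host name:

```bash
# run on the backup server; ssh asks for woopi's password
# before executing the command on the Pi
ssh woopi@raspi.local uname -a
```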
The good news is that uname was executed as desired. The bad news is that it was necessary to enter a password. This is not acceptable for a utility that should eventually be run automatically at regular intervals. Well, this is Linux, so someone has already encountered this situation and provided mere mortals like myself with a solution. Actually, I know of two solutions. The first I tried was installing sshpass.
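With sshpass installed, the password can be supplied on the command line (the password shown is of course a placeholder):

```bash
sudo apt install sshpass
# sshpass feeds the password to ssh, avoiding the interactive prompt
sshpass -p 'secret' ssh woopi@raspi.local uname -a
```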
Now that we know that this works, it's just a small step to apply this technique to launching the upload script on the Raspberry Pi.
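```bash
# run the backup script on the Raspberry Pi from the backup server
sshpass -p 'secret' ssh woopi@raspi.local /home/woopi/.local/bin/upload
```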
There is an even easier solution, although I have been avoiding it for quite some time: just set up an SSH key pair. In addition to the usual username/password method of establishing the identity of a user trying to connect, SSH supports public-key (also called asymmetric) cryptography. Contrary to what I thought, this is a simple two-step procedure. The first step is to generate a public and a private key on the backup server.
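```bash
# on the backup server; accept the default file names,
# and an empty passphrase for unattended use
ssh-keygen
```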
The next step is to send the generated public key over to the Raspberry Pi.
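```bash
# append the public key to ~/.ssh/authorized_keys on the Raspberry Pi
ssh-copy-id woopi@raspi.local
```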
As suggested, let's try to execute a remote command.
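```bash
# no password prompt this time; the key pair handles authentication
ssh woopi@raspi.local uname -a
```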
Since that worked, it is not surprising that it is possible to download the Domoticz database by remotely invoking the Raspberry Pi upload script without needing to supply any password.
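```bash
# note the full path to the script (see below)
ssh woopi@raspi.local /home/woopi/.local/bin/upload
```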
When using ssh to remotely execute the upload script on the Raspberry Pi, it is necessary to specify the complete path to the script. This is because /home/woopi/.local/bin is added to the PATH environment variable by the .profile configuration file when the user woopi logs in, but there is no login in this situation.
Bells and Whistles
If the backup server were always on, I would not bother with "pulling" the database to back it up as shown in the last section. Instead, I would simply set up a regularly scheduled cron job on the Raspberry Pi to push the database to the backup server. Here is an example.
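```bash
# on the Raspberry Pi, edit the user's crontab
crontab -e
```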
Add the following line, adjusting the time as desired.
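Given where upload was saved, the line would be along these lines:

```
30 6,18 * * * /home/woopi/.local/bin/upload
```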
As set up above, upload will be executed twice each day, at 6:30 and 18:30. This is a much lower frequency than the hourly backups done by Domoticz itself when automatic backups are enabled (Setup → Settings → System). However, currently not many changes are being made to the home automation system, so only device logs are likely to be lost in 12 hours, which in my case is not that important. In any case, changing the frequency of the backups is easily done.
As I have explained, the backup server is mostly off line, and when it is turned on, usually to push changes to a local VCS repository, it does not stay on line for very long. So I have adopted a different approach: I created a cron task on the backup server that is executed each time it is booted. In essence, the Domoticz database will be backed up each time I back up some source code.
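Editing the crontab on the backup server is done the same way:

```bash
# on the backup server, edit the user's crontab
crontab -e
```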
Since this was the first time the crontab file was edited on the backup server, the editor to use had to be specified. There were only two choices offered on that lean Debian system but, thankfully, nano was one of them. I continue to stubbornly avoid learning vim.
I added the following line to the file.
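Assuming saveddb was saved in the backup account's ~/.local/bin, the line looks like this:

```
# run saveddb four minutes after each boot
@reboot sleep 240 && /home/backup/.local/bin/saveddb
```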
Each time the machine is booted, the saveddb (for "save Domoticz database") script will be executed after a four-minute delay. This local script was added to provide extra functionality: I don't want database backups to accumulate, so any backup file more than a month old is deleted before downloading the current Domoticz database. Since I am a bit paranoid, older backups are only deleted if at least one backup database (necessarily younger than a month) will remain in the directory.
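Here is a sketch of saveddb, using the same placeholder host name and directory layout as above:

```bash
#!/bin/sh
# saveddb ("save Domoticz database") -- run at boot on the backup server

BACKUP_DIR="$HOME/backups/domoticz"

# count the compressed backups younger than 30 days
recent=$(find "$BACKUP_DIR" -name 'domoticz_*.db.gz' -mtime -30 | wc -l)

# prune month-old backups only if at least one younger backup remains
if [ "$recent" -ge 1 ]; then
  find "$BACKUP_DIR" -name 'domoticz_*.db.gz' -mtime +30 -delete
fi

# have the Raspberry Pi push a fresh compressed copy of its database here
ssh woopi@raspi.local /home/woopi/.local/bin/upload
```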
Wait, there's more... Being the forgetful kind, I decided to add an e-mail warning if more than 15 days have elapsed since the last backup of the Domoticz database. There is not much to it. Basically, a time stamp is created each time the database is backed up with the upload script. Since the current time is already being used in that script, only one line had to be added at the end of it.
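Something like this:

```bash
# record the time of the latest backup
date > ~/domoticz/backup.stamp
```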
I could just as easily have put touch ~/domoticz/backup.stamp, as the content of the file is never used. Here is the Python script that checks how old the time stamp is and sends an e-mail if necessary.
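In outline it looks like this; the stamp path matches the line added above, but the way pymail is invoked (its -s and -m flags) is schematic rather than the actual interface.

```python
#!/home/woopi/.syspy/bin/python3
# checkbackup -- e-mails a warning when the Domoticz database
# has not been backed up for more than 15 days

import os
import subprocess
import time

STAMP = "/home/woopi/domoticz/backup.stamp"
PYMAIL = "/home/woopi/.syspy/pymail"   # mail script from the watchdog post
MAX_AGE = 15 * 24 * 3600               # fifteen days, in seconds

def main():
    try:
        age = time.time() - os.path.getmtime(STAMP)
    except OSError:
        # a missing stamp means no backup was ever recorded
        age = MAX_AGE + 1

    if age > MAX_AGE:
        days = int(age // 86400)
        # the -s (subject) and -m (message) flags are illustrative
        subprocess.run([PYMAIL, "-s", "Domoticz backup overdue",
                        "-m", f"Last database backup was {days} days ago."])

if __name__ == "__main__":
    main()
```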
The Python script, named checkbackup, was saved in the same ~/.syspy Python 3 virtual environment as the pymail script which it uses (see Raspberry Pi and Domoticz Watchdog for more details). The last bit needed to make this functional is to execute the script at regular intervals with cron. Don't forget to make the script executable beforehand.
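Assuming the script sits directly in ~/.syspy:

```bash
chmod +x ~/.syspy/checkbackup
```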
The following line is added to the crontab file.
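Again, the path assumes the script's location in ~/.syspy:

```
# every morning at 7:15
15 7 * * * /home/woopi/.syspy/checkbackup
```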
Each morning, at 7:15, I will receive an email if a backup has not been done in the last 15 days.
Conclusion
This is not really a conclusion because doing proper backups is an ongoing problem which warrants new approaches every now and then. The next step in that never-ending quest is off-site backups. Perhaps if I come up with a novel solution, I will present it in a future post.
There are many ways to achieve my goal. I am sure that it would be possible to obtain pretty much the same result as described above with Syncthing, given the appropriate settings. I may go back to that approach later. Indeed, I may try both simultaneously for a while. Another possibility is the venerable rsync.
