While the thread started as an... oops... I left some text highlighted and had a fumble-finger moment that was then compounded by the bad copy of the file being saved over the top of the good 'backup' of the file. In addition, I just read that aroslav lost a bunch of work due to a liquid incident with his writing implement of choice. I have been following this thread with great interest.
While I am not a writer of erotic fiction, some have said my process documents are works of fiction, horror to be exact. However, I would still like to offer some suggestions if I may. The focus here is on 'home' usage, not enterprise, and a single machine, not the whole home. Given this focus, it seems to me that there are some clear requirements concerning backing up copies of a creator's work.
First, it cannot be too difficult, either in setup or usage. Second, it must be inexpensive, as not all have the same means. Third, it must be cross-platform and writing-tool agnostic. Fourth, it must be automated. Fifth, it must support encryption. Finally, it must be bandwidth friendly.
One other thing: best practice says you should have 3 backup copies of the data, one local, one remote, and one off-site. I'll not tackle the best practices for how many 'versions' of said backups should be kept; that is a whole topic in and of itself that is not needed in this context.
OK... so here we go...
I have seen both Arq and Duplicati mentioned here. Both are excellent choices for this scenario. They are cross-platform, they both run automatically on a schedule, they both support deduplication, they both support local, remote, and off-site backups, and they both encrypt. I suggest you choose one or the other.
A very, very simple explanation of dedup is that it is a technology by which any given 'chunk' of data is saved only once, even if it is used in multiple versions of a backup. This means that the tools will only 'back up' new chunks, which makes them very bandwidth and space efficient. The big bandwidth hit is the initial backup, where everything must be copied to the destination the first time.
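If you want to see the idea in miniature, here is a toy Python sketch of hash-based dedup. It uses fixed-size chunks and an in-memory dict purely for illustration; real tools like Arq and Duplicati use far more sophisticated chunking and on-disk/cloud storage:

```python
import hashlib

CHUNK_SIZE = 4  # tiny chunks for illustration; real tools use KB- to MB-sized chunks

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into chunks, store each unique chunk once, return the 'recipe'."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:        # only NEW chunks cost space and bandwidth
            store[digest] = chunk
        recipe.append(digest)          # every version keeps its own recipe
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Rebuild the original data from its recipe of chunk hashes."""
    return b"".join(store[d] for d in recipe)

store = {}
v1 = dedup_store(b"draft one text", store)
v2 = dedup_store(b"draft two text", store)   # chunks shared with v1 are stored once
assert restore(v1, store) == b"draft one text"
assert restore(v2, store) == b"draft two text"
```

Two 14-byte 'versions' would naively need 8 chunks of storage, but because the versions share some chunks, only 6 unique chunks are actually kept, and only the new chunks of the second version would cross the wire.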
Arq is paid software and Duplicati is open source. Arq costs $50 US as a one-time fee, but that also provides a known level of support. For an extra $7 US a month you can get 1 TB of cloud storage from Arq. Duplicati is free, but support is community based. I use a third-party cloud service for off-site storage, not Arq's. I use both Arq V5 and Duplicati 2.0.4-Canary and both work as expected.
For ease of use, I would actually choose Duplicati, but both provide a rich UI for interacting with the tool. Arq is its own application while Duplicati is browser based.
Both allow you to choose what you want to back up and both provide the ability to create multiple data sets. So you can choose to back up all of your documents folder or some subset of it. You can choose to back up documents and photos folders, or any combination of folders and sub-folders really.
Both have a built-in scheduler that will allow you to specify a recurring time, a specific time, or both for a backup run. Both allow you to choose how many copies you keep, limit the bandwidth you use, and use any of the major cloud environments, or none at all.
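Retention, the "how many copies you keep" part, boils down to pruning old backup versions. Here is a toy keep-the-last-N sketch of the policy, not how either tool actually implements it:

```python
from datetime import datetime, timedelta

def prune(backup_times: list, keep_last: int):
    """Return (kept, deleted) backup timestamps under a keep-the-last-N policy."""
    ordered = sorted(backup_times, reverse=True)   # newest first
    return ordered[:keep_last], ordered[keep_last:]

# Simulate six hourly backups; the times are made up for the example.
now = datetime(2020, 1, 10, 12, 0)
backups = [now - timedelta(hours=h) for h in range(6)]
kept, deleted = prune(backups, keep_last=4)       # keep the 4 newest runs
```

Real tools also offer fancier rules (e.g. thin out older backups to daily or weekly copies), but the principle is the same: the newest versions survive and the oldest get reclaimed.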
The basic scenario I follow is a single backup set, a backup every hour, and keeping at least one copy of any given file. I do not do a local backup, only a remote and an off-site. I use a NAS (remote) and Backblaze (off-site). I use a NAS because it sits on my home network, but for you laptop road warriors, a USB drive, either spinning rust or SSD, can stand in for the NAS. Do not use a USB key, dongle, or stick; they are very, very prone to failure when used continuously. I have email alerts set to send on error only.
You can use any supported cloud vendor for your off-site copy; Arq and Duplicati each provide a list of supported vendors. I use Backblaze B2, think 'cold storage', rather than hot storage like OneDrive or Google Drive. Why? Because this is backup, not an extension of my local drive storage to the cloud, so I don't need the 'speed' of a OneDrive for backups. Backblaze B2 storage costs me $1.70 US per month for 250 GB of storage. I was up to $5.50 US for a TB at one point in 2019. B2 is usage based, so you pay for what you use rather than a chunk of space you may or may not use. I am sure you can find other cloud vendors with similar storage; Amazon Glacier comes to mind.
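For the curious, usage-based billing is just simple arithmetic. A quick Python check of the figures above; the bill amounts are from this post, rates change over time, so treat them as snapshots rather than current pricing:

```python
def per_gb_rate(monthly_cost: float, gb_stored: float) -> float:
    """Back out the effective per-GB monthly rate from a usage-based bill."""
    return monthly_cost / gb_stored

# $1.70/month for 250 GB, per the figures in this post.
rate = per_gb_rate(1.70, 250)

# Projecting that same rate out to a full terabyte of storage.
tb_projection = rate * 1000
print(f"effective rate ${rate:.4f}/GB/month, 1 TB would run ${tb_projection:.2f}/month")
```

That works out to well under a penny per GB per month, which is why cold storage is such a good fit for backups you hope to rarely touch.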
As a generalized guide, use a cross-platform tool like Arq or Duplicati that can dedup and keep multiple 'copies' of the backup. Use a backup tool that is automated so you never have to 'remember' to compress and copy. Use a USB drive or other attached storage for 'remote' backups and cold storage in the cloud for off-site. Don't get me wrong, you can use platform-specific tooling, like Apple's Time Machine, to great effect, but platform specific is platform specific. ;-)
Finally, backups are only half the battle; you must also be able to restore from your backup. It does you no good to back up and then not be able to retrieve that data. As a general rule, I would run a verify at least every 3 months or so if there have been no errors reported during backups, and a restore of something every 6 months. As always, do as I say, not as I do. :-)
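One easy way to spot-check a test restore is to compare checksums of the original file and the restored copy. A small Python sketch; the file names and contents are made up for the example:

```python
import hashlib
import pathlib
import tempfile

def sha256_of(path) -> str:
    """Hash a file in blocks so large files don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

# Simulate an original file and the copy pulled back from the backup.
tmp = pathlib.Path(tempfile.mkdtemp())
original = tmp / "chapter1.txt"
restored = tmp / "chapter1.restored.txt"
original.write_text("It was a dark and stormy night...")
restored.write_text("It was a dark and stormy night...")

assert sha256_of(original) == sha256_of(restored), "restored copy does not match!"
```

Matching hashes mean the restored bytes are identical to the original, which is exactly the assurance a periodic test restore is supposed to give you.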
Apologies if I have overstepped with this note. And I hope someone will find it helpful.
Later
Damoose