It only takes a minute of your time to copy your important files to a drive or the cloud. I (potentially) lost one year of progress on a book I’m writing because of my negligence.
So please don’t be like me.
For writing, I also recommend making frequent versioned copies. For example, after a certain number of pages or amount of time spent, I copy the file itself and rename it <filename>_backup<date>. That protects against corruption that hits part of the file without being noticed, and it gives you the option of rolling back to a previous iteration, or at least looking at one. It can clutter things up a little, but if you like you can put the backups in their own folder somewhere. This is obviously no substitute for backing up to another device, though, since it doesn’t protect you against your storage device failing, suffering corruption, malware, etc., so doing that is more important.
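If you want to automate the copy-and-rename step, here’s a minimal Python sketch of the idea (the file name and the "backups" folder are just placeholders):

```python
import shutil
from datetime import date
from pathlib import Path

def versioned_copy(path: str, backup_dir: str = "backups") -> Path:
    """Copy the file into backup_dir as <filename>_backup<date>."""
    src = Path(path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(exist_ok=True)  # keep the clutter in one folder
    dest = dest_dir / f"{src.stem}_backup{date.today().isoformat()}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 also preserves timestamps
    return dest

versioned_copy("mybook.odt")  # e.g. backups/mybook_backup2025-01-15.odt
```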
Sorry this happened to you OP.
I’ve never really thought about it outside the context of code, but I wonder if Git would be handy for this. Plain text and Markdown would work perfectly, though there might be better version control systems for things like Office/LibreOffice documents.
Many backup apps and scripts create differential backups, so even Office documents, and whatever else doesn’t play nicely with Git, still get backed up an additional time whenever a change is detected in the file.
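As a rough sketch of the change-detection part (real tools also check modification times, and a true differential backup stores only the changes since the last full backup rather than a whole copy; the file names here are made up):

```python
import hashlib
import shutil
from pathlib import Path

def file_hash(path: Path) -> str:
    """SHA-256 of the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_if_changed(src: Path, backup_dir: Path) -> bool:
    """Copy src into backup_dir only if its contents changed since the last run."""
    backup_dir.mkdir(exist_ok=True)
    marker = backup_dir / (src.name + ".sha256")
    current = file_hash(src)
    if marker.exists() and marker.read_text() == current:
        return False  # no change detected, nothing to do
    shutil.copy2(src, backup_dir / src.name)
    marker.write_text(current)
    return True

backup_if_changed(Path("report.odt"), Path("backups"))
```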
For smaller binary files it’s fine; I use it for my desktop wallpapers. For large binary files there are a bunch of extensions out there, like git-lfs or git-annex. Vanilla git can handle binary files up to a few GB IIRC, but it gets unusably slow when you throw multiple large binary files at it.