I agree.
For many non-critical files, where I am making small, incremental changes, I often do not make a fresh backup copy, especially because I have off-platform backups as well; I just edit the file directly and save it, as normal.
Like zxmaus, I cannot recall ever losing a file due to a system crash while editing it, in over four decades of working with computers.
However, I do recall making a lot of "simple human mistakes", and I have been "saved by backups" many times. This leads me to always recommend that people make and maintain filesystem backups, based on their risk management model (criticality, vulnerability, threats).
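For anyone who wants the cheapest possible habit here, a timestamped copy before you touch the file covers most "simple human mistakes". This is just a sketch; the path and naming convention are my own illustration:

```bash
# Make a dated backup copy before editing (the config path is hypothetical);
# -p preserves ownership, permissions, and timestamps on the copy
cp -p /etc/nginx/nginx.conf /etc/nginx/nginx.conf.bak.$(date +%Y%m%d-%H%M%S)

# Then edit the live file as usual
vi /etc/nginx/nginx.conf
```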
These days, more often than not, for an increasing majority of my file edits, if they are significant, I will sftp the file to my desktop, open it in Visual Studio Code (or cut-and-paste into VSC if it is a small file), edit the file using all the available syntax and formatting tools and plugins, and save the edited file under a different name, preserving the original in my working directory on my desktop. Then I will either sftp the edited file back or cut-and-paste it into the remote server over an ssh terminal.
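In practice the round trip looks something like the following. The hostname, paths, and filenames are all hypothetical, just to show the shape of the workflow:

```bash
# Pull the file down from the remote server to a local working directory
sftp admin@server.example.com:/etc/haproxy/haproxy.cfg ~/work/

# Edit locally in VSC (assumes the "code" CLI helper is installed);
# save the result as haproxy.cfg.new so the original stays untouched
code ~/work/haproxy.cfg

# Push the edited copy back up; "-b -" reads sftp batch commands from stdin
cd ~/work
echo 'put haproxy.cfg.new /etc/haproxy/haproxy.cfg' | sftp -b - admin@server.example.com
```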
I cannot count the number of times VSC has spotted a syntax error that slipped past my tired, overworked eyes. The formatting features (indentation, consistent style, etc.) are also very useful in VSC. These kinds of tools are real time savers, especially for syntax checking.
It goes without saying, I use vi every day to edit files; but more and more I also use vi in conjunction with VSC, for the syntax checking and formatting of code (programming languages), JSON files, and so on. And I am quick to confess that I do sometimes edit files with vi without making a backup copy; not often, but if it is some small change which I can easily revert from "memory", then I am guilty. I also push files to private Git repositories when my work on critical files is done.
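The Git side of this is nothing fancy; something along these lines, where the repository URL and filenames are purely illustrative:

```bash
# One-time setup: track the working directory in a private repo
cd ~/work
git init
git branch -M main
git remote add origin git@github.com:myaccount/server-configs.git  # hypothetical repo

# After each significant edit, commit and push as an off-box backup
git add haproxy.cfg
git commit -m "Tune haproxy timeouts"
git push -u origin main
```

Even a bare-bones history like this gives you `git diff` and `git log` when you need to answer "what did I change last week?"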
Git is good for backups.