I have a delta backup that runs several times a day. The intent is to use it as a versioned backup of sorts, in other words a time-capsule backup where I can restore files as they were at a given date/time. My problem is that if a backup runs once with a file in one form and that file is then modified and overwritten, the next time the backup runs it fails because the recovered file doesn't match its reference hash when Areca does the check. I have un-checked "abort on failure" in pre-processing, but the backup is still deleted if the check fails. Is this by design, or am I missing something?
Errors and Warnings :
13-05-22 11:00 - WARNING - 0370304.asc was not properly recovered : its hash (3323c26d51117e6d9aa48841f9a824ec30e7dc2b) is different from the reference hash (d8b4dde1fb3a234f7165560dc475c2804c2dfee2). Simulation flag set to ON - file : C:\Users\MMCDOU~1\AppData\Local\Temp\chk5\0370304.asc
13-05-22 11:00 - ERROR - The created archive - X:/Z Drive Hourly Backup/1872440239/05 - 22 - 2013_8.zip (X:/Z Drive Hourly Backup/1872440239/05 - 22 - 2013_8) - was not successfully checked. It will be deleted.
Last edit: Mark 2013-05-22
This is by design. Backups are atomic: when any part of the transaction fails, the whole transaction is rolled back.
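To illustrate the verify-then-roll-back behaviour, here is a minimal Python sketch of that kind of atomic check. This is not Areca's actual code, just the general pattern: record a reference hash for each file at archiving time, recover the files into a temp directory, and delete the whole archive if any recovered file's hash differs.

```python
import hashlib
import os
import tempfile
import zipfile

def sha1_of(path):
    """Return the SHA-1 hex digest of a file's contents."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_atomically(sources, archive_path):
    """Archive files, then verify each recovered entry against the
    hash recorded at archiving time; delete the archive on any
    mismatch (all-or-nothing, like an aborted transaction)."""
    reference = {}
    with zipfile.ZipFile(archive_path, "w") as zf:
        for src in sources:
            name = os.path.basename(src)  # sketch: assumes unique names
            reference[name] = sha1_of(src)
            zf.write(src, name)

    # Verification pass: extract to a temp dir and re-hash.
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(archive_path) as zf:
            zf.extractall(tmp)
        for name, ref_hash in reference.items():
            if sha1_of(os.path.join(tmp, name)) != ref_hash:
                os.remove(archive_path)  # roll back the whole archive
                return False
    return True
```

Note that if a source file is rewritten between the moment its hash is taken and the moment the archive is checked, the comparison fails and the whole archive is discarded, which matches the log output above.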
OK, but that confuses me. Isn't the very definition of a delta backup to save only the changed parts of files? So if a file changes between backups, why should the check fail?
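For what it's worth, "save only the changed parts" usually means something like the following block-level sketch (illustrative Python, not Areca's implementation): the delta stores only the blocks that differ from the previous version, and the full file is rebuilt by applying the delta on top of the old version.

```python
def block_delta(old, new, block_size=4):
    """Return the (offset, bytes) blocks of `new` that differ from
    `old` -- the 'changed parts' a delta backup would store."""
    delta = []
    for off in range(0, len(new), block_size):
        new_block = new[off:off + block_size]
        if old[off:off + block_size] != new_block:
            delta.append((off, new_block))
    return delta

def apply_delta(old, delta, new_len):
    """Rebuild the new version from the old version plus the delta."""
    buf = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for off, block in delta:
        buf[off:off + len(block)] = block
    return bytes(buf)
```

A changed file is therefore expected by the delta mechanism itself; the failure in the log above comes from the separate integrity check on the created archive, not from the delta computation.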