Trying to back up a directory to a 7-Zip archive, updating only the files that have changed in the archive.
Tell me what I'm doing wrong here:
C:\7za\7za.exe -tzip -ms=off -wE:\ArchiveLocation\ -up2q0r2x2y2z0w2!ArchiveName.7z D:\DirToBeBackedUp\
I'm clearly screwing this up horribly. The objective is to make the backup as quick and efficient as possible. It's a giant folder with about 25 gigs of script files, text files, spreadsheets, etc. Files range in size from about 2 bytes to 4 gigabytes, about 30k files in all.
As you can see, I'm turning solid archiving off because that's apparently better for updates. I'm setting the working directory to the archive location so the temp file and the archive are in the same place. I'm trying to use the update conditions, but I'm pretty sure that part is screwed up. And then finally it specifies the directory I want backed up.
Thank you in advance for any help. I wish 7-Zip gave better command-line feedback about why it wasn't happy.
I don't understand your question.
-tzip together with an archive named ArchiveName.7z is already a conflict.
Also, you must specify a command:
C:\7za\7za.exe a -ms=off E:\ArchiveLocation\archivename.7z D:\DirToBeBackedUp\
This works no problem. What I want is to update the resulting archive and only update the files that have changed. As I said, there are tens of thousands of files and about 25 gigs of data. But very little of it changes on a daily basis. The only way a backup is efficient is if only the files that change are updated while the rest of the archive is left alone.
Can you help me with the syntax please?
C:\7za\7za.exe u -ms=off E:\ArchiveLocation\archivename.7z D:\DirToBeBackedUp\
Can you confirm that this will only replace older files, and not every single file in the archive even if it's identical?
My concern here is that this is only viable if we're only backing up changed files instead of every single file in the directory every single time.
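From my reading of the 7-Zip help, the `u` command's behaviour can be spelled out with the `-u` switch, where each letter is a file state and each digit an action (0 = ignore, 1 = copy from the old archive, 2 = compress from disk). This is only my understanding of the documented defaults, so please correct me if I have it wrong:

```bat
:: q1  keep files that were deleted from disk (q0 would drop them)
:: r2  compress files that are new on disk
:: x1  keep the archive copy when it is newer than the disk copy
:: y2  recompress files that are newer on disk
:: z1  copy identical files straight from the old archive
:: w2  recompress when it cannot be determined which copy is newer
C:\7za\7za.exe u -ms=off -uq1r2x1y2z1w2 E:\ArchiveLocation\archivename.7z D:\DirToBeBackedUp\
```

If that really is the default, then plain `u` should already do what I want.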
Ideally, I'd like to run this script a few times a day and have the pathing determined by a grandfather-father-son type rotation.
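A minimal sketch of what I mean by the GFS pathing (the folder names are placeholders, and the %DATE% substring offsets depend on the system locale, so they'd need adjusting):

```bat
@echo off
:: Hypothetical GFS path selection. Assumes %DATE% looks like
:: "Sun 02/01/2025" -- day name at offset 0, day of month at offset 7.
set BASE=E:\ArchiveLocation
set DOW=%DATE:~0,3%
set DOM=%DATE:~7,2%

:: Grandfather on the 1st of the month, father on Sundays, son otherwise.
if "%DOM%"=="01" (
    set DEST=%BASE%\grandfather
) else if "%DOW%"=="Sun" (
    set DEST=%BASE%\father
) else (
    set DEST=%BASE%\son
)

C:\7za\7za.exe u -ms=off %DEST%\archivename.7z D:\DirToBeBackedUp\
```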
I have written all of that already and I have a great backup script already that works using Robocopy.
It's seriously a really great backup script. It's better than Symantec Backup Exec or any of the other very expensive backup programs I've played with... Sure, the script only works in user mode, but that doesn't matter since the server always logs into the admin account anyway. I set the script to run at intervals.
The only problem with the script is that the files are uncompressed which means they gobble up 25 gigs with every iteration. I'm okay with that if I must but it feels sloppy.
So rather than use Robocopy, I'm trying to use 7-Zip.
At some point I'll build shadowcopy into the script and possibly some sort of block level compare system. And then we'll see about using one of the torrent based P2P syncing systems to push backups between offices.
Anyway, just explaining what I'm doing here. I make monster batch files. Basically crude programming in batch script. My last batch file was about 500k. Just endless decision trees, variable processing, etc.
7-Zip creates a new archive (with a temporary name), copies the unchanged data into it, writes the changed data, then renames the temp file to the main file.
So this would not be suitable for a network update... that's unfortunate. Well, the good news is that the resulting file isn't that big, so I can just eat it, but obviously it would be nice if updates could happen without creating a new archive.
Guess I'll just have the batch file create the archive on a local drive and then push the resulting archive to whichever location is right via GFS.
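Something along these lines is what I have in mind (the stage and share paths are placeholders; Robocopy skips the copy when the file hasn't changed, and /Z makes the transfer restartable):

```bat
@echo off
:: Placeholder paths -- update the archive on a local drive first,
:: then push the finished file to the network share.
set STAGE=C:\BackupStage
set REMOTE=\\server\share\ArchiveLocation

C:\7za\7za.exe u -ms=off %STAGE%\archivename.7z D:\DirToBeBackedUp\

:: 7za returns 0 on success; only push the archive if the update worked.
if errorlevel 1 goto :eof
robocopy %STAGE% %REMOTE% archivename.7z /Z
```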
I didn't find a button on your site (to Igor) to donate or anything like that. I don't intend to put a million bucks there, but I could make a donation of 10-15 bucks (since I use your program), and I think such a button wouldn't be excessive on the product site.
P.S. Don't take this the wrong way - nothing insulting meant. Thanks.