With only 2 GB of RAM in my system, I am unable to open my 214 GB archive file with Ark in order to restore files from it.
Have you considered having your bkup script automatically split the archive files into chunks? Perhaps a user-settable size in the configuration program?
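For what it's worth, something along those lines can be done by hand with the standard split utility; a rough sketch, assuming a gzipped tar archive, GNU split, and made-up filenames:

    # Split the archive into 1 GB pieces: backup.tar.gz.aa, .ab, ...
    split -b 1G backup.tar.gz backup.tar.gz.

    # To restore, stream the pieces back together and extract,
    # without rebuilding the full 214 GB file on disk:
    cat backup.tar.gz.* | tar xzf -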
Thanks,
Ken
As I just posted in a different thread, SLB was really not meant to back up that much data. As the website (http://simplelinuxbkup.sourceforge.net/) says, it has not been tested with very large backups.
That said, don't use Ark to open your archive files; in my experience, it doesn't work for large archives. Try GNOME's archive manager (also known as File Roller) instead; I don't know whether it will handle a 214 GB file, but it has a better shot. Better yet, use the command line: "tar tzf <archive-filename>" lists all the files in the archive, and "tar xzf <archive-filename> <filename-to-extract>" extracts a single file from it.
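For example (the archive and file names below are just placeholders):

    # List everything in the archive without extracting anything:
    tar tzf backup-full.tar.gz

    # Pull out a single file, recreating its path under the
    # current directory:
    tar xzf backup-full.tar.gz home/ken/Documents/report.odt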
One more point: on a file that huge, give any tool plenty of time to open it. The archive is compressed with gzip (that's why it has a ".gz" extension), and uncompressing something that big takes a long time.
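If you have the disk space to spare, one way to pay the gzip cost only once is to decompress the archive to a plain .tar up front and work from that; a sketch, with made-up filenames:

    # Decompress once; leaves the original .gz untouched, but needs
    # room for the full uncompressed archive:
    zcat backup.tar.gz > backup.tar

    # Later listings and extractions then skip the gzip step:
    tar tf backup.tar
    tar xf backup.tar home/ken/somefile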
Thanks, Steve.
I gave Ark the whole day to open the file. When I got home from work, there was a message that it had failed.
I'm experimenting with rsync as well, which may end up being a better solution because of the amount of data involved.
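For anyone curious, this is roughly the kind of rsync invocation I'm experimenting with (the paths are just examples):

    # Mirror the source tree to the backup drive: -a preserves
    # permissions/ownership/timestamps, -v lists files as they go,
    # and --delete drops files that no longer exist on the source.
    rsync -av --delete /home/ken/ /mnt/backup/home/ken/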
Anyway, thanks for your reply, and good luck.