Splitting the archive files?

Help
Beckfield
2009-02-19
2013-04-11
  • Beckfield
    2009-02-19

    With 2 GB of RAM in my system, I am unable to open my 214 GB archive file with Ark in order to restore files from it.

    Have you considered having your bkup script automatically split the archive files into chunks?  Perhaps a user-settable size in the configuration program?
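    The kind of splitting being requested can be approximated by hand with the standard `split` utility until something like it lands in the script; the filenames and chunk size below are purely illustrative, and a small dummy file stands in for a real archive:

    ```shell
    # Stand-in for a real backup archive (illustrative name and size: 1 MiB).
    head -c 1048576 /dev/urandom > dummy.tar.gz

    # Split into 262144-byte (256 KB) chunks named dummy.tar.gz.part.aa, .ab, ...
    split -b 262144 dummy.tar.gz dummy.tar.gz.part.

    # Reassembling the chunks with cat restores the original byte stream exactly.
    cat dummy.tar.gz.part.* > rejoined.tar.gz
    cmp dummy.tar.gz rejoined.tar.gz && echo "identical"
    ```

    Because `split` names the chunks in lexical order, the shell glob `dummy.tar.gz.part.*` reassembles them in the right sequence.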

    Thanks,
    Ken

    • Steve Rosen
      2009-02-20

      As I just posted in a different thread, SLB was really not meant to back up that much data.  As the website (http://simplelinuxbkup.sourceforge.net/) says, it has not been tested with very large backups.

      That said, don't use Ark to open your archive files; in my experience, it doesn't work for large archives.  Instead, try Gnome's archive manager (also known as File Roller), but I don't know if it'll work on a 214 GB file.  It has a better shot though.  Or better yet, use the command line: "tar tzf <archive-filename>" will list all the files in the archive, and "tar xzf <archive-filename> <filename-to-extract>" will extract a single file from the archive.
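      The two tar commands above can be tried out safely on a small throwaway archive first; the file and directory names here are made up for the demo:

      ```shell
      # Build a small test archive (names are illustrative).
      mkdir -p demo
      echo "hello" > demo/a.txt
      echo "world" > demo/b.txt
      tar czf demo.tar.gz demo

      # "t" lists the archive members without extracting anything.
      tar tzf demo.tar.gz

      # "x" with a member name extracts just that one file; -C puts it
      # under a separate directory instead of overwriting the original.
      mkdir -p restore
      tar xzf demo.tar.gz -C restore demo/a.txt
      cat restore/demo/a.txt   # prints "hello"
      ```

      On a real 214 GB archive the same commands apply, but listing and extracting still have to decompress the gzip stream from the beginning, so expect them to take a long time.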

    • Steve Rosen
      2009-02-20

      One more point: On a file that huge, give any tool plenty of time to open the file.  The archive file is compressed using gzip (that's why it has a ".gz" extension), and uncompressing something that big takes time.

    • Beckfield
      2009-02-26

      Thanks, Steve.

      I gave Ark the whole day to open the file.  When I got home from work, there was a message that it had failed.

      I'm experimenting with rsync as well, which may end up being a better solution because of the amount of data involved.

      Anyway, thanks for your reply, and good luck.