around 100, so no, that's not really feasible. A mechanism to create
multiple backups with a kind of preprocessor could help here...
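something like this loop could serve as that preprocessor - one tar.gz per first-level subdir, so a single user's directory can be restored without fetching the whole archive. just a rough sketch; the function name and paths are placeholders, and the FTP upload step would still have to be added separately:

```shell
#!/bin/sh
# split_backup: create one tar.gz per first-level subdirectory of $1 in $2.
# Sketch only - adjust paths, and upload the resulting files however you like.
split_backup() {
    src=$1
    dest=$2
    mkdir -p "$dest"
    for dir in "$src"/*/; do
        name=$(basename "$dir")
        # -C changes into $src first, so the archive contains "peter/..."
        # rather than the full absolute path.
        tar -czf "$dest/$name.tar.gz" -C "$src" "$name"
    done
}

# e.g.: split_backup /home /var/backups/home
```

each resulting file could then be pushed to the ftp server individually instead of as one 80 gb blob.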
PAT
On Fri, Jun 12, 2009 at 18:30, Jamie Cameron <jcameron@...> wrote:
> On 12/Jun/2009 05:51 Pat Erler wrote ..
>
> hi,
>
> i back up my users' /home directory partition with the file system backup to
> an ftp server. the resulting file gets quite large, and for restoring one
> subdir (i.e. /home/peter) i need to re-get the whole 80 gb file and extract the
> files from there (the restore functionality never quite worked for me; i need
> to know how the tar.gz file is named, how the file to be restored is named,
> etc. - it's very easy to make a mistake there).
>
> it would be much easier to have all first-level subdirs in separate files
> (like "du --max-depth=1") - could you implement such a thing, jamie?
>
> I don't think that would be so easy unfortunately .. the backup is done
> using the tar command, which doesn't have the ability to split by directory
> like that.
> How many home directories do you have? Perhaps a better solution would be
> to create separate backups for each one ..
>
> - Jamie
>
>
>
>
> Forwarded by the Webmin mailing list at
> webadmin-list@...