From: Craig B. <cba...@us...> - 2004-09-10 16:55:40
"Pascal Schelcher" writes:

> I ask your help about a problem with hardlink files.
>
> Here is the situation:
> I have some files in backups whose "type" is marked "hardlink".
> None of these files can be restored: the restore reports success,
> but no files are actually restored. And each file in this situation
> has a size of approximately 60 bytes.

I assume you are using tar. BackupPC represents hardlinks as a text
file containing the name of the hardlink target, with metadata marking
the file as a hardlink. So if you look at the file contents you will
see the path of the linked-to file.

Are you using tar for restore? It should work: the hardlinks should be
represented correctly in the tar file and then extracted correctly.
Tricky cases where the linked-to file is not included in the restore
should also work, but these cases are less tested.

Can you be more specific about what you are restoring, whether the
linked-to file is included, etc? I need to recreate this. You could
then run BackupPC_tarCreate with the relevant arguments (look at the
top of the RestoreLOG file) and pipe the output into tar tvf -.
Email me the output.

Note that rsync in BackupPC doesn't yet support hardlinks - it will in
the next version. So if you are using rsync to restore a backup done
with tar, then hardlinks will not be restored.

Craig
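[Editor's note] The placeholder files discussed here are stored compressed in the pool; the `78 5e` bytes at the start of the hexdump elsewhere in this thread are a zlib stream header, which is what BackupPC_zcat strips and inflates. A minimal Python sketch of the idea — illustrative only, since BackupPC's actual pool format has its own framing handled by its Perl tools:

```python
import zlib

# A BackupPC hardlink placeholder is just the target path, stored
# compressed.  This mimics the round trip with plain zlib (an
# assumption for illustration, not BackupPC's exact on-disk format).
target = b"./web/www.foxharp.boston.ma.us/t.html"

compressed = zlib.compress(target, 1)
print(compressed[:2].hex())          # zlib header; first byte is 0x78
print(zlib.decompress(compressed))   # the linked-to path
```

This is why a plain `cat` of the placeholder shows binary noise while `BackupPC_zcat` shows the linked-to path.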
From: Craig B. <cba...@us...> - 2004-09-11 09:18:37
"Pascal Schelcher" writes:

> Yes, I use the "tar" method to backup and restore files.
> The backed-up server runs Linux.
>
> I have done a "cat" on one hardlinked file: nothing to see...
> Here is the output of one with "hexdump":
>
>     00000000  78 5e 05 c1 5b 0a 80 40 08 00 c0 13 b9 16 d4 47  |x^..[..@.......G|
>     00000010  c7 59 c4 45 41 6a f1 75 fe 66 06 36 56 b0 63 33  |.Y.EAj.u.f.6V.c3|
>     00000020  25 97 6f ab 00 da 09 a6 91 30 9d 44 9b 71 f9 f7  |%.o......0.D.q..|
>     00000030  e6 34 5c 4a c2 90 4c 82 e7 7d 3c d7 f8 01 f2 55  |.4\J..L..}<....U|
>     00000040  16 a3                                            |..|

This file is compressed. Use bin/BackupPC_zcat to uncompress/cat the
file.

> I have run BackupPC_tarCreate with the same arguments the CGI uses to
> restore one file (a hardlinked file). Here is the output:
>
>     Done: 0 files, 0 bytes, 0 dirs, 0 specials, 0 errors
>
> And for comparison here is the output with a non-hardlinked file:
>
>     (relative path of this file)0100600000011400000140000000250410015366406012652 0ustar mail
>     (followed by the content of the file, which is an e-mail)
>
> What is the sequence "0100600000011400000140000000250410015366406012652 0ustar mail"?

That's the binary tar file. You need to pipe it into tar tvf - to see
the contents of the tar file.

Craig
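[Editor's note] The run of octal digits Pascal saw is the raw ustar header that follows the file name in a tar stream: the mode, uid, gid, size, mtime, and checksum fields, then the "ustar" magic and owner name. A small self-contained Python sketch (field offsets per the POSIX ustar layout) that builds a one-file archive and inspects those fields:

```python
import io
import tarfile

# Build a one-file tar archive in memory, then look at the raw
# 512-byte ustar header to see the octal fields Pascal noticed.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w", format=tarfile.USTAR_FORMAT) as tf:
    info = tarfile.TarInfo(name="mail/message")
    info.size = 5
    info.mode = 0o600
    tf.addfile(info, io.BytesIO(b"hello"))

header = buf.getvalue()[:512]
# POSIX ustar header layout: name[100] mode[8] uid[8] gid[8]
# size[12] mtime[12] chksum[8] typeflag[1] linkname[100] magic[6] ...
print(header[0:100].rstrip(b"\0"))   # file name
print(header[100:108])               # mode, octal ASCII
print(header[124:136])               # size, octal ASCII
print(header[257:262])               # the b"ustar" magic
```

Piping the stream into `tar tvf -`, as Craig suggests, is just letting tar decode these same fields for you.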
From: Craig B. <cba...@us...> - 2004-09-18 22:35:38
"Pascal Schelcher" writes:

> I ask your help about a problem with hardlink files.
>
> Here is the situation: I have some files in backups whose "type" is
> marked "hardlink". None of these files can be restored: the restore
> reports success, but no files are actually restored. And each such
> file has a size of approximately 60 bytes.

I have looked at the code in BackupPC_tarCreate and I think I might
have found a bug related to restoring hardlinks using XferMethod tar
when the original file is not included in the files/directories being
restored.

Here's a patch to try, relative to 2.1.0 (this diff also includes a
fix already in 2.1.0pl1). Please tell me if this fixes your problem.

Craig

--- bin/BackupPC_tarCreate      2004-06-19 19:28:08.000000000 -0700
+++ bin/BackupPC_tarCreate      2004-09-12 02:59:04.243206400 -0700
@@ -230,6 +230,11 @@
     my $fh = @_;
     foreach my $hdr ( @HardLinks ) {
         $hdr->{size} = 0;
+        my $name = $hdr->{linkname};
+        $name =~ s{^\./}{/};
+        if ( defined($HardLinkExtraFiles{$name}) ) {
+            $hdr->{linkname} = $HardLinkExtraFiles{$name};
+        }
         if ( defined($PathRemove)
               && substr($hdr->{linkname}, 0, length($PathRemove)+1)
                            eq ".$PathRemove" ) {
@@ -423,10 +428,28 @@
         TarWriteFileInfo($fh, $hdr);
         my($data, $size);
         while ( $f->read(\$data, $BufSize) > 0 ) {
+            if ( $size + length($data) > $hdr->{size} ) {
+                print(STDERR "Error: truncating $hdr->{fullPath} to"
+                           . " $hdr->{size} bytes\n");
+                $data = substr($data, 0, $hdr->{size} - $size);
+                $ErrorCnt++;
+            }
             TarWrite($fh, \$data);
             $size += length($data);
         }
         $f->close;
+        if ( $size != $hdr->{size} ) {
+            print(STDERR "Error: padding $hdr->{fullPath} to $hdr->{size}"
+                       . " bytes from $size bytes\n");
+            $ErrorCnt++;
+            while ( $size < $hdr->{size} ) {
+                my $len = $hdr->{size} - $size;
+                $len = $BufSize if ( $len > $BufSize );
+                $data = "\0" x $len;
+                TarWrite($fh, \$data);
+                $size += $len;
+            }
+        }
         TarWritePad($fh, $size);
         $FileCnt++;
         $ByteCnt += $size;
@@ -456,7 +479,7 @@
         my $done = 0;
         my $name = $hdr->{linkname};
         $name =~ s{^\./}{/};
-        if ( $HardLinkExtraFiles{$name} ) {
+        if ( defined($HardLinkExtraFiles{$name}) ) {
             $done = 1;
         } else {
             foreach my $arg ( @ARGV ) {
@@ -478,7 +501,9 @@
             # routine, so that we save the hassle of dealing with
             # mangling, merging and attributes.
             #
-            $HardLinkExtraFiles{$hdr->{linkname}} = 1;
+            my $name = $hdr->{linkname};
+            $name =~ s{^\./}{/};
+            $HardLinkExtraFiles{$name} = $hdr->{name};
             archiveWrite($fh, $hdr->{linkname}, $hdr->{name});
         }
     } elsif ( $hdr->{type} == BPC_FTYPE_SYMLINK ) {
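[Editor's note] The second hunk's pad/truncate logic is worth seeing in isolation: a tar header commits to a file size up front, so if the underlying file turns out shorter or longer than the header says, the writer must truncate the data or pad it with NULs to keep the archive well-formed. A rough Python translation of that loop (illustrative only; the authoritative code is the Perl patch above):

```python
BUF_SIZE = 1024 * 1024  # mirrors $BufSize in BackupPC_tarCreate

def emit_file_data(chunks, header_size, write):
    """Write file data, truncating or NUL-padding so that exactly
    header_size bytes are emitted, as the patch above does."""
    errors = 0
    size = 0
    for data in chunks:
        if size + len(data) > header_size:
            # file grew past the size recorded in the tar header
            data = data[: header_size - size]
            errors += 1
        write(data)
        size += len(data)
    if size != header_size:
        # file shrank: pad with NUL bytes up to the header size
        errors += 1
        while size < header_size:
            n = min(header_size - size, BUF_SIZE)
            write(b"\0" * n)
            size += n
    return size, errors

out = bytearray()
size, errors = emit_file_data([b"abc"], 8, out.extend)
print(size, errors, bytes(out))  # 8 1 b'abc\x00\x00\x00\x00\x00'
```

Without this guard, a file that changed size between header and data would desynchronize every subsequent entry in the tar stream.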
From: Pascal S. <pas...@ve...> - 2004-09-22 10:24:29
Craig,

Sorry for my silence. I was very busy.

So I have installed v2.1.0 and I have tried this patch (with just a
small difference: some lines were shifted). But it doesn't solve the
problem.

In the web interface, when I click on a file of type "hardlink" I get
the same error:

    "Error: Can't restore bad file <relative path to hardlinked file>"

The "<relative path to hardlinked file>" is the file referenced in the
hardlinked file. The error occurs with and without the patch.

When I have more time, I will analyse the problem in depth...

Thanks for the time you have taken.

Pascal Schelcher.

> Here's a patch to try, relative to 2.1.0 (this diff also includes a
> fix already in 2.1.0pl1). Please tell me if this fixes your problem.
>
> Craig
>
> [patch quoted in full; identical to the diff in Craig's message]
From: Paul F. <pg...@fo...> - 2005-02-02 03:15:05
the message i'm replying to, quoted down below, appeared on the list
in mid-september. i found it because i was searching for problems
similar to mine. was this ever resolved? this is the last message i
can find on the topic.

i'm running 2.1.0pl1.

my symptoms, i believe, are identical to pascal's:

- i navigate in my backup tree to a file i know is a hard link.

- the icon shows as "broken" -- i.e., not found.

- clicking on the file gives "Error: Can't restore bad file
  <other-file-name>", where "other-file-name" is the name of the file
  to which this one is hard-linked on the original share.

- if i do a restore of this tree (which is how i noticed the problem)
  the restore claims to be successful, but the file is not restored.
  (i use tar for everything.)

- the file's mate (i.e. "other-file-name" above) is healthy, but the
  web interface, at least, does not think it's a hard link. it appears
  as a regular file.

- the contents of the original file's placeholder in the pool is:

    # /home/backuppc/bin/BackupPC_zcat findex.html | nod
    0:  2e 2f 77 65 62 2f 77 77 77 2e 66 6f 78 68 61 72  ./web/www.foxhar
    10: 70 2e 62 6f 73 74 6f 6e 2e 6d 61 2e 75 73 2f 74  p.boston.ma.us/t
    20: 2e 68 74 6d 6c                                    .html

  (this appears plausible: the file "index.html" here is hard-linked
  to a file named "t.html" elsewhere.)

- the attrib file contains:

    # /home/backuppc/bin/BackupPC_zcat attrib | nod
    0:  17 55 55 55 0a 69 6e 64 65 78 2e 68 74 6d 6c 01  .UUU.index.html.
    10: 82 83 24 00 00 00 25 3c 33 33 c7 09 76 69 6d 61  ..$...%<33..vima
    20: 6e 2e 67 69 66 00 82 83 24 00 00 00 81 af 38 3b  n.gif...$.....8;
    30: ea ac df

  (there are only two files in this directory: the "index.html" in
  question, and another called "viman.gif".)

- if i run BackupPC_tarCreate using the args shown in the restore log
  (as you requested pascal do in another thread message), i get this:

    bash-2.05$ /home/backuppc/bin/BackupPC_tarCreate -h firethorn -n 5 -s /var2 -t -r /web -p /web/ /web/www.vile.cx | tar -tvf -
    /home/backuppc/bin/BackupPC_tarCreate: bad share or directory '/var2/./web/www.foxharp.boston.ma.us/t.html'
    drwxr-xr-x root/root         0 2004-07-29 20:01:53 ./web/www.vile.cx/
    -rw-r--r-- root/root     22456 2001-11-08 11:03:43 ./web/www.vile.cx/viman.gif
    Done: 1 files, 22456 bytes, 1 dirs, 0 specials, 1 errors

  the path being complained about is a correct path on the original
  host to the "mate" of the file being restored, but it is not a
  correct path on the backup server.

hope this excessive detail helps, and/or that this has already been
fixed and i missed it...

paul

> Craig,
>
> Sorry for my silence. I was very busy.
>
> So I have installed v2.1.0 and I have tried this patch (with just a
> small difference: some lines were shifted). But it doesn't solve the
> problem.
>
> In the web interface, when I click on a file of type "hardlink" I
> get the same error:
>     "Error: Can't restore bad file <relative path to hardlinked file>"
> The "<relative path to hardlinked file>" is the file referenced in
> the hardlinked file. The error occurs with and without the patch.
>
> When I have more time, I will analyse the problem in depth...
>
> Thanks for the time you have taken.
>
> Pascal Schelcher.
>
> > Here's a patch to try, relative to 2.1.0 (this diff also includes a
> > fix already in 2.1.0pl1). Please tell me if this fixes your problem.
> > Craig
> >
> > [patch quoted in full; identical to the diff in Craig's message]

=---------------------
 paul fox, pg...@fo... (arlington, ma, where it's 22.1 degrees)
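[Editor's note] A note on the path handling at play in paul's "bad share or directory" error: the hardlink placeholder stores its target relative to the share (with a leading `./`), and the patch's `s{^\./}{/}` substitution rewrites that into a share-absolute path before looking it up. The error output shows the share-relative target leaking through as if it were a path on the backup server. A tiny Python transliteration of just that substitution:

```python
import re

def share_absolute(linkname):
    """Transliteration of the Perl  $name =~ s{^\\./}{/}  from the
    patch above: turn a share-relative './path' into '/path'."""
    return re.sub(r"^\./", "/", linkname)

# the placeholder contents from paul's hexdump
target = "./web/www.foxharp.boston.ma.us/t.html"
print(share_absolute(target))   # /web/www.foxharp.boston.ma.us/t.html

# paths that don't start with './' are left alone
print(share_absolute("/already/absolute"))  # /already/absolute
```

The substitution only normalizes the name; it does not make the linked-to file part of the restore set, which is the case the patch was trying to handle.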
From: Paul F. <pg...@fo...> - 2005-02-02 03:19:33
craig --

i should add that i have _not_ tried the patch you offered in the
message i quoted. it was long enough ago that i felt i should make
sure you still think it's right/current.

thanks again,
paul

i wrote:
> the message i'm replying to, quoted down below, appeared on the
> list in mid-september. i found it because i was searching for
> problems similar to mine. was this ever resolved? this is the
> last message i can find on the topic.
>
> i'm running 2.1.0pl1.
>
> [remainder of the quoted message, including the patch, elided;
> it is identical to the previous message in this thread]

=---------------------
 paul fox, pg...@fo... (arlington, ma, where it's 21.7 degrees)
From: Pascal S. <pas...@ve...> - 2004-09-10 19:07:47
Thank you Craig for your answer.

So, yes, I use the "tar" method to backup and restore files. The
backed-up server runs Linux.

I have done a "cat" on one hardlinked file: nothing to see... Here is
the output of one with "hexdump":

    00000000  78 5e 05 c1 5b 0a 80 40 08 00 c0 13 b9 16 d4 47  |x^..[..@.......G|
    00000010  c7 59 c4 45 41 6a f1 75 fe 66 06 36 56 b0 63 33  |.Y.EAj.u.f.6V.c3|
    00000020  25 97 6f ab 00 da 09 a6 91 30 9d 44 9b 71 f9 f7  |%.o......0.D.q..|
    00000030  e6 34 5c 4a c2 90 4c 82 e7 7d 3c d7 f8 01 f2 55  |.4\J..L..}<....U|
    00000040  16 a3                                            |..|

Is this output correct? Or maybe it cannot be read like this? Maybe my
filesystem is corrupted?

I have run BackupPC_tarCreate with the same arguments the CGI uses to
restore one file (a hardlinked file). Here is the output:

    Done: 0 files, 0 bytes, 0 dirs, 0 specials, 0 errors

And for comparison here is the output with a non-hardlinked file:

    (relative path of this file)0100600000011400000140000000250410015366406012652 0ustar mail
    (followed by the content of the file, which is an e-mail)

What is the sequence "0100600000011400000140000000250410015366406012652 0ustar mail"?

I think I will trace a backup. I will do it on Monday.

Thank you for your help.

Pascal.

----- Original Message -----
From: "Craig Barratt" <cba...@us...>
To: "Pascal Schelcher" <pas...@ve...>
Cc: <bac...@li...>
Sent: Friday, September 10, 2004 6:52 PM
Subject: Re: [BackupPC-users] Hardlinked file on v 2.0.2 release

[Craig's reply quoted in full; it is the first message in this thread]
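[Editor's note] For completeness, this is what a correctly archived hardlink looks like at the tar level, per Craig's point that "the hardlinks should be represented correctly in the tar file": the link member carries the LNK type flag and a linkname, has size 0, and is followed by no data blocks. A small Python sketch (illustrative, using the standard tarfile module rather than BackupPC's own writer):

```python
import io
import tarfile

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    # the real file, with data
    orig = tarfile.TarInfo("web/t.html")
    payload = b"<html></html>"
    orig.size = len(payload)
    tf.addfile(orig, io.BytesIO(payload))

    # the hard link: LNKTYPE, linkname set, size 0, no data blocks
    link = tarfile.TarInfo("web/index.html")
    link.type = tarfile.LNKTYPE
    link.linkname = "web/t.html"
    tf.addfile(link)

buf.seek(0)
with tarfile.open(fileobj=buf) as tf:
    for m in tf.getmembers():
        print(m.name, "link to" if m.islnk() else "file", m.linkname)
```

When `tar tvf -` lists such an archive it shows the second entry as `link to web/t.html`, which is what a successful hardlink restore should have produced in the cases discussed above.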