It was impossible to create PARs for files larger than 2GB
on Windows. This small patch fixes this bug.
Sorry, forgot to attach the .diff file... here it is:
--- par2cmdline.h.orig	2005-02-10 13:25:22.000000000
+++ par2cmdline.h	2005-02-10 13:25:39.000000000
@@ -38,7 +38,9 @@
 #define snprintf _snprintf
-#define stat _stat
+#define stat __stat64
+#define __stat64(a, b) _stat64(a, b)
 #define __LITTLE_ENDIAN 1234
 #define __BIG_ENDIAN 4321
Does anyone have a compiled binary with this fix?
David A. Gatwood
It is also wrong for non-Win32. I was pretty annoyed that UnRarX wouldn't
work in Mac OS X because of bugs in this thing. The bug is:
#define OffsetType long
#define MaxOffset 0x7fffffffL
#define LengthType unsigned int
#define MaxLength 0xffffffffUL
Something like this should be much more useful in operating systems like
Mac OS X that support 64-bit stat/seek/*:
#define OffsetType off_t
#define MaxOffset (sizeof(off_t) == 8 ? 0x7fffffffffffffffL : 0x7fffffffL)
#define LengthType fpos_t
#define MaxLength (sizeof(fpos_t) == 8 ? 0xffffffffffffffffL : 0xffffffffL)
For other OSes, you need to add a test for things like fseek64 and use that if it is available.
Oops. Sorry, those should have had an LL or ULL suffix for the 64-bit values. Oh,
and a UL for the 0xffffffff, though it has that already. Typo on my part.
Ouch. There are a lot more mistakes than that on the UNIX side of things. The
code also incorrectly uses size_t, which is only 32 bits on a 32-bit UNIX
platform even if the platform supports 64-bit files. Odds are, size_t should be
replaced with off_t throughout the entire program.
I'm wrong. The uses of size_t are fine. I'm getting some bogus compiler
warnings, but they do appear to be bogus.
Well, two mistakes with this. First, you need to change fseek to fseeko in all three
places it is used in diskfile.cpp, since fseek takes a long offset, which is 32 bits
on these platforms.
Second, I think I was wrong about the change to LengthType and MaxLength.
I'm pretty sure only OffsetType and MaxOffset need to change.
After making those fixes, I have verified that it can, in fact, fix a corrupted
archive of well over 4 GB in size.