[Libpgf-devel] Improvement to bitplane decoder
From: Jan W. <was...@fo...> - 2008-01-09 13:03:07
Dear developers,
thank you for providing libPGF in open-source form!
I am endeavouring to speed up loading of large (200 MB) images by
decompressing them on the fly during I/O. While PGF is quite fast
compared to JPEG-LS, its decompression throughput turns out to be
slower than reading from the hard drive. Unfortunately that makes it
unsuitable for the current application, but I'd like to pass on
several minor changes that were observed to improve performance on this
Athlon X2 system:
- Decoder.h SetSign: OR in the sign bit directly (avoids a conditional
branch)

void SetSign(UINT32 pos, bool sign)
{
    m_value[pos] |= ((UINT32)sign) << 31;
    // if (sign && m_value[pos] >= 0)
    //     m_value[pos] = -m_value[pos];
}
- BitStream.h SeekBitRange: use a pointer instead of base/displacement
addressing, restructure the loop to reduce branches, and exploit
integer overflow when left-shifting

inline UINT32 SeekBitRange(UINT32* stream, UINT32 pos, UINT32 len) {
    UINT32 count = 0;
    UINT32 testMask = 1 << (pos % WordWidth);
    UINT32* word = stream + (pos >> WordWidthLog);
    for (;;) {
        if (*word & testMask)
            return count;
        count++;
        if (count >= len)
            return len;
        testMask *= 2;
        if (!testMask) { // overflow
            word++; testMask = 1;
            // fast steps if all bits in a word are zero
            while ((count + WordWidth <= len) && (*word == 0)) {
                word++;
                count += WordWidth;
            }
        }
    }
}
Hopefully this is helpful.
Do you know of any efforts to sacrifice compression ratio for
decompression throughput that is competitive with hard drives? My
layman's impression is that simple LZ compression of the wavelet
coefficients (analogous to PNG's zlib compression of Paeth-filtered
pixels) may be successful. Has this been tried during development?
Best Regards
Jan Wassenberg