Thread: [Algorithms] Comparing apples to apples : compression libraries
|
From: John W. R. <jo...@si...> - 2008-03-25 23:29:29
|
With pretty much every developer there comes a time when you need to use a compression library; not because you want to 'zip' up a massive archive, but because you want to compress some game data on the fly before you transmit it over the network, or to shrink the size of a local cache, or something similar. And, like pretty much every developer, you end up wandering through a maze of open source libraries on the internet, each of which has an entirely different API, source code layout, and license agreement. Just figuring out how to call 'compressData' (if you even can) often means hours of wading through documentation and dealing with build/configuration issues.

For your convenience I am releasing a small code snippet that I wrote while evaluating a replacement for the compression library we are currently using. The simple fact of the matter is that I don't believe any other programmer should have to waste the same amount of stupid time I did wrestling these libraries into a convenient form.

http://www.amillionpixels.us/test_compression_1.0.exe

Here is the complete API to my wrapper interface:

// Block compression/decompression library wrapper by John W. Ratcliff
// http://www.codesuppository.blogspot.com/

enum CompressionType
{
  CT_INVALID,
  CT_CRYPTO_GZIP, // The CryptoPP library implementation of GZIP  http://www.cryptopp.com
  CT_MINILZO,     // The MiniLZO library  http://www.oberhumer.com/opensource/lzo/
  CT_ZLIB,        // The ZLIB library  http://www.zlib.net/
  CT_BZIP,        // The BZIP library  http://www.bzip.org/
};

void * compressData(const void *source, int len, int &outlen, CompressionType type=CT_ZLIB);
void * decompressData(const void *source, int clen, int &outlen);
void   deleteData(void *mem);

CompressionType getCompressionType(const void *mem, int len);
const char *    getCompressionTypeString(CompressionType type);

This release supports four open source compression libraries: the CryptoPP implementation of GZIP, MiniLZO, ZLIB, and BZIP.

This API simply supports block memory compression and decompression. You can compress a block of memory using any one of the four compressors. The compressed memory has a small header on it that indicates which compressor was used, the size of the uncompressed memory block, and a CRC, so that it can be easily and safely decompressed.

The test application, called 'test_compression', loads a roughly 10 MB XML file, runs it through each compressor and decompressor, and measures the performance characteristics of each. The results are as follows:

Testing Compression rate and speed with various compressors.
---------------------------------------------------------------
Compress:CT_CRYPTO  :FROM: 10,436,335 TO: 2,498,433  23%  1,170 MS
Compress:CT_MINILZO :FROM: 10,436,335 TO: 3,940,072  37%     97 MS
Compress:CT_ZLIB    :FROM: 10,436,335 TO: 3,299,771  31%    157 MS
Compress:CT_BZIP    :FROM: 10,436,335 TO: 2,270,695  21%  1,544 MS
---------------------------------------------------------------
Testing Decompression speed with various decompressors.
---------------------------------------------------------------
Decompress:CT_CRYPTO  :FROM: 2,498,433 TO: 10,436,335  258 MS
Decompress:CT_MINILZO :FROM: 3,940,072 TO: 10,436,335   42 MS
Decompress:CT_ZLIB    :FROM: 3,299,771 TO: 10,436,335   69 MS
Decompress:CT_BZIP    :FROM: 2,270,695 TO: 10,436,335  390 MS

As expected, MINILZO is the fastest. Unfortunately MINILZO uses a GPL license and cannot be used in commercial products. The rest have licenses which are more flexible. Check the links for each package to see if it is right for you.
I did include the 'CryptoPP' library for completeness, but, to be frank, I am not very impressed with this package, as its compression and decompression rates are very poor. As you would expect, BZIP gets the best compression ratio but does not perform as quickly.

I am fairly impressed with ZLIB, especially because you can use it in a streaming mode as well, which is excellent for network communications layers. (FYI, the assembly language version of the ZLIB decompressor is hardly any faster than the optimized C code and was not included here.)

You might wonder why I'm even bothering to post this little snippet. Besides the fact that I hope to save other programmers a little bit of time in the future, I encourage any other developers of compression libraries to drop them into this framework so we can stop comparing apples to oranges when it comes to these technologies. A large XML file is a typical use case for the kind of data a game developer might want to squeeze some space savings out of. I am also happy to modify the installer and include more standardized sample data if that is relevant.

I started to add support for the LZMA library; however, they only offer a simple sample decompressor, while the compression code is a lot more difficult to extract into a single C-style API call.

This demo comes with an easy-to-use installer and a solution and project file for Visual Studio 2005. All of the compression code itself is multi-platform and only the little demo app makes any OS-specific calls. The libraries are each included as raw source, each located in its own directory. None of the source has been removed (except for test code), so you are not required to use the compression libraries via the wrapper layer.

If anyone feels compelled to add additional compression libraries to this test framework, please let me know and I will make a point of including them in a new drop.

Thanks, I hope somebody finds this useful.

John |
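To make the calling convention concrete, here is a minimal usage sketch against the API declared above. The header name "CompressionWrapper.h", the test file name, and the trimmed error handling are illustrative assumptions, not part of the actual release.

#include <stdio.h>
#include <stdlib.h>
#include "CompressionWrapper.h"  // assumed header name; use whatever the release actually calls it

int main(int argc, const char **argv)
{
    // Read a file into memory (error handling trimmed for brevity).
    FILE *fph = fopen(argc > 1 ? argv[1] : "test.xml", "rb");
    if (!fph) return 1;
    fseek(fph, 0, SEEK_END);
    int len = (int)ftell(fph);
    fseek(fph, 0, SEEK_SET);
    void *data = malloc(len);
    fread(data, 1, len, fph);
    fclose(fph);

    // Compress with the default codec (CT_ZLIB).  The returned block carries
    // the small header with the codec id, uncompressed size, and CRC.
    int clen = 0;
    void *compressed = compressData(data, len, clen);
    printf("%d -> %d bytes using %s\n", len, clen,
           getCompressionTypeString(getCompressionType(compressed, clen)));

    // Round trip: decompressData reads the header to pick the right codec.
    int dlen = 0;
    void *restored = decompressData(compressed, clen, dlen);
    printf("restored %d bytes\n", dlen);

    deleteData(restored);
    deleteData(compressed);
    free(data);
    return 0;
}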
|
From: cat s. <ga...@gm...> - 2008-03-26 01:03:10
|
Can it support a 7-ZIP algorithm? |
|
From: John R. <jra...@in...> - 2008-03-26 03:13:25
|
>>Can it support a 7-ZIP algorithm? Can it? Yes. Does it? No. Are you volunteering to add support for that codec? Thanks! John |
|
From: Tom F. <tom...@ee...> - 2008-03-26 17:36:19
|
One to add to the list could be Sean Barrett's version of ZLIB. See http://nothings.org/ and look for the link called "public domain JPEG decompression, public domain PNG decompression, public domain ZLIB decompression" (it's a single C file).

TomF. |
|
From: John W. R. <jo...@si...> - 2008-03-26 17:55:39
|
Thanks for the links. I'll be sure to check them out. |
|
From: Jon W. <hp...@mi...> - 2008-03-26 17:53:34
|
John W. Ratcliff wrote:
> ... each of which has an entirely different API, source code layout, and
> license agreement. Just figuring out how to call 'compressData' (if you
> even can) is often hours of wading through documentation and dealing with
> build/configuration issues.

Did you check out the work of Ross Williams from the early '90s? He also proposed a standard compression API and implemented several different algorithms against it. I believe there were some patent problems with his approach at the time, but those may have subsided by now. The interesting thing about his code was that it was focused on in-core, on-the-fly usage, with a streaming interface and low-ish memory overhead.

Sincerely,

jw |
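zlib's streaming interface, which John mentioned liking for network layers, is one concrete example of that kind of in-core, on-the-fly API. A minimal sketch, with buffer handling simplified and without the chunked, Z_SYNC_FLUSH-style usage a real network layer would want:

#include <string.h>
#include "zlib.h"

// Compress 'src' into 'dst' with zlib's streaming deflate API.
// Returns the compressed size, or 0 on error.
static unsigned long streamCompress(const unsigned char *src, unsigned long srcLen,
                                    unsigned char *dst, unsigned long dstCap)
{
    z_stream zs;
    memset(&zs, 0, sizeof(zs));
    if (deflateInit(&zs, Z_DEFAULT_COMPRESSION) != Z_OK)
        return 0;

    zs.next_in   = (Bytef *)src;
    zs.avail_in  = (uInt)srcLen;
    zs.next_out  = dst;
    zs.avail_out = (uInt)dstCap;

    // A network layer would instead feed packets in as they arrive and use
    // Z_SYNC_FLUSH so each call produces decodable output; here we just
    // finish the whole buffer in one pass.
    int rc = deflate(&zs, Z_FINISH);
    unsigned long outLen = (rc == Z_STREAM_END) ? zs.total_out : 0;
    deflateEnd(&zs);
    return outLen;
}

Decompression mirrors this with inflateInit/inflate/inflateEnd on the receiving side.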
|
From: Glen M. <gle...@di...> - 2008-03-26 20:35:06
|
I'm curious what hardware this was tested on. My experiments with zlib on consoles were very disappointing.

peace |
|
From: John R. <jra...@in...> - 2008-03-26 21:16:01
|
This was on a very high-end PC. I'm more interested in the relative differences than anything else, which I think would be roughly the same on consoles. If you have any other codecs to suggest, I will be happy to add them, so long as the integration isn't too obnoxious.

John

Glen Miner writes:
> I'm curious what hardware this was tested on. My experiments with zlib on
> consoles were very disappointing. |
|
From: Paul at H. <pa...@ru...> - 2008-03-26 22:11:04
|
That would be a bad assumption to make, sadly. The memory is so piss-poor on consoles that L2 misses are more akin to reading from a particularly bad DVD than actual RAM. A miss on the X360, for example, costs about 600 cycles. Think how much you could achieve in that time, and that's for every single miss. Therefore, doing more math and calculating values over and over, instead of reading from, say, a large hashing table, would give massive wins. So much so that you just can't compare with a PC in terms of relative speeds. If the PC code stays in the cache and the console code doesn't, you could be out by orders of magnitude.

Comparing assumed timings on console vs. PC just by looking at the CPU speed is like working out who the worst-dressed guy in the office is based on what you all had for dinner last week. :)

Regards,
Paul Johnson.
www.rubicondev.com |
|
From: <gd...@er...> - 2008-03-26 22:18:06
|
Hi John,

Thanks for the very useful work again and a nice time saver.

>> My experiments with zlib on consoles was very disappointing.

Which console(s)? The PlayStation 3 Edge libraries, available to PS3 licensees, have an SPU-optimized version of zlib that can decompress 40 MB/sec using 25% of one SPU. See http://forum.beyond3d.com/showpost.php?p=956489&postcount=196

Thanks,
Erwin |
|
From: Jason H. <jas...@di...> - 2008-03-27 04:43:24
|
One can argue that anything optimized for the SPU no longer resembles the original library. :-)

I've looked over the Huffman format description for ZLib and a few details seemed unclear. LZ is going to perform pretty well regardless, since your data is always still in the L2 cache, if not the L1. But you can implement Huffman in many ways, and in my own implementations just changing between a naive bit-shifting model and a cascaded table-driven approach can give you 3x the performance. That was on PC, where cache sizes are pretty large, there is out-of-order execution, and long pipe stalls can hurt. I haven't had a chance to run the same tests on consoles (yet), but I expect cascaded tables with an 8-bit root table would still win. Maybe not on PS2.

Not sure how many games are really bottlenecked by their decoders, but if yours is, it might be worth seeing how much you get back by switching Huffman off. I expect a bad implementation can waste a significant number of cycles while streaming.

Thanks,
JH

Disclaimer: I took no part in the ZLib optimizations for Edge. I was just making a funny. |
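For anyone wondering what the table-driven approach Jason describes looks like in practice, here is a rough sketch of the fast path. This is not the Edge or zlib code; it assumes MSB-first bit packing and, for brevity, that no code is longer than 8 bits. A real decoder cascades into second-level tables for the rare longer codes and guards the end of the input buffer.

#include <stdint.h>

// Entry in a 256-slot root table built from the Huffman code lengths.
struct HuffEntry
{
    uint16_t symbol;   // decoded symbol
    uint8_t  length;   // bits actually consumed by that symbol's code
};

struct BitReader
{
    const uint8_t *data;
    uint32_t       bitPos;

    // Peek at the next 8 bits without consuming them (MSB-first).
    uint32_t peek8() const
    {
        uint32_t byte  = bitPos >> 3;
        uint32_t shift = bitPos & 7;
        uint32_t bits  = ((uint32_t)data[byte] << 8) | data[byte + 1];
        return (bits >> (8 - shift)) & 0xFF;
    }
    void skip(uint32_t count) { bitPos += count; }
};

// One decode step: a single table lookup replaces up to 8 tree-node visits,
// which is where the big win over naive bit-by-bit decoding comes from.
static uint32_t decodeSymbol(BitReader &br, const HuffEntry table[256])
{
    const HuffEntry &e = table[br.peek8()];
    br.skip(e.length);
    return e.symbol;
}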
|
From: Tobias S. <tsi...@ml...> - 2008-04-02 01:40:46
|
John W. Ratcliff wrote:
> As expected, MINILZO is the fastest. Unfortunately MINILZO uses a GPL
> license and cannot be used in commercial products. The rest have
> licenses which are more flexible.

The availability via GPL doesn't automatically mean there aren't other ways to license (at least as long as the software in question is not derived from GPLed work). Actually, http://www.oberhumer.com/opensource/lzo/#introduction states: "Special licenses for commercial and other applications are available by contacting the author."

Tobi |
|
From: Andy F. <an...@si...> - 2008-04-02 03:06:37
|
Tobias Sicheritz wrote:
> The availability via GPL doesn't automatically mean there aren't other
> ways to license (at least as long as the software in question is not
> derived from GPLed work).
>
> Actually, http://www.oberhumer.com/opensource/lzo/#introduction states:
> "Special licenses for commercial and other applications are available by
> contacting the author."

I've sent a fax, a couriered letter, a first-class (overseas) letter, and email to the address listed. No response has been forthcoming. This has lasted for well over a year at this point. I've heard of others having the same issue.

--andy |
|
From: Boberg, S. <Ste...@di...> - 2008-04-02 07:52:30
|
These guys might be more forthcoming: http://www.quicklz.com/

Compression looks to be on par with LZO, but supposedly somewhat faster.

/Stefan |
|
From: Ralph <sp...@gm...> - 2009-10-14 16:40:16
|
This looks pretty cool! I'm very late to this thread, but have you checked out LZF (http://oldhome.schmorp.de/marc/liblzf.html)? It's very quick, good for repeating data, and was easy to get set up on the consoles. We are using it for network compression; it works well enough and is very cheap in terms of CPU cost. Licenses available are GPL and a BSD-like license.

Ralph |
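For reference, liblzf's block API is about as small as these things get. A rough sketch of the per-packet use Ralph describes; the one-byte flag and the raw fallback policy are this sketch's own assumptions, not from his post or from liblzf itself.

#include <string.h>
#include "lzf.h"

// Compress one network packet with LZF, falling back to sending it raw when
// compression doesn't help.  lzf_compress returns 0 when the output limit
// (here len - 1, i.e. "must save at least a byte") can't be met.
// 'out' must hold at least len + 1 bytes.
static unsigned int packPayload(const unsigned char *payload, unsigned int len,
                                unsigned char *out)
{
    unsigned int clen = 0;
    if (len > 1)
        clen = lzf_compress(payload, len, out + 1, len - 1);

    if (clen == 0)
    {
        out[0] = 0;                  // flag: stored uncompressed
        memcpy(out + 1, payload, len);
        return len + 1;
    }
    out[0] = 1;                      // flag: LZF-compressed
    return clen + 1;
}

The receiver checks the flag byte and calls lzf_decompress (or just memcpy) accordingly; note that raw LZF blocks don't store the uncompressed size, so it has to travel in your packet header.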