Thread: [Libjpeg-turbo-devel] Further research regarding the effectiveness of SmartScale
SIMD-accelerated libjpeg-compatible JPEG codec library
Brought to you by: dcommander
From: DRC <dco...@us...> - 2013-01-15 05:53:42
Since there have been questions fielded from Fedora and others regarding the potential for libjpeg-turbo to support the DCT scaling and SmartScale features of jpeg-7 and later, I felt compelled to do some research into the effectiveness of these new features. The research revealed that DCT scaling and SmartScale generally do not accomplish anything that can't already be accomplished at least as well (and typically faster) by other means:

http://www.libjpeg-turbo.org/About/SmartScale

Executive summary:

-- For generating lossless files, libpng was much faster (3-4x) and achieved compression ratios similar to jpeg-9.

-- Reducing the DCT block size (a feature of jpeg-8) did improve visual quality, but it also decreased the compression ratio, so the JPEG quality had to be reduced to compensate. The resulting images had the same or worse perceptual quality than equally-sized high-quality images generated with baseline encoding.

-- Using a DCT block size of 1 (best quality) typically increased encoding time by a factor of 4-6 relative to baseline JPEG. Using a block size of 2 typically increased encoding time by a factor of 2-3 relative to baseline JPEG.

-- Reducing the DCT block size did not allow a significantly or perceptibly higher maximum quality to be achieved relative to baseline JPEG.

-- Using a DCT block size of 1 or 2 did allow maximum quality to be achieved with a higher compression ratio, but the performance of these modes was painfully slow, and as with the lossless case, much better performance and about the same peak compression ratio could be achieved with libpng.

-- In no case did reducing the block size provide better compression at the same overall perceptual quality relative to "low-quality JPEG" (quality=30, 4:2:0).
-- On photographic content, DCT scaling did produce better compression than "low-quality JPEG" at the same overall perceptual quality, but it did so by concentrating the error around sharp features, which is precisely where you don't want the error to be.

I evaluated these technologies partly with a mind for their potential usefulness in VirtualGL and TurboVNC, since that is one place where funding for an implementation of them in libjpeg-turbo could come from. What I found is that probably the biggest piece of low-hanging fruit is accelerating the arithmetic codec, since arithmetic coding typically increased the compression ratio by about 50% relative to Huffman. If it could be optimized the way the Huffman codec has been optimized, it could be very interesting for remote display applications.

Otherwise, it is my opinion that SmartScale and DCT scaling do not provide any viable substitute for, or improvement upon, the existing "usable" modes of baseline JPEG. I don't claim that my research is universal, but I do claim that it is probably the most thorough study out there on this topic, since my reason for writing it was partly my inability to find any such information from another source.

Although I found no real usefulness for DCT scaling, in and of itself it's a harmless feature, since it works within the existing baseline JPEG standard. However, at the moment I am opposed to any implementation of the SmartScale format, since it introduces a new, non-standard format whose usefulness has now been demonstrated to be minimal at best. In my opinion, anyone who upgrades to jpeg-9 is doing so simply because they are blindly pulling in the latest & greatest code, not because there is any technological need for the new release. In fact, not only does it break ABI compatibility with jpeg-8, but it introduces yet another new non-standard format.

Disagree? Chime in.

DRC
From: Siarhei S. <sia...@gm...> - 2013-01-15 06:48:09
On Mon, 14 Jan 2013 23:53:31 -0600
DRC <dco...@us...> wrote:

> Since there have been questions fielded from Fedora and others regarding
> the potential for libjpeg-turbo to support the DCT scaling and
> SmartScale features of jpeg-7 and later, I felt compelled to do some
> research into the effectiveness of these new features. The research
> revealed that DCT scaling and SmartScale do not generally accomplish
> anything that can't already be accomplished at least as well (and
> typically faster) using other means:
>
> http://www.libjpeg-turbo.org/About/SmartScale
>
> Executive summary:
>
> -- For generating lossless files, libpng was much faster (3-4x) and
> achieved similar compression ratios to jpeg-9.

By the way, there is a paper with this lossless JPEG extension proposal, with some compression ratio comparison tables:
http://jpegclub.org/temp/JPEG_9_Lossless_Coding.doc

But when introducing a new and incompatible format, I think one has to actually beat WebP, not PNG, nowadays:
http://blog.chromium.org/2012/08/lossless-and-transparency-modes-in-webp.html

-- 
Best regards,
Siarhei Siamashka
From: DRC <dco...@us...> - 2013-01-15 19:40:46
On 1/15/13 12:47 AM, Siarhei Siamashka wrote:
> By the way, there is a paper with this lossless JPEG extension proposal
> here with some compression ratio comparison tables:
> http://jpegclub.org/temp/JPEG_9_Lossless_Coding.doc
>
> But when introducing a new and incompatible format, I think one has to
> actually beat WebP and not PNG nowadays:
> http://blog.chromium.org/2012/08/lossless-and-transparency-modes-in-webp.html

OK, I added WebP tests to the document:

http://www.libjpeg-turbo.org/About/SmartScale-Lossless

I used 'cwebp -lossless -q 0 -m 0', but I'm not totally sure that's correct. The images I'm working with are a lot bigger than the ones Google worked with in this paper: https://developers.google.com/speed/webp/docs/webp_lossless_alpha_study, so using the default cwebp settings wasn't possible. I waited for many minutes, and the first image still hadn't finished compressing, so I aborted it. At any rate, even with what are supposed to be the "high-performance, low-compression" settings in WebP, it definitely compressed better than PNG or jpeg-9 in all cases, and its performance wasn't far off of jpeg-9's in two of the cases.
From: Siarhei S. <sia...@gm...> - 2013-01-15 11:12:13
On Tue, 15 Jan 2013 08:47:56 +0200
Siarhei Siamashka <sia...@gm...> wrote:

> By the way, there is a paper with this lossless JPEG extension proposal
> here with some compression ratio comparison tables:
> http://jpegclub.org/temp/JPEG_9_Lossless_Coding.doc
>
> But when introducing a new and incompatible format, I think one has to
> actually beat WebP and not PNG nowadays:
> http://blog.chromium.org/2012/08/lossless-and-transparency-modes-in-webp.html

Actually, it is quite interesting that the JPEG 9 filter resembles the very similar SUBTRACT_GREEN filter from WebP:
https://developers.google.com/speed/webp/docs/webp_lossless_bitstream_specification#transformations
The only difference is that libjpeg-9 also adds CENTERVAL (0x80) to red and blue, while WebP doesn't.

I took the 24 images from http://www.r0k.us/graphics/kodak/kodak/ for testing and tried to hack this tweak into WebP 0.2.1, with the following results. The numbers after the file names are the sizes when re-encoded with:

1) JPEG 9 (cjpeg -rgb1 -block 1 -arithmetic)
2) WebP 0.2.1 (cwebp -m 6 -lossless)
3) patched WebP 0.2.1 using the modified filter from JPEG 9

                 (1)      (2)      (3)
kodim01.png   586755   504672   503038
kodim02.png   545688   455562   454684
kodim03.png   439830   386634   384950
kodim04.png   556177   460240   456204
kodim05.png   640895   565768   549250
kodim06.png   501072   470044   469188
kodim07.png   483621   419638   416180
kodim08.png   672957   551580   548236
kodim09.png   493393   437996   436106
kodim10.png   511781   445064   440882
kodim11.png   513196   457606   455696
kodim12.png   450743   411542   408418
kodim13.png   656770   607078   594714
kodim14.png   567437   517324   516012
kodim15.png   532043   433094   428576
kodim16.png   443120   424902   423042
kodim17.png   495714   447794   443550
kodim18.png   649763   567544   553546
kodim19.png   549028   485824   482298
kodim20.png   410066   373048   369352
kodim21.png   523610   491994   489686
kodim22.png   613626   523156   512466
kodim23.png   517928   425088   420486
kodim24.png   586314   495318   488506

Total size for JPEG 9             : 12941527 (~12.3 MiB)
Total size for WebP 0.2.1         : 11358510 (~10.8 MiB)
Total size for patched WebP 0.2.1 : 11245066 (~10.7 MiB)

The patch itself:

diff --git a/src/dsp/lossless.c b/src/dsp/lossless.c
index 62a6b7b..2ea68c0 100644
--- a/src/dsp/lossless.c
+++ b/src/dsp/lossless.c
@@ -576,8 +576,8 @@ void VP8LSubtractGreenFromBlueAndRed(uint32_t* argb_data, int num_pixs) {
   for (i = 0; i < num_pixs; ++i) {
     const uint32_t argb = argb_data[i];
     const uint32_t green = (argb >> 8) & 0xff;
-    const uint32_t new_r = (((argb >> 16) & 0xff) - green) & 0xff;
-    const uint32_t new_b = ((argb & 0xff) - green) & 0xff;
+    const uint32_t new_r = (((argb >> 16) & 0xff) - green + 0x80) & 0xff;
+    const uint32_t new_b = ((argb & 0xff) - green + 0x80) & 0xff;
     argb_data[i] = (argb & 0xff00ff00) | (new_r << 16) | new_b;
   }
 }
@@ -595,6 +595,7 @@ static void AddGreenToBlueAndRed(const VP8LTransform* const transform,
     uint32_t red_blue = (argb & 0x00ff00ffu);
     red_blue += (green << 16) | green;
     red_blue &= 0x00ff00ffu;
+    red_blue ^= 0x00800080u;
     *data++ = (argb & 0xff00ff00u) | red_blue;
   }
 }

Looks like WebP could get ~1% better compression if it used the JPEG 9 variant of the SUBTRACT_GREEN filter. And this seems reasonable, because with the addition of CENTERVAL, fewer red/blue color components are going to overflow and wrap around on average. Too bad that the bitstream format seems to be already frozen for WebP. Or maybe it is still possible to get a similar effect with the other filter types in WebP?

Also added Guido Vollbeding (JPEG 9) and Jyrki Alakuijala (WebP) to CC because I think they might be interested in these results or want to provide some feedback. Though I'm not sure if the e-mails can pass through to the list for those who don't have a subscription.

Can we retain the superior compatibility and software/hardware support in various operating systems, browsers and devices for the well-established legacy JPEG format, and at the same time benefit from the improved compression ratio and/or image quality of bleeding-edge new technologies such as WebP?

-- 
Best regards,
Siarhei Siamashka
From: Siarhei S. <sia...@gm...> - 2013-01-15 11:17:56
On Tue, 15 Jan 2013 13:12:01 +0200
Siarhei Siamashka <sia...@gm...> wrote:

> [...]

Sorry, got Jyrki's e-mail wrong, now adding to CC for real.

-- 
Best regards,
Siarhei Siamashka
From: DRC <dco...@us...> - 2013-01-15 20:49:46
Interesting. Yes, in all fairness, the Kodak images actually do compress better with jpeg-9 than with any of the PNG modes I tested. I have added them as an additional test case and modified the document accordingly:

http://www.libjpeg-turbo.org/About/SmartScale-Lossless

On 1/15/13 5:12 AM, Siarhei Siamashka wrote:
> Can we retain the superior compatibility and software/hardware support
> in various operating systems, browsers and devices for the well
> established legacy JPEG format? And at the same time benefit from the
> improved compression ratio and/or image quality with the bleeding edge
> new technologies such as WebP?
>
> [...]

Not sure what you mean by this. libjpeg-turbo is a JPEG library, and technically speaking, SmartScale files are not JPEG files, if you interpret "JPEG" to mean "a file conforming to the JPEG spec." JPEG was never intended as a lossless codec, and I suspect that is a pretty solid reason why attempting to use it as one either does not perform as well as PNG or does not compress as well as WebP. Improving the compression ratio of a SmartScale file would require yet another incompatible format change.
From: Siarhei S. <sia...@gm...> - 2013-01-15 20:59:02
On Tue, 15 Jan 2013 13:19:01 +0100
Jyrki Alakuijala <jy...@go...> wrote:

> On Tue, Jan 15, 2013 at 12:17 PM, Siarhei Siamashka <sia...@gm...> wrote:
> > Actually it is quite interesting that the JPEG 9 filter resembles the
> > very similar SUBTRACT_GREEN filter from WebP:
> > https://developers.google.com/speed/webp/docs/webp_lossless_bitstream_specification#transformations
> > The only difference is that libjpeg-9 also adds CENTERVAL (0x80) to
> > red and blue, while WebP doesn't.
>
> Thank you for this proposal. It looks like this is the best opportunity for
> improving WebP lossless that I have seen -- simple, fast and effective.

There is not much to be thanked for. I only tried the filter from Guido's paper.

> Before, I falsely thought that the most important predictors don't care so
> much about the wrap-around, but your testing proves otherwise.

Now that you mention it, I also got some doubts about the impact of wrap-around and tried to check something. WebP is not using DCT for lossless compression, right? And for LZ77 or Huffman, the compression ratio really should not be affected even if we randomly remap all the values of individual bytes (for example, using a lookup table). So just out of curiosity, I tried to see whether this is actually more like the case of a generic filter, where 0x80 just happens to be a better approximation for the constants than 0x00:

  R' = R - G + const1
  B' = B - G + const2

I checked the following two variants (with the hope that the sample values of the transformed R' might be better compressible if they are adjusted to have about the same average as the G samples):

1) R' = R - G + (2 * average_G - average_R)
2) R' = R - G + (2 * median_G - median_R)

But both resulted in worse compression. In the end, it looks like the wrap-arounds just make the follow-up Predictor Transform less efficient. It does care about the correlations between neighboring pixels, and the abrupt jumps between values around 0x00 and values around 0xFF caused by wrap-arounds are likely hurting compression.

> I am not sure that we are willing to change the specification any more,
> but this certainly should raise the discussion about it, too.

Yes, this is understandable. The format can't be taken seriously before it is fully stabilized, and last-minute changes can surely delay its adoption and raise concerns among users.

-- 
Best regards,
Siarhei Siamashka
From: Siarhei S. <sia...@gm...> - 2013-01-15 22:24:29
On Tue, 15 Jan 2013 19:12:57 +0100
"Guido Vollbeding" <vol...@in...> wrote:

> Hello Jyrki, Siarhei
>
> Thank you for the feedback.
> Yes, it is quite ironic that I got the first inspiration for the
> reversible color transform as used in the new JPEG 9 codec
> by investigating the question why WebP outperforms PNG in
> lossless mode.
> That's also why the name is similar, in jpeglib.h:
>
> /* Supported color transforms. */
>
> typedef enum {
>   JCT_NONE = 0,
>   JCT_SUBTRACT_GREEN = 1
> } J_COLOR_TRANSFORM;

Thanks, I see. It's a bit unfortunate that you did not check whether your modification to the filter also benefits WebP and not just JPEG 9. Otherwise libwebp-0.2.x might have been a bit better now, had you reported this before the WebP bitstream was finalized.

> But then I took the turn, looked at JPEG-LS, and finally found
> the proper realisation as described in the mentioned "JPEG 9
> Lossless Coding" document.
> This is now in the released codec and I think is the most simple
> yet effective solution.

The way I see it, lossless compression support is also released in WebP (at least months ahead of JPEG 9) and simply provides better results at the moment. Do you think JPEG has any realistic chance to catch up?

> The results from the example in the document can be found here:
> http://jpegclub.org/kodaksuite/, in comparison to
> http://r0k.us/graphics/kodak/.
> It requires a JPEG 9 capable decoder, which
> is actually Jpegcrop from http://jpegclub.org/.

One question for clarification: does JPEG 9 just strictly implement the JPEG-LS standard in the new code that deals with lossless compression? I hope I found the right link for it:
http://www.itu.int/rec/T-REC-T.87-199806-I/en
If any JPEG-LS files created by other encoders really exist in the wild, then being able to *decode* them might surely be useful. Though I'm not so sure about promoting *encoding* to this format.

The biggest practical problem is that, for example, going to http://jpegclub.org/kodaksuite/ with both my desktop browser and the browser on my smartphone is not very useful. They just can't decode these files. For most users this means the files are "broken". And this problem just can't be fixed, because in many cases the JPEG decoder simply can't be upgraded. A lot of systems out in the wild are in maintenance-only mode and may receive only critical security bugfixes at best; adding extra features is out of the question there. JPEG decoders (baseline variant) are also implemented as hardware accelerators in various mobile devices, and hardware is not easily upgradable. Last, but not least, there are also various operating systems that are not open source friendly. You are not doing a favour for a Linux user who wants, for example, to do some basic photo editing, save it to JPEG format and send it to a friend who happens to have only a baseline JPEG decoder on his system.

JPEG-LS had its chance, and it looks like that chance was wasted long ago. For casual users, JPEG means just the lossy image format produced by their digital cameras, and they rightfully expect their systems to successfully handle any file with a JPEG extension. This can be achieved in two ways:
1) Upgrade every JPEG codec in every device in the world to handle the JPEG-LS variant (but this is practically impossible, as explained above).
2) Simply don't bring the problem upon yourself and don't create the problematic files. Forget about anything other than baseline JPEG. And if you really want better compression or something else, there are other alternatives.

WebP has a clear advantage today. That is, if it does not waste its chance. I guess we will see in a decade or so whether it is going to be successful or replaced by something else :-)

-- 
Best regards,
Siarhei Siamashka
From: Siarhei S. <sia...@gm...> - 2013-01-16 15:06:08
On Wed, 16 Jan 2013 01:13:47 +0100 "Guido Vollbeding" <vol...@in...> wrote: > Hello Siarhei > > Thanks for remarks. > > > Thanks, I see. It's a bit unfortunate that you did not check whether > > your modification to the filter also benefit WebP and not just JPEG 9. > > Otherwise libwebp-0.2.x might have been a bit better now if you > > could report this before the WebP bitstream got finalized. > > I am responsible for and develop substantial technology, and I can't > participate in speculative attempts which are fundamentally flawed, > sorry. > > > The way I see it, lossless compression support is also released in > > WebP (at least months ahead of JPEG 9) and just provides better > > results at the moment. Do you think JPEG has any realistic chance > > to catch up? > > JPEG is real and substantial, WebP is a speculative attempt with > no substance. So WebP is not for me. You may do this, but I won't. > > > One question for clarification. Does JPEG 9 just strictly implement > > JPEG-LS standard in the new code, which deals with lossless > > compression? > > No, JPEG 9 has its own lossless approach. > This is IJG and not affiliated to ISO, so we have nothing more to do > with JPEG-LS than that what is stated in given document: > > Only the part 2 of JPEG-LS ([4]) has a valid specification for > > reversible color transforms, and this is shared by JPEG 9. > > Notice that JPEG 9 does not share anything else with JPEG-LS, though. > > > Thus I can't help you with other details about this, sorry. Thanks a lot for the clarification about the status of JPEG 9 regarding its conformance with the industrial standards. 
I tried to search a bit and found at least one open source project, which implements the real standard JPEG LS part-2 (ITU T.870/ISO-14495-2): https://github.com/thorfdbg/libjpeg Lossless encoding can be done with: ./jpeg -c -ls 0 -cls inputfile.ppm outputfile.jpg Lossless decoding of these files back to PPM can be done with: ./jpeg -c -cls inputfile.jpg outputfile.ppm Comparing it for encoding http://www.r0k.us/graphics/kodak/kodak/ images with the rest of the lossless codecs mentioned earlier: Total size for JPEG 9 : 12941527 (~12.3 MiB) Total size for WebP 0.2.1 : 11358510 (~10.8 MiB) Total size for patched WebP 0.2.1 : 11245066 (~10.7 MiB) Total size for thorfdbg JPEG-LS : 11195469 (~10.7 MiB) Now I'm really puzzled about the whole purpose of this JPEG 9 exercise. It seems to be inferior to the alternative solutions in every possible way. Why on earth haven't you just implemented the standard JPEG-LS part-2 for lossless encoding in libjpeg, but instead ended up with some sort of NIH junk? > > The biggest practical problem is that, for example, going to > > http://jpegclub.org/kodaksuite/ with both my desktop browser > > and the browser in my smartphone is not very useful. They just > > can't decode these files. For most of the users it means that > > these files are "broken". And this problem just can't be fixed, > > because in many cases JPEG decoder simply can't be upgraded. > > A lot of systems out there in the wild are in maintenance-only > > mode, and may only receive critical security bugfixes at best. > > Adding extra features is out of question there. JPEG decoders > > (baseline variant) are also implemented as hardware accelerators > > in various mobile devices. Hardware is not easily upgradable. > > Last, but not least, there are also various operating systems > > that are not open source friendly. 
You are not doing a favour > > for a Linux user, who wants, for example, to do some basic photo > > editing, save it to JPEG format and send to his friend, who > > happens to only have a baseline JPEG decoder in his system. > > You are talking nonsense here. IJG has made JPEG popular, and > so IJG is responsible for maintaining it and developing it further. > We cannot care for ignorant people, sorry. > > > JPEG-LS had its chance. And it looks like this chance has been > > wasted long ago. For the casual users, JPEG means just a lossy > > image format produced by their digital cameras. And they > > rightfully expect their systems to successfully handle any file > > with JPEG extension. This can be achieved in two ways: > > 1) Upgrade every JPEG codec in every device in the world to > > handle JPEG-LS variant (but this is practically impossible as > > explained above). > > 2) Simply don't bring a problem upon yourself and don't create > > the problematic files. Forget about anything other than the > > baseline JPEG. And if you really want better compression or > > something else, there are other alternatives. > > You are still talking nonsense here. > We have nothing to do with JPEG-LS and we don't promote it. > This is IJG, we have made JPEG popular, and we are responsible > for its maintenance and further development. > That is what we do, and we are currently at JPEG 9. > We cannot care for ignorant people, sorry. > > > WebP has a clear advantage today. That is, if it does not waste > > its chance. I guess we will see in a decade or so whether it is > > going to be successful or replaced by something else :-) > > WebP is botch. It may be the right thing to deal with for you and > your crowd, but not for me and IJG. > > Kind regards > Guido Vollbeding > Organizer Independent JPEG Group I'm not really sure how to respond and whether any response would contribute to a constructive discussion. Thanks for sharing your opinion. 
Before leaving, I only want to apologize for posting some bullshit about JPEG-LS without checking the facts, sorry about that.

--
Best regards,
Siarhei Siamashka
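As a side note, the byte totals in the size comparison above can be sanity-checked against the quoted "~MiB" figures. This is a minimal sketch using only the numbers from the message; the unit conversion and the relative-size percentage at the end are the only additions:

```python
# Byte totals as quoted in the message above; only the unit conversion
# and the relative-size comparison below are added here.
totals = {
    "JPEG 9":             12941527,
    "WebP 0.2.1":         11358510,
    "patched WebP 0.2.1": 11245066,
    "thorfdbg JPEG-LS":   11195469,
}

for codec, size in totals.items():
    # 1 MiB = 2**20 bytes, matching the "~MiB" figures in the message
    print(f"{codec:20s}: {size:>9d} bytes = ~{size / 2**20:.1f} MiB")

# jpeg-9's lossless output relative to the smallest result in the set
smallest = min(totals.values())
overhead = totals["JPEG 9"] / smallest - 1
print(f"JPEG 9 is {overhead:.1%} larger than thorfdbg JPEG-LS")
```

Running this reproduces the ~12.3/~10.8/~10.7 MiB figures and shows that jpeg-9's lossless output is about 15.6% larger than the JPEG-LS result on this set.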
From: Adam T. <at...@re...> - 2013-01-16 15:34:11
On Wed, Jan 16, 2013 at 05:05:54PM +0200, Siarhei Siamashka wrote:
> On Wed, 16 Jan 2013 01:13:47 +0100
> "Guido Vollbeding" <vol...@in...> wrote:
>
> [...]

Those replies from IJG are enough for me not to include the post-jpeg6 API/ABI in Fedora. The argument that "IJG made JPEG popular so IJG is right" really has no influence on me.

Regards, Adam

--
Adam Tkac, Red Hat, Inc.
From: DRC <dco...@us...> - 2013-01-17 04:10:08
On 1/16/13 9:05 AM, Siarhei Siamashka wrote:
> [...]
>
> Total size for JPEG 9             : 12941527 (~12.3 MiB)
> Total size for WebP 0.2.1         : 11358510 (~10.8 MiB)
> Total size for patched WebP 0.2.1 : 11245066 (~10.7 MiB)
> Total size for thorfdbg JPEG-LS   : 11195469 (~10.7 MiB)

Yeah, this is why I released the report on a wiki. :)

I've added results for JPEG-LS to my analysis:
http://www.libjpeg-turbo.org/About/SmartScale-Lossless

I did find one case in which jpeg-9 produced a 12% better compression ratio (but took twice as long to do it) and one case in which it compressed 18% faster (but also 18% worse). In general, I find no compelling reason to use lossless SmartScale over JPEG-LS.
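For clarity, the relative-difference figures in the message above ("12% better compression ratio", "took twice as long") follow from simple ratios. The sizes and times below are illustrative placeholders, not the actual measurements from the SmartScale-Lossless report:

```python
def percent_diff(a, b):
    """Signed relative difference of a vs. b: negative means a is smaller."""
    return (a - b) / b

# Illustrative placeholder numbers, NOT the report's actual data:
jpeg9_size, jpegls_size = 880_000, 1_000_000   # output bytes for one image
jpeg9_time, jpegls_time = 2.0, 1.0             # encoding time in seconds

# A "12% better compression ratio" means the output was 12% smaller...
print(f"size: {-percent_diff(jpeg9_size, jpegls_size):.0%} smaller")
# ...at the cost of taking twice as long to encode:
print(f"time: {jpeg9_time / jpegls_time:g}x as long")
```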
From: DRC <dco...@us...> - 2013-01-16 23:22:00
On 1/16/13 9:59 AM, Guido Vollbeding wrote:
>> Those replies from IJG is enough for me not to include post-jpeg6
>> API/ABI in Fedora. [...]
>
> Thank you for confirming that you prefer to violate license conditions
> instead. You could avoid that by using that "libjpeg" by "thorfdbg"
> instead of an illegal IJG libjpeg clone. But that you don't do.
> Insane people!
>
> Kind regards
> Guido Vollbeding
> Organizer Independent JPEG Group

Hi, Guido. You keep saying that, but Tom Lane does not consider us illegal, nor do the multiple legal teams that have reviewed our project (including Fedora Legal). So could you please elaborate?

I am really not trying to start a holy war here. libjpeg-turbo has never done anything but respond to community demand. However, long before you started accusing us of being "illegal", you were already calling us "ugly stuff done by ignorant and stupid people" (https://bugzilla.redhat.com/show_bug.cgi?id=639672#c7), and you seemed to be mad at us for creating a fork of libjpeg without asking you. And yet our fork predates jpeg-7 by a few years, so at the time our project started, there was no one to ask but Tom Lane.

Personally, I suspect that this whole "illegal" rant is just your latest incarnation of being mad at us for existing. If you were truly interested in resolving whatever issue you have with our code, then all you ever had to do was ask. Instead, with jpeg-8d, you chose to use your README file to accuse us of illegal activity. Do you really think that this sort of thing is winning people over to your cause? Slander actually *is* an illegal activity. Forking an open source project isn't.
From: DRC <dco...@us...> - 2013-01-17 00:08:32
On 1/16/13 9:59 AM, Guido Vollbeding wrote:
> Thank you for confirming that you prefer to violate license conditions
> instead. [...]

Not that I believe responding any further is likely to be a fruitful exercise, but I'll do it anyway, because on some level, I hope that perhaps you will process at least a little bit of this and maybe waste less of your life tilting at windmills.

Tom Lane made JPEG popular. You built upon his work, and so did we. The IJG's founding principle is to encourage the industry to converge on a common standard JPEG format, not to push out a new, non-standard one. By making baseline JPEG much faster, libjpeg-turbo builds upon Tom's original charter: we are taking the common standard JPEG format and making it useful to classes of applications that previously could not use JPEG for lack of performance.

Introducing a new format (which SmartScale is) is as much an exercise in politics as it is in technology, and even the smartest person in the world, without the ability and desire to convince others that their ideas are correct, will ultimately convince no one but themselves. You seem to take the position that the superiority of SmartScale is self-evident, but it isn't, and the reason it isn't has nothing to do with any corruption or insanity or stupidity on the part of the rest of the world. It has to do with the fact that there is not sufficient evidence to support your claims, and whenever someone pushes for that evidence, you respond with insults and manifestos.

Personally, I think that the SmartScale technology is a clever extension of the JPEG format. If this were academia, it would have made good thesis material. However, this isn't academia, and being clever doesn't mean that other people in other projects haven't been equally (or more) clever. And even if SmartScale were in fact superior (which, quantifiably, it doesn't seem to be), that is still no guarantee of success (google "Betamax").

The truth is, when people read messages from you that take the form of "I'm the only one who understands this technology, and the rest of you are stupid or corrupt", particularly when those messages appear in your official, public project documentation, that alone makes them reluctant to adopt your technology. Just because SmartScale builds upon an existing standard doesn't make it inherently better than other formats. In fact, it makes it worse, because you did not get it accepted as an industry standard before pushing it out via a library that was supposed to be a reference implementation of the common standard JPEG format. You are attempting to use the existing reputation of the IJG and of JPEG to make SmartScale look as though it has the weight of the JPEG standard behind it, but it doesn't. From an end user's point of view, it might as well not even be called "JPEG". It is a new format that you created, and without adoption as an official standard, why would anyone want to use it? We have no assurance that it won't change arbitrarily at some point in the future.

That is all I have to say on the matter.