Be able to affect libpng user_chunk_malloc_max
Hello, I've run into PNG images with very large (~24 MB) XMP metadata chunks. The default chunk size limit in libpng is 8 MB, so GraphicsMagick fails to read them.
Would it be possible to have GraphicsMagick call png_set_chunk_malloc_max somehow? (Maybe also png_set_chunk_cache_max?)
Something like this patch:
diff --git a/coders/png.c b/coders/png.c
index ba2ea76..ce42b5d 100644
--- a/coders/png.c
+++ b/coders/png.c
@@ -1637,6 +1637,9 @@ static Image *ReadOnePNGImage(MngInfo *mng_info,
     png_set_user_limits(ping,
       (png_uint_32) Min(0x7fffffffL, GetMagickResourceLimit(WidthResource)),
       (png_uint_32) Min(0x7fffffffL, GetMagickResourceLimit(HeightResource)));
+#  if PNG_LIBPNG_VER >= 10401 /* png_set_chunk_malloc_max was added in 1.4.1 */
+    png_set_chunk_malloc_max(ping, GetMagickResourceLimit(MemoryResource));
+#  endif
 #endif /* PNG_SET_USER_LIMITS_SUPPORTED */
     (void) LogMagickEvent(CoderEvent,GetMagickModule(),
Not sure if it makes sense to use GetMagickResourceLimit(MemoryResource) as the limit?
On Wed, 20 Feb 2019, Mattias Wadman wrote:
It would be an extraordinarily high limit and a huge jump from the
default, impacting all users. Intentionally compromised PNG files
could take advantage of the weakness.
Most format-specific options are supported via the -define mechanism
so that they can be carefully targeted.
Bob
Yes, I agree. Do you want me to look into adding such a define?
On Wed, 20 Feb 2019, Mattias Wadman wrote:
Well-tested patches are always appreciated.
A recent example pertaining to the PNG coder is
-define mng:maximum-loops=<value>
The PNG coder supports PNG, JNG, and MNG so it would need to be
decided if all of these should support the same feature and if the
same feature should appear with png:, jng:, and mng: prefixes. It is
possible for a definition starting with png: to be used by JNG and MNG
since usage is arbitrary.
Bob
A similar issue with ImageMagick may be useful: https://github.com/ImageMagick/ImageMagick/issues/1301
Last edit: Mattias Wadman 2019-02-20
I downloaded the PNG file pertaining to ImageMagick issue 1301. It has a raw profile which decompresses to 117,048,920 bytes! This makes it clear that using your patch directly would be wrong, since it would make attacks with highly-compressed PNG files very easy.
The error message produced by libpng is not helpful, since it does not indicate the maximum chunk size or the size that the user would need to specify in order to successfully decode the file.
It appears that if the libpng tunable PNG_USER_CHUNK_MALLOC_MAX is set when libpng is built, this would allow creating a libpng with higher limits. In the libpng I am looking at, the limit is set to 8,000,000 bytes (8 MB).
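As a rough sketch of that build-time approach (the override mechanism and value are illustrative and not verified against any particular libpng release; libpng's recommended way to change such settings is its own configuration machinery):

```shell
# Illustrative only: raise libpng's built-in per-chunk allocation cap
# from the 8,000,000-byte default to 200 MB when building libpng.
CPPFLAGS="-DPNG_USER_CHUNK_MALLOC_MAX=200000000" ./configure
make
```

This changes the default for every consumer of that libpng build, which is why a per-decode define in GraphicsMagick is the more targeted solution.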
Mercurial changeset 16601:083c790c4850 provides support for a png:chunk-malloc-max=limit define in order to adjust libpng's chunk size limit.
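With that changeset, usage would look something like the following (the limit value and filenames are examples, not taken from the changeset):

```shell
# Allow PNG chunks up to ~128 MB for this one conversion only
gm convert -define png:chunk-malloc-max=134217728 big-xmp.png out.png
```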
I don't remember now whether we ended up modifying PNG_USER_CHUNK_MALLOC_MAX or calling png_set_chunk_malloc_max in our patched build.
Thanks for adding an option!