libpng-1.6.32 attempts to calculate the maximum reasonable size for an IDAT chunk in pngrutil.c:png_check_chunk_length(), but it appears to assume the data was generated by zlib or some other "reasonable" compressor that emits minimal overhead. However, the PNG standard does not (as far as I can tell) impose any such requirement; indeed, https://www.w3.org/TR/PNG/ section 15.2.3(k) states that a conformant decoder must "assume[] no more than that the complete image data is represented by a single compressed datastream that is stored in some number of IDAT chunks".
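To illustrate the assumption, a check of roughly this shape (a sketch with made-up names and constants, not libpng's actual code) only admits the small expansion a typical compressor produces, so a conformant stream built from stored blocks can exceed it:

```c
#include <stdint.h>

/* Hypothetical sketch of an overhead-based sanity check on the total
 * IDAT length.  The limit is tuned to zlib-like behaviour (compare
 * zlib's deflateBound(): a tiny fraction of the input plus a small
 * constant), so a stream made of many stored ("uncompressed") deflate
 * blocks, which is perfectly valid PNG, can fail the check. */
static int
idat_length_looks_sane(uint64_t idat_total, uint64_t raw_image_bytes)
{
   uint64_t limit = raw_image_bytes + (raw_image_bytes >> 12) + 64;
   return idat_total <= limit;
}
```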
An example failure case is an image in which each row of the image is stored in a separate uncompressed ("stored") deflate block (see attached file).
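Such a stream can be produced with zlib along these lines (a hypothetical helper, not from the attached file's generator; "rows" here means the filtered scanlines, each beginning with its filter byte, and output-buffer sizing and error handling are omitted):

```c
#include <string.h>
#include <zlib.h>

/* Sketch: compress filtered PNG rows so each row lands in its own
 * stored ("uncompressed") deflate block.  Level 0 makes deflate emit
 * only stored blocks, and Z_FULL_FLUSH after each row forces a block
 * boundary, costing an extra 5-byte block header per row. */
static size_t
deflate_rows_stored(unsigned char *rows, size_t rowbytes, int height,
                    unsigned char *out, size_t outlen)
{
   z_stream z;
   int y;

   memset(&z, 0, sizeof z);
   deflateInit(&z, 0);                /* level 0: stored blocks only */
   z.next_out = out;
   z.avail_out = (uInt)outlen;

   for (y = 0; y < height; y++)
   {
      z.next_in = rows + (size_t)y * rowbytes;
      z.avail_in = (uInt)rowbytes;    /* assumes rowbytes fits in uInt */
      deflate(&z, y + 1 < height ? Z_FULL_FLUSH : Z_FINISH);
   }

   outlen -= z.avail_out;             /* bytes actually written */
   deflateEnd(&z);
   return outlen;
}
```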
I understand the sense in reporting unexpectedly large IDAT chunks, but as long as the size doesn't violate user-specified chunk size limits, this should probably be a warning rather than an error.
I'm not seeing a failure to read the attached image with libpng-1.6.33beta01 or 1.6.32beta12. I do see your point, though. I suppose we could use a more generous upper limit of one deflate block per sample.
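For what it's worth, a back-of-envelope version of that bound (illustrative names and arithmetic only, not a patch): each stored deflate block costs a 5-byte header on top of its payload, and the zlib wrapper adds 2 (header) + 4 (Adler-32) bytes per stream.

```c
#include <stdint.h>

/* Hypothetical worst-case IDAT size assuming one stored deflate block
 * per sample: 5 bytes of block header per block, plus the 6-byte zlib
 * stream wrapper, plus one filter byte per row. */
static uint64_t
worst_case_idat(uint64_t width, uint64_t height,
                uint64_t channels, uint64_t bytes_per_sample)
{
   uint64_t samples   = width * height * channels;
   uint64_t raw_bytes = samples * bytes_per_sample + height;
   return raw_bytes + 5 * samples + 6;
}
```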
It looks like SourceForge re-encoded the attachment. I've unfortunately already deleted my copy, but I can regenerate it if you need it.
Yes, please regenerate the PNG and upload it as idat_too_large.png.tar.gz so SF doesn't re-encode it.
Attached.
I have attached another image that reproduces the bug.
Fixed in the git libpng16 branch; please test. It sets a larger limit for IDAT, allowing one deflate buffer per row. It also honors the PNG_USER_CHUNK_MALLOC_MAX user limit, so applications can set an even larger limit if they need one.
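For applications that still hit the limit, the user cap can be raised or removed at run time with the existing user-limits API; a minimal sketch, assuming libpng is built with PNG_SET_USER_LIMITS_SUPPORTED:

```c
#include <png.h>

/* Raise or remove the per-chunk memory cap that backs
 * PNG_USER_CHUNK_MALLOC_MAX; 0 is treated as "no user limit". */
static void
allow_large_idat(png_structp png_ptr)
{
#ifdef PNG_SET_USER_LIMITS_SUPPORTED
   png_set_chunk_malloc_max(png_ptr, 0);
#endif
}
```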
git eb2f42a does indeed fix the problem for me. Thanks for the fix.
This bug is fixed in all branches and in the libpng "rc01" releases. Public releases are planned for 28 September 2017.