From: Chen Rifiz [mailto:rifiz200@...]
>It seems that at least I made one mistake here.
>For size of 10000 x 10000 PNG, its size should be (8 bits grayscale is used):
>10^8 bytes = 100 MB approximately.(not huge.....)
Well, no - you could call 100MByte huge, but that's the size of your dataset, not of libpng's working memory. libpng can read that amount of data, organized in the way you describe, with a lot less memory - probably less than 256kbytes in practice (I was being pessimistic.)
libpng needs to keep two rows of the data in memory while reading - in your case that's an extra 20kbytes over the base amount. My recent tests on my own (x86) system give a base memory requirement of 180kbytes, so that's a total of about 200kbytes. In practice allocating 200kbytes of memory will invariably consume somewhat more address space, but not much more (in the worst case I've seen it required twice as much - 400kbytes - but unless you are using a very old educational UNIX system you won't see that.)
>1. read in all rows at once.
>2. process the WHOLE data.
Don't do that. Always read row by row unless you can guarantee that the image is small (see below.)
>It is a processing-all scheme. And it can not go back and forth.......
>But if another scheme such as query/process when needed:
That's a database or a hierarchical image - PNG doesn't do either. You can build a database, or a hierarchical structure, and use PNG to compress the sub-elements (the tiles), but PNG isn't designed or intended to help with that problem.
>I'm also curious that how does the other PNG supported programs handle
>big, big, big PNG files and partially show the image data within the visible range??
They don't. PNG isn't used for this. PNG might be used when such data is serialized for transmission, but at the other end of the transmission it will be decompressed from PNG and converted to a usable format. Conceivably a sophisticated display system might use tiles of image data individually compressed with PNG. Applications storing 2D data that must be compressed without loss (as you describe) might use the same scheme, but I'm not aware of any such system.
Microsoft Office stores most non-JPEG bitmap image data in its file format as PNG, but for display it caches an uncompressed screen-resolution rendering of the image. It's not exactly the same problem, but it illustrates the technique.
You might use PNG to store all your data on disk, or even in memory (since the PNG will probably be a lot less than 100MByte), then, on demand, extract tiles of data (say 128x128 points) and keep as many of those in memory as possible to allow random access. You might even compress the 128x128 tiles on the fly using PNG to save memory. That may seem strange, but if you tweak the compression to use a small zlib window and the lowest real zlib compression setting (1, not 0 - 0 stores the data uncompressed) compression becomes about as fast as decompression.
You may well find that simply storing 128x128 tiles of data, PNG compressed, gives a result just as small as compressing the whole dataset as one unit. It is certainly worth performing the experiment.
John Bowler <jbowler@...>