I'd like to compress PDFs, but only by removing duplicate resources (fonts and images), not by re-compressing already-compressed content, because the process is time-critical.
In a test, an 8 MB PDF with many duplicate fonts and images was reduced to 4 MB, but it took 33 seconds on a decent machine. I suppose that's because all compressed objects are decompressed and re-compressed with Flate. My assumption is that dropping duplicate resources alone could be done in 5-10 seconds, which would be acceptable for our case.
I didn't find anything in the documentation or the forum on that.
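To illustrate the idea, here is a minimal sketch of duplicate-resource detection that never decompresses anything: it hashes the raw, still-compressed stream bytes, so identical fonts or images collapse to one canonical object. The `dedupe_streams` helper and its input shape are hypothetical, not part of any library's API; a real pass would also rewrite indirect references to the removed duplicates.

```python
import hashlib

def dedupe_streams(objects):
    """Map duplicate stream objects onto one canonical copy.

    `objects` maps object IDs to their raw, still-compressed stream
    bytes. Streams are only hashed, never decompressed, so the pass
    stays cheap. Returns a remap table {duplicate_id: canonical_id};
    references to each duplicate can then be redirected to the
    canonical object and the duplicate dropped from the file.
    """
    canonical = {}  # content digest -> first object ID with those bytes
    remap = {}
    for obj_id, raw in objects.items():
        digest = hashlib.sha256(raw).digest()
        if digest in canonical:
            remap[obj_id] = canonical[digest]
        else:
            canonical[digest] = obj_id
    return remap

# Example: objects 1 and 3 carry byte-identical font streams.
objects = {1: b"font-A-bytes", 2: b"image-B-bytes", 3: b"font-A-bytes"}
print(dedupe_streams(objects))  # {3: 1}
```

Because only a hash pass over the raw bytes is needed, the cost scales with file size rather than with decompression and Flate re-encoding, which is why a 5-10 second target seems plausible.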
That's a good idea. It shouldn't be difficult to skip the decompression and recompression steps and see how much that affects the size of the result and the run time. I'll put it on my queue.