From: Bryce L N. <bno...@fs...> - 2005-05-23 17:14:36
geo...@li... wrote on 05/23/2005 02:38:03 AM:

> I noticed that this 'test-data' directory contains about 20 Mb of image
> data. This led me to a question: is committing such an amount of binary
> data to the SVN a problem? If it is not, I would no longer need to
> bother people with my Maven question ("Maven question: how to bundle
> external resource in a JAR?", sent in a mail a few days ago) and could
> just commit the 8 Mb HSQL database, in which case the HSQL plugin would
> be finished and available tomorrow.

I've been avoiding large images for the GeoTIFF tests. Image data is
irrelevant for my test cases: I'm not writing an image codec, I'm writing
a metadata adaptor. I can sorta see how large files would be necessary to
check a homespun codec like ArcGrid (though anything that large just
should not be stored in ASCII). For BMP/(Geo)TIFF/JPEG/JPEG 2000,
however, the tests should probably confine themselves to checking that
the CRS is constructed correctly.

20 Mb? No wonder it takes forever to check out! Ouch.

I don't know whether this addresses your question, though, Martin.

Bryce
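The metadata-only testing idea above can be sketched without any large
fixture at all. This is a minimal illustration using plain javax.imageio
(not the actual GeoTools test API, which isn't shown in this thread): it
writes a tiny 1x1 TIFF entirely in memory, then opens a reader and
inspects only the image metadata, never decoding pixel data. The class
name and the use of ImageIO's bundled TIFF plugin (JDK 9+) are
assumptions for the sketch.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;

// Hypothetical example class, not part of GeoTools.
public class MetadataOnlyTest {
    public static void main(String[] args) throws Exception {
        // Build a tiny synthetic TIFF in memory; a metadata test
        // needs a valid header, not megabytes of pixels.
        BufferedImage img = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "TIFF", out);
        System.out.println("fixtureBytes=" + out.size());

        // Open a reader and fetch only the metadata; reader.read(0)
        // (pixel decoding) is deliberately never called.
        ImageInputStream in = ImageIO.createImageInputStream(
                new ByteArrayInputStream(out.toByteArray()));
        Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
        ImageReader reader = readers.next();
        reader.setInput(in);
        System.out.println("metadataFormat="
                + reader.getImageMetadata(0).getNativeMetadataFormatName());
        reader.dispose();
    }
}
```

In a real GeoTIFF test the metadata tree would then be checked for the
expected CRS parameters; the point is that the fixture stays a few
hundred bytes rather than 20 Mb.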