From: Andrea A. <aa...@op...> - 2009-01-08 17:07:49
Jody Garnett wrote:
...
> The directory datastore was only intended to reduce the configuration
> pain for GeoServer. You may find it easier / more effective to produce
> a "crawler" that can go through a directory and add any found entries
> to the geoserver catalog.

Hum, I actually believe it makes sense as a data store, as it fits the
concept. I remember when I was working at the municipality of Modena,
people asking other people to pass them a project... and the project was
really just a directory filled with shapefiles and imagery.
From a file-based point of view it makes sense to me: what sits in the
same folder has been considered somewhat related by the user, just as
(geo-)tables in a database.

>> Issues:
>> - a certain datastore can open multiple files (shapefile, property ds),
>>   we want to avoid keeping duplicate datastores around
>
> (Do we have an example of this?)
>
>> - a directory (or worse, a tree) can hold a massive amount of feature
>>   types, there are legitimate scalability/memory consumption concerns.
>
> Can the actual DataStores be created in a lazy fashion? I am more
> worried about DataStores being aggressive and loading things into
> memory on creation ... if DirectoryDataStore can drive requirements
> in this area it would be a good thing.

Hum, no, I have no idea how to handle this in a lazy way. I thought I
could, but thinking more deeply about it, I don't know. Generally
speaking, there is no guaranteed relationship between file names and
feature type names, so in order to know which feature type names are
around, I need to open all the datastores.

The above is needed not only to respond to getTypeNames(), but also for
the more common getFeatureSource(typeName), as I have to know whether
the feature type is around at all, and which datastore to use to
actually grab the feature source.

Cheers
Andrea

-- 
Andrea Aime
OpenGeo - http://opengeo.org
Expert service straight from the developers.
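[Editorial aside: the eager scan Andrea describes can be sketched as below. All names here (FileStore, DirectoryStore, getFeatureSourceStore) are hypothetical and simplified, not the actual GeoTools API; the point is only that building the typeName-to-datastore index forces every per-file store to be opened up front.]

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for a single-file datastore: the type names it
// serves need not match its file name, which is the crux of the problem.
interface FileStore {
    String[] getTypeNames();
}

class DirectoryStore {
    // typeName -> the delegate store that actually serves that type
    private final Map<String, FileStore> index = new HashMap<>();

    DirectoryStore(Map<String, FileStore> filesInDirectory) {
        // No lazy option here: to learn which feature types exist, every
        // per-file store must be opened and asked for its type names.
        for (FileStore store : filesInDirectory.values()) {
            for (String typeName : store.getTypeNames()) {
                index.put(typeName, store);
            }
        }
    }

    String[] getTypeNames() {
        return index.keySet().toArray(new String[0]);
    }

    FileStore getFeatureSourceStore(String typeName) {
        FileStore store = index.get(typeName);
        if (store == null) {
            throw new IllegalArgumentException("Unknown type: " + typeName);
        }
        return store;
    }
}
```

Both getTypeNames() and the lookup behind getFeatureSource(typeName) read from the same index, which is why the constructor cannot defer opening any of the files.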