Re: [Algorithms] In-place loaded data structures.
From: Jamie F. <ja...@qu...> - 2005-11-30 10:29:27
Alen Ladavac wrote:

>>> Hm, I'm not sure why you are having any problems with this? We're
>>> using metadata for almost all content, from small config files to
>>> meshes, textures, animations and levels. (Sounds and video files are
>>> standard formats - wav, ogg...). Saved file overhead scales with the
>>> number of different types, and the smallest files are <1kb.
>>
>> Well, smallest size for us means including the metadata for all the
>> objects. Sure, you could trim some out for limited applications, but
>> in practice for any meaningful application you'd be using almost all
>> the data types.
>
> Yes, that's what I said. The smallest file I was able to find is 960
> bytes, together with all metadata for all objects contained in that
> file. That's just a set of parameters for a weapon.
>
> Perhaps we are talking about slightly different things. We do not store
> metadata for each object in the file, but only once for each _datatype_
> that has an object in this file. So e.g. if you have a model that has
> materials in it, the CMaterial datatype is stored only once. All this
> info is stored in the file's header, and the rest of the file is binary
> data, just as if you were using some fixed-format data file. It's the
> header that describes the objects.

I think we must be talking about different things. We also only store
metadata once for each datatype in Q 1.x, but I'm talking about the
smallest file size which contains all the metadata for all the objects
that *might* be stored in the file, not just the objects that actually
are stored in it, i.e. the total size of all metadata for all object
types. Typically, that's the overhead you'll have, as you will be using
most object types.

>> Even with that tight binding, there's still room to deal with
>> endianness, but not class member changes.
>
> With the above types-in-the-header approach, there is.

I think you've missed my point; of course you can use a metadata system
with versioning.
I'm saying it's a bad idea to have tight coupling between the in-memory
data class structure and the in-file data class structure that doesn't
allow versioning. It looked like that was happening in the specific part
of the OP that I was referring to.

>>> For things that have large blobs of data (vertex arrays or textures),
>>> the system automatically detects when loading an array of PODTs and
>>> just slaps the raw data directly into memory.
>>
>> Absolutely; it's all about having the appropriate primitive data types
>> supported in your loading code, and a big blob of data has to be one of
>> them :)
>
> Actually, you don't need to have specific BLOB datatypes. Any "simple
> type" (int, float, enum...) is considered "raw" if the file's endianness
> matches that of the machine that loads the file. Then the "raw" property
> propagates up the type hierarchy, and you get things like vertex arrays
> automatically detected as blobs. This is all done at load time, but is
> pretty low overhead.

Yes, you could do that. Once you've got to this stage of things, it's
pretty easy to pick basic data types that are appropriate for you. A
vertex array would be unlikely to be stored as a blob anyway (unless
compressed into a binary stream), as the coordinates will likely suffer
from endianness issues across platforms.

Jamie