From: Damian M. <da...@er...> - 2001-12-23 08:44:58
I need to increase the speed of loading big, multi-megabyte VRML files. They are on the order of 5MB, and they in turn include another 30MB of related VRML files, URLs for textures, and the like. To this end, I propose to change the size of the file buffers used by the classes Doc and Doc2. Correct me if whatever follows is garbage.

I think Doc is fairly easy. I have added an extra member to the class which is an internal buffer; when a file is opened with ::fopen, I simply allocate a lot of space and, as with any STDIO file, attach it with setvbuf. I'll make it easy to change this size at a later stage.

Doc2 is a little more difficult. It uses 'streambuf' and 'istream' but replaces their open functions; effectively, in 'filebuf::open' it simply calls the global gzopen from ZLIB, i.e.

    gzFile gzopen(const char *path, const char *mode);

Then, with the STDIO file this->file, we new up some more buffer space, assign that space to the FILE with another setvbuf, and away we go.

This looks too simple. Am I getting in the way of other logic? Have I missed something? Does the code go elsewhere if the file is not gzipped?

In the past, I have generally done my own big buffering and stayed away from IOSTREAMs whenever performance was an issue, so my experience in this area is a little sparse.

Thanks - Damian (McGuckin)

Pacific Engineering Systems International, 22/8 Campbell St, Artarmon NSW 2064
Ph:+61-2-99063377 .. Fx:+61-2-99063468 | unsolicited email not wanted here !
Views and opinions here are mine and not those of any past or present employer