Re: [Algorithms] In-place loaded data structures.
From: Rok E. <ro...@si...> - 2005-12-02 03:47:53
>requires a different tool to be authored though, or some manual
>effort laying out the disc, or an alternative file handler that
>redirects every file read out to a "stream" file or something
>to that effect.

Yeah, that's pretty much how we do things (a single large file that is
treated as a "stream" most of the time).

>> Start seeking the head, and you're finished. ;)
>
>Which in particular goes for one recent handheld platform.

I've actually been considering pregenerating "random" sets of enemy cars
and storing them on disc as contiguous batches, since seek times are
such a pain there...

>If you want to stream data, if you want to have memory changing
>while you play the game, you really should (IMO) investigate
>means to allow you to do this stuff non-deterministically.
>Fragmentation /will/ kill you if you allow it to occur.

On that I completely agree, which is one reason I've found this thread a
particularly interesting read. It's also something I've been
investigating for future projects.

>Then that Crash Bandicoot game came out in the first batch of
>games for the then-new PS2, and what did it have, 60-second
>load times? Holy hell.

Heh, I originally realized this one on our first CD test - when the
level took 10 minutes to load ;) That's an anecdote I keep telling
people when trying to explain WHY raw drive transfer speed has very
little to do with load times in most games. In the final build we got
the load time down to ~7 seconds, and that loads roughly 2x the raw data
the original 10-minute version did ;) (and it's LZW-compressed, which
the original data wasn't either).