Re: [Algorithms] Complexity of new hardware
From: Jarkko L. <al...@gm...> - 2009-04-20 08:48:42
IMO, a fundamental problem with this is that it's difficult to guarantee that all the data is converted to the new format. If there happens to be some older data out of the reach of the data conversion and you remove the conversion code, the data gets corrupted. Overall, I just think it's an unnecessary maintenance burden for developers to take care of something as mundane as data conversion. For the final build you obviously have all the data available, so you can update the data to the most recent version and strip the type information from it. You probably want to process the data anyway to linearize it for DVD streaming, swap the endianness, etc.

Cheers, Jarkko

-----Original Message-----
From: Adrian Bentley [mailto:ad...@gm...]
Sent: Monday, April 20, 2009 3:08 AM
To: Game Development Algorithms
Subject: Re: [Algorithms] Complexity of new hardware

If you can break off a script by diffing two sets of type metadata, is that hard-coded? It would be unnecessary maintenance work, unless you want to be able to change your data and can't do it within your framework :P. Maybe there's a way I haven't thought of. Every time I look at C# serialization, I've come up with a number of things I wouldn't be able to do very easily (rename B, break A into 2 pieces, etc.).

I certainly hadn't thought about it from a middleware perspective, but all middleware needs a data upgrade path between versions. Otherwise you get stuck in time and become another point in favor of rolling one's own tech.

Unless I'm missing something, you can't strip off all the class/type information unless all your data is in the final format. At that point, you've solved your upgrade problem. Linearly allocating _definitely_ helps. For some, though, even running simple serialization code may not be fast enough.

Cheers, Adrian
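The two-phase scheme discussed above (versioned data plus conversion code during development, fully converted data with the version tag stripped for the final build) might look roughly like this. This is a minimal sketch under assumed details: the `Entity` layout, the version numbers, and the blob format are all hypothetical, not anything from the thread.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical game object: v1 stored only a position; v2 added health.
struct Entity {
    float x = 0.0f, y = 0.0f;
    int32_t health = 100;  // added in v2; defaulted when upgrading v1 data
};

constexpr uint32_t kCurrentVersion = 2;

// Development build: each blob carries a leading version tag, and the
// loader upgrades old data on the fly (the "conversion code").
Entity LoadDev(const std::vector<uint8_t>& blob) {
    uint32_t version;
    std::memcpy(&version, blob.data(), sizeof(version));
    Entity e;
    std::memcpy(&e.x, blob.data() + 4, sizeof(float));
    std::memcpy(&e.y, blob.data() + 8, sizeof(float));
    if (version >= 2) {
        std::memcpy(&e.health, blob.data() + 12, sizeof(int32_t));
    }  // v1 data: health keeps its default -- this branch IS the upgrade step
    return e;
}

// Final build: the offline data pipeline has already converted everything
// to kCurrentVersion and stripped the tag, so loading is a straight copy
// of the current layout (and the same pass could linearize for streaming,
// swap endianness, etc.).
Entity LoadFinal(const std::vector<uint8_t>& blob) {
    Entity e;
    std::memcpy(&e, blob.data(), sizeof(Entity));
    return e;
}
```

This also illustrates Jarkko's warning: once `LoadDev`'s version branch is deleted, any v1 blob that escaped the offline conversion is silently misread.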