RE: [Algorithms] Algorithm performance benchmarks and timings
From: Jay S. <Ja...@va...> - 2004-04-04 19:48:21
> I am looking for 'benchmarks' or criteria on how much
> processor power/time an algorithm can take in games. Is there
> any meaningful thing to say about that?

I don't think there's very much you can say about that topic in
general. It all depends on the tradeoffs that are appropriate for the
product/technology.

A game targeting 60Hz on a PC has 16.67ms/frame to spend on its
minimum platform; at 30Hz you've got 33.33ms/frame. How much of that
do you consume on some specific platform? How important is it relative
to the rest of the product? Is the processor consumption constant, or
does it spike? What's the worst-case performance? How likely is it
that the usage will spike at the same time other technologies spike?
Does the cost scale with simulation time, or is it fixed? What's the
quality improvement/loss for spending more/less CPU on this task?
What are the alternative algorithms? What other tasks could you do
instead? Can you invest in optimization and improve it? How much
improvement is likely? What does it cost?

Those are the kinds of questions I ask when budgeting CPU time for
algorithms. How I'd then use that information depends directly on
what's important to my game. It's just engineering at that point.

It's like asking how much programmer time you could spend writing an
algorithm. How can you answer that in general?

Jay