RE: [Algorithms] Game loop timings
From: Tom P. <ga...@fa...> - 2005-08-23 21:09:13
> Can you explain how I take the rendering time and work this in
> to the AI update time? I'll try something here.
> Currently, I have updates like this for the AI. (all multiples
> of 20fps)
> 0.05 seconds
> 0.05
> 0.05
> 0.10
> 0.05
> 0.05
> etc.
>
> The 0.10 slips in because gradually the applied timer gets out
> of sync with the real amount of time passed and that must be
> made up at some point.

My take on this, given the behavior as you describe it, is that you just need to separate yourself somewhat from the concept of "real" time. Time is what you decide it is, so there's no harm keeping it at 1/20th of a second if your target speed really is 20 fps. As long as all of your updates are consistent, and as long as everything uses the same value, it should be fine.

What you want to avoid, and what you're currently doing, is accumulating error and then cutting it out whenever it grows large enough, regardless of what the actual frame times were. Instead, just don't worry about the error. You want the apparent time passed to be reasonably consistent with the actual time passed, otherwise you'll get jerky motion.

I would disagree with TomF's assertion that variable framerates are easier. I think the fixed per-frame time is significantly easier to code up and put into play. Personally, I prefer variable time if you are truly running at arbitrary framerates, but on the platforms I work on the timestep is always going to be 1/60th of a second or some multiple thereof. When the framerate cuts from 60fps to 30, all of the timesteps magically get larger. Note that this change may be significant, though, and it may then be desirable to run two 60Hz updates instead of worrying about missed collisions etc. that may start appearing when your timestep doubles.

One problem I thought of the other day is that I've typically implemented and used systems where the timestep applied is the time it took the last frame to render. Unfortunately, that means if our framerate is really jittery on a frame-to-frame level, we'll get really jerky motion. E.g. if we literally alternate 1/60 and 1/30 second frames, then we're rendering actions that should be separated by 1/60 of a second 1/30s apart, and also rendering things that should represent 1/30s only 1/60s apart. Is this fear rational? In practice probably not, since our framerate is going to be relatively consistent frame to frame, but still... Hmm...

-tom!
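
[Editor's note: below is a minimal sketch of the fixed-timestep accumulator approach described above, where the simulation always advances in whole 1/60 s steps and a frame that takes roughly 1/30 s simply runs two updates. The function names, the 60 Hz target, and the step clamp are illustrative assumptions, not taken from the original post.]

    // Fixed-timestep loop sketch: real elapsed time is consumed in whole
    // kDt slices, so every update sees exactly the same timestep and no
    // error accumulates between the "applied" timer and real time.
    #include <chrono>

    constexpr double kDt = 1.0 / 60.0;      // fixed simulation timestep, seconds
    constexpr int    kMaxStepsPerFrame = 4; // clamp to avoid a spiral of death

    void UpdateSimulation(double dt) { /* advance AI, physics, etc. by dt */ }
    void Render()                    { /* draw the current state */ }

    int main()
    {
        using Clock = std::chrono::steady_clock;
        auto   previous    = Clock::now();
        double accumulator = 0.0;

        for (int frame = 0; frame < 1000; ++frame)   // stand-in for the real loop
        {
            const auto now = Clock::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // A frame that took ~1/30 s runs two 60 Hz updates here instead
            // of one, rather than doubling the timestep handed to the sim.
            int steps = 0;
            while (accumulator >= kDt && steps < kMaxStepsPerFrame)
            {
                UpdateSimulation(kDt);
                accumulator -= kDt;
                ++steps;
            }
            if (steps == kMaxStepsPerFrame)
                accumulator = 0.0;  // drop the backlog after a huge hitch

            Render();
        }
        return 0;
    }

The clamp and backlog drop keep one very slow frame from scheduling an unbounded number of catch-up updates, which would otherwise make the following frames slow as well.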