Hi there
I'm trying to implement a fairly simple algorithm, but it's proving difficult
to get right, and I wondered if there's a standard method that I don't know
about.
We're developing an online demo system within our software that allows one
user (the 'shepherd') to take control of a group of other computers (the
'sheep'). It's important that the software stays in sync, otherwise the
actions the shepherd takes may not replicate correctly. So what we need is a
simple one-way sync algorithm.
What we're currently doing is this:
The shepherd tries to run at the standard frame rate. Every frame, it
broadcasts its (attempted) frame rate to the sheep.
- If the sheep are within a certain range (5-25 frames behind), they use the
  same frame rate as the shepherd.
- If they're less than 5 frames behind, they run a bit faster than the
  shepherd (if they can).
- If they're less than 0 frames behind (i.e. they've got ahead of the
  shepherd), they pause.
- If they're more than 25 frames behind, they try to run as fast as they can,
  and they tell the shepherd.
- If the shepherd knows that some sheep are lagging behind, it starts to slow
  down; otherwise it starts to speed up.
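To make that concrete, here's a rough Python sketch of the logic each side runs every frame. The names and numbers (STANDARD_FPS, CATCH_UP_FACTOR, the 0.95/1.05 multipliers) are just placeholders to show the shape of the feedback loop, not our actual code or values:

    # Illustrative sketch only; thresholds and multipliers are made-up placeholders.

    STANDARD_FPS = 30         # shepherd's target frame rate (placeholder value)
    CATCH_UP_FACTOR = 1.1     # sheep run a bit faster when only slightly behind (placeholder)

    def sheep_frame_rate(shepherd_fps, frames_behind):
        """Choose the sheep's frame rate from how far it trails the shepherd."""
        if frames_behind < 0:
            return 0                               # ahead of the shepherd: pause
        elif frames_behind < 5:
            return shepherd_fps * CATCH_UP_FACTOR  # slightly behind: run a bit faster
        elif frames_behind <= 25:
            return shepherd_fps                    # within range: match the shepherd
        else:
            return float("inf")                    # badly behind: run flat out
                                                   # (and report the lag to the shepherd)

    def shepherd_frame_rate(current_fps, any_sheep_lagging):
        """Shepherd slows while any sheep reports lag, otherwise speeds back up."""
        if any_sheep_lagging:
            return current_fps * 0.95              # back off a little
        return min(current_fps * 1.05, STANDARD_FPS)  # creep back toward normal

In the real thing, any_sheep_lagging is driven by the "more than 25 frames behind" messages the sheep send back to the shepherd.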
This is working pretty well, but we're getting quite a lot of 'bungee'
effects (oscillation, I guess it would be called), where instead of settling
at a reasonably stable speed, it alternately slows down and speeds up.
Also, when the shepherd is running slower than the sheep, it's proving
difficult to stop the lag from dropping below 0.
Like I say, I'm sure this is a common problem with a standard algorithm -
anyone know, or have any suggestions about improvements to my existing
system?
Best
Danny