I've stumbled on a problem these days while programming a small simulation. I've read a few articles along the lines of "Fix Your Timestep!" and arrived at the following loop:
 
0. numTotalUpdates = 0;
1. fixedDt = 1.0f / 400.0f; // how small is enough?
2. while (IsRunning())
{
3.     DispatchEvents();

4.     static float accumulator = 0.0f;
5.     numUpdatesThisFrame = 0;
6.     accumulator += GetFrameTime(); // time since last update

7.     while (accumulator >= fixedDt)
       {
8.         UpdateSimulation(fixedDt);
9.         accumulator -= fixedDt;   // consume one fixed step
10.        numUpdatesThisFrame++;
       }
11.    numTotalUpdates++;
12.    renderInterpolator = accumulator / fixedDt; // safe to say it's in 0..1
13.    Render(renderInterpolator);
}
 
As far as I know, this is a physics-engine-friendly main loop, because physics engines approximate integration much better with small, fixed time steps.
As I said, I was running a simple simulation of a circle that hit obstacles and was reflected, as expected. There was actually more going on on screen, but the main idea of the simulation is a circle in a 2D environment colliding with things, breakout-game style.
I repeatedly ran the simulation with VSync on and observed an annoying stuttering/choppy movement of the circle. The stuttering seemed almost random. After some profiling I found out that numUpdatesThisFrame was varying oddly: it would jump from 6 or 7 up to 15 or even 24. So I changed line 6 to:
 
6'. accumulator = some_constant;
 
and ran the simulation again with VSync on. The stuttering was gone, and since the value of some_constant was tweaked for my processor's speed, the motion was neither too slow nor too fast.
 
So something was occasionally causing large gaps between frames, which implied an unnaturally large numUpdatesThisFrame. The framerate was very high though: without VSync I was getting a few thousand frames per second, so I told myself that digging into why the time between frames was sometimes unnaturally large was not the way to solve the problem.
 
I thought the way to tackle this issue was to compensate for those random large frame times with additional code in the main loop. I've never dealt with the problem of fixed time steps and a varying framerate before, so I'm not quite sure what that code should look like.
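One thing I imagined is clamping the measured frame time before it goes into the accumulator, so that a single long frame can't trigger a burst of updates. A rough sketch of what I mean, reusing the names from the loop above (maxFrameTime is a constant I just made up):

    float frameTime = GetFrameTime();
    const float maxFrameTime = 0.25f; // made-up cap on how much time one frame may contribute
    if (frameTime > maxFrameTime)
        frameTime = maxFrameTime;     // drop the excess instead of simulating it
    accumulator += frameTime;

I don't know whether discarding time like this is acceptable, though.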
 
Another idea I had was to first detect when a big shift occurs and then try to correct it.
The numbers of updates per frame form a series of integer numbers. The series is unlikely to be convergent in the strict mathematical sense, but I think it's intuitive that its values cluster around some fixed value. I can calculate an approximation of this value with an arithmetic mean, like so:
 
approximateConvergenceValue = Sum(sequence[i]) / numTotalUpdates, 1<=i<=numTotalUpdates;
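In code this doesn't need the whole series stored, just a running sum; a minimal sketch, with names of my own choosing:

    // Running arithmetic mean of updates-per-frame.
    // Call RecordFrame() once per rendered frame, after the inner update loop.
    float updatesSum = 0.0f;

    void RecordFrame(int numUpdatesThisFrame)
    {
        updatesSum += (float)numUpdatesThisFrame;
    }

    float ArithmeticMean(int numTotalUpdates)
    {
        return (numTotalUpdates > 0) ? updatesSum / (float)numTotalUpdates : 0.0f;
    }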
 
A better approximation of this value would be to use a quadratic mean like so:
 
approximateConvergenceValue = Square_Root(Sum(sequence[i]*sequence[i]) / numTotalUpdates), 1<=i<=numTotalUpdates;
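The same running-sum trick works for the quadratic mean, summing squares instead (again, the names are my own):

    #include <cmath>

    float updatesSquaredSum = 0.0f;

    void RecordFrameSquared(int numUpdatesThisFrame)
    {
        const float u = (float)numUpdatesThisFrame;
        updatesSquaredSum += u * u;
    }

    float QuadraticMean(int numTotalUpdates)
    {
        return (numTotalUpdates > 0) ? std::sqrt(updatesSquaredSum / (float)numTotalUpdates) : 0.0f;
    }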
 
This is pretty close to the most frequent values of numUpdatesThisFrame, so I guess it's a good approximation. What puzzles me is how to apply this detection to correct the frame times while also keeping approximateConvergenceValue a good approximation. Ideally the simulation would self-adapt using this calculation, minimizing the visible effect of the adaptation on screen.
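The detection half I can picture, something like this (the threshold is completely arbitrary):

    // Flag a frame as a "big shift" when its update count strays too far
    // above the approximate convergence value.
    bool IsBigShift(int numUpdatesThisFrame, float approximateConvergenceValue)
    {
        const float threshold = 2.0f; // arbitrary tolerance; how far from typical counts as a shift?
        return (float)numUpdatesThisFrame > approximateConvergenceValue + threshold;
    }

It's the correction half that I can't figure out.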
 
So how does one usually deal with fixed time step loops and a varying frame rate?