I'm trying to implement Glenn Fiedler's popular fixed timestep system as documented here:
http://gafferongames.com/game-physics/fix-your-timestep/
in Flash. I'm fairly sure I've set it up correctly, including state interpolation between the previous and current physics states.
The result is that if my character is supposed to move at 6 pixels per physics step, at 35 steps per second (210 pixels per second), it does exactly that, even when the framerate climbs or falls.
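For reference, here is the loop structure as I understand it from the article, written as a self-contained sketch rather than my actual Flash code (the names `frame`, `step`, `DT` and so on are illustrative; in the real thing the frame time comes from getTimer() deltas inside an ENTER_FRAME handler):

```typescript
// Sketch of a Fiedler-style fixed timestep with interpolation.
// All names here are illustrative, not from any real Flash API.
const DT = 1000 / 35;          // fixed simulation step in ms (35 Hz)
const SPEED = 6 / DT;          // 6 px per step -> 0.21 px/ms (210 px/s)

let prevX = 0;                 // state before the most recent step
let currX = 0;                 // state after the most recent step
let accumulator = 0;

// Called once per rendered frame with the measured frame time in ms.
// Returns the position that should actually be drawn this frame.
function frame(frameTimeMs: number): number {
  accumulator += frameTimeMs;
  while (accumulator >= DT) {  // run zero or more fixed-size steps
    prevX = currX;
    currX += SPEED * DT;       // integrate: exactly 6 px per step
    accumulator -= DT;
  }
  // blend the two newest states by the leftover fraction of a step
  const alpha = accumulator / DT;
  return prevX * (1 - alpha) + currX * alpha;
}
```

For constant-velocity motion this renders the position at (elapsed time minus one fixed step), so the average speed works out to exactly 210 px/s regardless of how the individual frame times jitter.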
The problem is it looks awful. The movement is very stuttery and just doesn't look good.
I find that the time between ENTER_FRAME events, which I add onto my accumulator, averages out to 28.5ms (1000/35) just as it should, but individual frame times vary wildly: sometimes an ENTER_FRAME event arrives 16ms after the last, sometimes 42ms.
This means that at each graphical redraw the character graphic moves by a different amount, because a different amount of time has passed since the last draw. In theory this should look smooth, but in practice it doesn't at all. By contrast, if I just use the ultra-simple approach of moving the character 6px every frame, it looks completely smooth, even with the same large variance in frame times.
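To make the "different amount per redraw" concrete, here is the arithmetic (the numbers are the example frame times I measured above; nothing here is from any real API):

```typescript
// Per-redraw displacement is proportional to the measured frame time,
// so jitter in the measured timings maps directly onto the motion.
const SPEED = 210 / 1000;            // 210 px/s expressed in px per ms
const frameTimes = [16, 42, 28.5];   // measured ENTER_FRAME gaps, in ms
const moves = frameTimes.map(t => SPEED * t);
// moves: about 3.36 px, 8.82 px, and 5.985 px between successive redraws
```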
How can this be possible? I'm using getTimer() to measure these time differences; is it even reliable for this?