Multiplayer Network Game - Interpolation and Frame Rate
- by J.C.
Consider the following scenario:
Let's say, for the sake of example and simplicity, that you have an authoritative game server that sends state to its clients every 45 ms. The clients interpolate state with an interpolation delay of 100 ms. Finally, the clients render a new frame every 15 ms.
When a state update arrives on the client, the client time is set from that incoming update. Each time a frame renders, we take the render time (client time minus interpolation delay) and identify a previous and a target state to interpolate between. To calculate the interpolation amount/factor, we divide the difference between the render time and the previous state's time by the difference between the target state's time and the previous state's time:
var factor = (renderTime - previousStateTime) / (targetStateTime - previousStateTime);
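For context, here is a minimal sketch of the whole interpolation step in TypeScript. All of the names (StateSnapshot, stateBuffer, onServerUpdate, renderFrame) are placeholders of my own, not from any particular engine:

interface StateSnapshot {
  time: number; // server timestamp of this state, in ms
  x: number;    // example interpolated value, e.g. an entity's x position
}

const INTERPOLATION_DELAY = 100;         // ms, as in the example
const stateBuffer: StateSnapshot[] = []; // snapshots ordered by time

let clientTime = 0; // set from the most recent server update

function onServerUpdate(snapshot: StateSnapshot): void {
  stateBuffer.push(snapshot);
  clientTime = snapshot.time; // client time comes from the incoming update
}

function renderFrame(): number | undefined {
  const renderTime = clientTime - INTERPOLATION_DELAY;

  // Find the pair of snapshots that straddle the render time.
  for (let i = 0; i < stateBuffer.length - 1; i++) {
    const previous = stateBuffer[i];
    const target = stateBuffer[i + 1];
    if (previous.time <= renderTime && renderTime <= target.time) {
      const factor = (renderTime - previous.time) / (target.time - previous.time);
      return previous.x + (target.x - previous.x) * factor; // lerp
    }
  }
  return undefined; // no bracketing pair yet (still buffering)
}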
Problem: In the example above, we effectively display the same interpolated state for three consecutive frames, because the client time (and therefore the render time) only changes when the next server update arrives and resets it.
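To make the stall concrete, here is a tiny simulation with hypothetical local timestamps that match the numbers above:

// clientTime is only set when an update arrives, so renderTime is frozen
// for every frame rendered in between.
let clientTime = 1000;           // set by the update that arrived at local t = 0 ms
const INTERPOLATION_DELAY = 100; // ms

for (const localTime of [0, 15, 30]) { // the three frames before the next update at t = 45
  const renderTime = clientTime - INTERPOLATION_DELAY;
  console.log(`frame at t=${localTime}: renderTime=${renderTime}`); // 900 every time
}
// The next update sets clientTime = 1045, so renderTime jumps from 900 to 945:
// the displayed state advances in 45 ms steps even though we render every 15 ms.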
The rendering is mostly smooth, but there is a noticeable jaggedness to it.
Question: Given the example above, I'd expect the interpolation amount/factor to increase with each rendered frame to smooth out the movement. Should it, and if so, what is the best way to achieve that given the setup described above?
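To illustrate the idea, here is a sketch of one approach I've been considering: advance the client clock locally by each frame's delta between server updates, so the render time (and with it the factor) moves forward every frame. This is an untested sketch with placeholder names (onFrame, lastFrameTimestamp), not something I know to be correct, which is why I'm asking:

const INTERPOLATION_DELAY = 100; // ms

let clientTime = 0;          // still re-synced from each server update
let lastFrameTimestamp = 0;  // local wall-clock time of the previous frame, in ms

function onServerUpdate(stateTime: number): void {
  clientTime = stateTime; // re-sync to the server's timeline
}

function onFrame(nowMs: number): void {
  const frameDelta = nowMs - lastFrameTimestamp; // ~15 ms in the example
  lastFrameTimestamp = nowMs;

  clientTime += frameDelta; // advance locally so renderTime moves each frame

  const renderTime = clientTime - INTERPOLATION_DELAY;
  // ...look up the previous/target states and compute the factor as before;
  // the factor now grows each frame instead of holding steady for three frames.
}

One thing I'm unsure about is the hard re-sync in onServerUpdate: if the local clock has drifted, snapping clientTime back to the server's value could reintroduce a visible pop.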