Client side latency when using prediction

Posted by Tips48 on Game Development See other posts from Game Development or by Tips48
Published on 2014-08-22T18:22:30Z Indexed on 2014/08/22 22:35 UTC

I've implemented client-side prediction in my game: when the client receives input, it first sends it to the server and then immediately acts on it, just as the server will, to hide the appearance of lag. The problem is that the server is authoritative, so when the server sends the Entity's position back to the client, it undoes the effect of the prediction and creates a rubber-banding effect.

For example: Client sends input to server -> Client acts on the input -> Server receives and acts on the input -> Server sends back its response -> The client's predicted movement is undone, because of the latency between server and client
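The prediction step described above might be sketched like this (a minimal illustration only; `InputPacket`, `PredictingClient`, and the movement rule are hypothetical names, not the poster's actual code):

```java
// Hypothetical sketch of client-side prediction: the client sends each
// input to the server, then applies it locally with the same rules the
// server uses, so the player sees the result immediately.
class InputPacket {
    final int tick;     // client tick this input was sampled on
    final float dx, dy; // movement for this tick
    InputPacket(int tick, float dx, float dy) {
        this.tick = tick; this.dx = dx; this.dy = dy;
    }
}

class PredictingClient {
    int tick = 0;
    float x, y; // predicted player position

    // Called once per client tick with the sampled input.
    void onInput(float dx, float dy) {
        InputPacket input = new InputPacket(tick++, dx, dy);
        // connection.send(input); // 1. send to the authoritative server
        apply(input);              // 2. act on it immediately, as the server will
    }

    // Must match the server's simulation rules exactly,
    // or the prediction will drift.
    void apply(InputPacket in) { x += in.dx; y += in.dy; }
}
```

The rubber-banding appears when the authoritative position later overwrites `x`/`y` without accounting for the inputs predicted since that packet was sent.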

To solve this, I've decided to store the game state and inputs for every tick on the client; then, when I receive a packet from the server, I roll back to the game state from the tick the packet was sent and re-simulate the game up to the current point.
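That store-and-replay reconciliation could look roughly like the following (a self-contained sketch under assumed names; `ClientInput`, `ServerState`, and `ReconcilingClient` are illustrative, and only a position is replayed here rather than full game state):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of server reconciliation: keep every unacknowledged
// input, snap to the authoritative state when it arrives, then replay the
// inputs the server has not yet processed.
class ClientInput {
    final int tick;
    final float dx, dy;
    ClientInput(int tick, float dx, float dy) {
        this.tick = tick; this.dx = dx; this.dy = dy;
    }
}

class ServerState {
    final int lastProcessedTick; // newest client input the server has applied
    final float x, y;            // authoritative position at that tick
    ServerState(int lastProcessedTick, float x, float y) {
        this.lastProcessedTick = lastProcessedTick; this.x = x; this.y = y;
    }
}

class ReconcilingClient {
    final Deque<ClientInput> pendingInputs = new ArrayDeque<>();
    float x, y;

    void apply(ClientInput in) { x += in.dx; y += in.dy; }

    void predict(ClientInput in) {
        pendingInputs.add(in); // keep it until the server acknowledges it
        apply(in);             // act on it immediately
    }

    // Instead of letting the server packet undo the prediction, accept the
    // authoritative state and re-apply every input it hasn't seen yet.
    void onServerState(ServerState state) {
        x = state.x;
        y = state.y;
        pendingInputs.removeIf(in -> in.tick <= state.lastProcessedTick);
        for (ClientInput in : pendingInputs) apply(in);
    }
}
```

If the server agrees with the prediction, the replay lands on the same position and nothing visibly moves; only genuine mispredictions cause a correction.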

My questions:

Won't this cause lag? If I'm receiving 20-30 EntityPositionPackets per second, that means I have to run 20-30 re-simulations of the game state every second.

How do I sync the client and server ticks? Currently, I'm sending the millisecond at which the server sent the packet, but I think that adds too much complexity compared with just sending the tick number. The problem with switching to tick numbers is that I have no guarantee the client and server tick at the same rate, for example if the client is a low-end PC.
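One common way to make tick numbers comparable across machines of different speeds is a fixed-timestep loop (the accumulator pattern), so the simulation always advances at a fixed rate regardless of frame rate; this is a generic sketch of that technique, not the poster's code, and the 30 Hz rate is an assumption:

```java
// Hedged sketch of a fixed-timestep loop: simulation ticks advance at a
// fixed rate even if rendering runs faster or slower, so a slow PC renders
// fewer frames but still counts ticks at the same rate as the server.
class FixedTimestepLoop {
    static final double TICK_SECONDS = 1.0 / 30.0; // assumed 30 ticks/sec
    int tick = 0;
    private double accumulator = 0.0;

    // Call once per rendered frame with the real time elapsed since the
    // previous call.
    void frame(double elapsedSeconds) {
        accumulator += elapsedSeconds;
        while (accumulator >= TICK_SECONDS) {
            simulateOneTick();
            accumulator -= TICK_SECONDS;
            tick++;
        }
    }

    void simulateOneTick() {
        // advance game state by exactly TICK_SECONDS here
    }
}
```

With both sides ticking at the same fixed rate, packets can carry a small tick number instead of a millisecond timestamp.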

© Game Development or respective owner

Related posts about java

Related posts about networking