How do you determine the up/down latency of a web app?
Posted by Brodie on Stack Overflow
Published on 2010-05-20T05:30:48Z
I am trying to work out how to calculate the latency of requests from a web app (JavaScript) to a .NET web service.
Currently I am essentially trying to sync the client and server clocks, so that when a request hits the web service I can compare timestamps, and the offset would accurately show the 'up' latency.
The problem is that when you sync the clocks, you have to factor in the latency of that exchange as well. So currently I am timing the sync request (round trip) and dividing by 2, in an attempt to get the 'up' latency, and then adjusting the sync accordingly.
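Roughly speaking, the sync step I have now looks like the sketch below (the endpoint name and response shape are just placeholders, not my actual service):

```javascript
// Minimal sketch of the current sync approach: time a round trip to a
// hypothetical /servertime endpoint, assume half the RTT is the 'up' leg,
// and derive a client/server clock offset from that.
function estimateClockOffset(callback) {
    var t0 = Date.now();                      // client send time
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/servertime", true);     // placeholder: returns server time in ms
    xhr.onload = function () {
        var t1 = Date.now();                  // client receive time
        var serverTime = parseInt(xhr.responseText, 10);
        var rtt = t1 - t0;
        // Assume symmetric latency: the server replied roughly rtt/2 after t0.
        var offset = serverTime - (t0 + rtt / 2);   // offset = serverClock - clientClock
        callback({ offset: offset, rtt: rtt });
    };
    xhr.send();
}
```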
This works on the assumption that latency is symmetrical, which it isn't. Does anyone know a procedure that could determine specifically the up and down latency of a JS HTTP request to a .NET service? If it needs to involve multiple handshakes, that's fine; whatever is as accurate as possible.
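For reference, the per-request measurement I am aiming for would look something like this (again just a sketch with placeholder names; it assumes the service echoes back its own receive/send timestamps, and it inherits whatever error the symmetric-RTT assumption introduced into the offset):

```javascript
// Sketch: given a clock-offset estimate, split a request's latency into
// 'up' (client -> server) and 'down' (server -> client) legs.
// Assumes the service returns { received: <ms>, sent: <ms> } in server time.
function measureUpDown(offset, callback) {
    var clientSend = Date.now();
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/echo-timestamps", true);   // placeholder endpoint
    xhr.onload = function () {
        var clientReceive = Date.now();
        var server = JSON.parse(xhr.responseText);
        // Convert server timestamps into client time using the offset.
        var up = (server.received - offset) - clientSend;
        var down = clientReceive - (server.sent - offset);
        callback({ up: up, down: down });
    };
    xhr.send();
}
```

Any error in the offset estimate shifts 'up' and 'down' by the same amount in opposite directions, which is exactly the circularity I am stuck on.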
Thanks!!