How to measure the time taken by C# NetworkStream.Read?

Posted by publicENEMY on Stack Overflow, 2011-01-03

I want to measure the time taken for a client to receive data over TCP using C#.

I'm using NetworkStream.Read to read 100 megabits of data that were sent with NetworkStream.Write. I set the buffer to the same size as the data, so there is no buffer underrun problem. Generally it looks like this:

Stopwatch sw = new Stopwatch();
sw.Start();
stream.Read(bytes, 0, bytes.Length);
sw.Stop();

The problem is that the sender may not have actually sent the data yet while the stopwatch is already running. How can I accurately measure the time taken to receive the data? I did try using the elapsed time of the remote PC's stream.Write, but the time it takes to write is extremely small. By the way, is Stopwatch the most accurate tool for this task?
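
One idea I've been toying with is to only start the stopwatch once the first bytes have actually arrived, and to loop over Read until the whole payload is in, since a single Read call can return fewer bytes than requested. Something like the sketch below is what I have in mind (MeasureReceive and expectedLength are just placeholder names I made up), but I'm not sure it measures the right thing:

using System;
using System.Diagnostics;
using System.Net.Sockets;

static TimeSpan MeasureReceive(NetworkStream stream, int expectedLength)
{
    byte[] buffer = new byte[expectedLength];
    Stopwatch sw = new Stopwatch();

    int total = 0;
    while (total < expectedLength)
    {
        // Read may return fewer bytes than requested, so keep reading until done.
        int read = stream.Read(buffer, total, expectedLength - total);
        if (read == 0)
            break;          // remote side closed the connection
        if (total == 0)
            sw.Start();     // start timing once the first chunk has arrived
                            // (the first chunk itself is not measured)
        total += read;
    }
    sw.Stop();

    return sw.Elapsed;
}

On the precision question, I assume Stopwatch is about as good as it gets from managed code; Stopwatch.IsHighResolution and Stopwatch.Frequency report whether it is backed by the high-resolution performance counter and at what resolution.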

