How to measure the time taken by C# NetworkStream.Read?
- by publicENEMY
I want to measure the time it takes for a client to receive data over TCP using C#.
I'm using NetworkStream.Read to read 100 megabits of data that are sent with NetworkStream.Write. I set the buffer to the same size as the data, so there is no buffer underrun problem. Generally it looks like this:
Stopwatch sw = new Stopwatch();
sw.Start();
// Read blocks until some data is available, but a single call
// may return fewer bytes than requested.
int bytesRead = stream.Read(bytes, 0, bytes.Length);
sw.Stop();
The problem is that the stopwatch may already be running before the sender has actually sent any data, so the measurement includes time spent waiting rather than just the time spent receiving. How can I accurately measure the time taken to receive the data? I did try using the elapsed time of stream.Write on the remote PC, but the time it takes to write is extremely small.
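One idea I considered is to start the stopwatch only after the first chunk arrives and then loop until all the expected bytes have been read. A rough sketch (expectedLength is assumed to be known to the receiver in advance, and this still misses the transit time of the first chunk):

using System;
using System.Diagnostics;
using System.Net.Sockets;

static TimeSpan TimeReceive(NetworkStream stream, byte[] buffer, int expectedLength)
{
    Stopwatch sw = new Stopwatch();
    int total = 0;
    while (total < expectedLength)
    {
        // Read may return fewer bytes than requested, so keep looping.
        int n = stream.Read(buffer, total, expectedLength - total);
        if (n == 0)
            break; // remote side closed the connection
        if (total == 0)
            sw.Start(); // start timing once the first chunk has arrived
        total += n;
    }
    sw.Stop();
    return sw.Elapsed;
}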
By the way, is Stopwatch the most accurate tool for this task?
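From what I understand, Stopwatch uses the system's high-resolution performance counter when one is available, which can be checked like this:

using System;
using System.Diagnostics;

// Stopwatch exposes whether it is backed by a high-resolution counter
// and how many timer ticks it counts per second.
Console.WriteLine("High resolution: " + Stopwatch.IsHighResolution);
Console.WriteLine("Ticks per second: " + Stopwatch.Frequency);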