Algorithm performance
- by william007
I am testing an algorithm with different parameter settings on my computer, and I notice that the measured performance fluctuates between runs even for the same parameters.
For example, the first run takes 20 ms, the second 5 ms, and the third 4 ms, even though the algorithm should be doing the same work all three times.
I am using the Stopwatch class from the C# System.Diagnostics library to measure the time. Is there a better way to measure the performance that is not subject to these fluctuations?
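For reference, this is roughly how I am timing each run (a minimal sketch; `RunAlgorithm` and the parameter value stand in for my actual code):

```csharp
using System;
using System.Diagnostics;

class Benchmark
{
    // Placeholder for the algorithm under test; the real code
    // takes a parameter and does the actual work.
    static void RunAlgorithm(int parameter)
    {
        // ... algorithm body ...
    }

    static void Main()
    {
        var stopwatch = new Stopwatch();

        stopwatch.Start();
        RunAlgorithm(42); // example parameter value
        stopwatch.Stop();

        Console.WriteLine($"Elapsed: {stopwatch.ElapsedMilliseconds} ms");
    }
}
```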