Algorithm performance
Posted by william007 on Stack Overflow, 2012-11-24
I am testing an algorithm with different parameters on a computer, and I notice that the measured performance fluctuates between runs of the same parameter set. For example, the first run takes 20 ms, the second 5 ms, and the third 4 ms, even though the algorithm is doing the same work each time.
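A minimal sketch of the kind of single-run timing described here, assuming the algorithm under test is exposed as a hypothetical RunAlgorithm method (the method name and its parameter are placeholders, not part of the original post):

```csharp
using System;
using System.Diagnostics;

class SingleRunTiming
{
    // Placeholder for the actual algorithm being benchmarked.
    static void RunAlgorithm(int parameter)
    {
        // ... algorithm body ...
    }

    static void Main()
    {
        // Stopwatch (System.Diagnostics) is a high-resolution timer.
        var sw = Stopwatch.StartNew();
        RunAlgorithm(42);
        sw.Stop();
        Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);
    }
}
```

Timing a single call like this is sensitive to one-time costs and whatever else the machine happens to be doing at that moment, which matches the spread in the numbers above.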
I am using Stopwatch from the C# library (System.Diagnostics) to measure the time. Is there a better way to measure the performance that is not subject to these fluctuations?
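One common way to reduce run-to-run noise (an assumption about what a more stable harness could look like, not something stated in the original post) is to do a warm-up call first, time many repetitions, and report the median rather than a single reading. A hedged sketch, again using a placeholder RunAlgorithm:

```csharp
using System;
using System.Diagnostics;

class StableTiming
{
    // Placeholder for the actual algorithm being benchmarked.
    static void RunAlgorithm(int parameter)
    {
        // ... algorithm body ...
    }

    // Runs the action once as a warm-up, then times it several times
    // and returns the median elapsed time in milliseconds.
    static double MeasureMedianMs(Action action, int iterations = 10)
    {
        action(); // warm-up: absorbs one-time costs such as JIT compilation and cold caches

        var samples = new double[iterations];
        for (int i = 0; i < iterations; i++)
        {
            var sw = Stopwatch.StartNew();
            action();
            sw.Stop();
            samples[i] = sw.Elapsed.TotalMilliseconds;
        }

        Array.Sort(samples);
        return samples[iterations / 2]; // median is less sensitive to outliers than the mean
    }

    static void Main()
    {
        double ms = MeasureMedianMs(() => RunAlgorithm(42));
        Console.WriteLine("Median: {0:F2} ms", ms);
    }
}
```

The warm-up run is a plausible explanation for the slow 20 ms first measurement: JIT compilation and cache effects typically make the first call of a .NET method slower than the ones that follow.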