I made a class that helps me measure the execution time of any method, in ticks.
Basically, it runs the test method 100 times as a warm-up, forces a GC, and then records the time taken by another 100 runs of the method. Configuration: x64, Release build, Ctrl+F5 (run without debugger), VS2012/VS2010.
The results are as follows:
2,914 2,909 2,913 2,909 2,908
2,907 2,909 2,998 2,976 2,855
2,446 2,415 2,435 2,401 2,402
2,402 2,399 2,401 2,401 2,400
2,399 2,400 2,404 2,402 2,401
2,399 2,400 2,402 2,404 2,403
2,401 2,403 2,401 2,400 2,399
2,414 2,405 2,401 2,407 2,399
2,401 2,402 2,401 2,404 2,401
2,404 2,405 2,368 1,577 1,579
1,626 1,578 1,576 1,578 1,577
1,577 1,576 1,578 1,576 1,578
1,577 1,578 1,576 1,578 1,577
1,579 1,585 1,576 1,579 1,577
1,579 1,578 1,579 1,577 1,578
1,577 1,578 1,576 1,578 1,577
1,578 1,599 1,579 1,578 1,582
1,576 1,578 1,576 1,579 1,577
1,578 1,577 1,591 1,577 1,578
1,578 1,576 1,578 1,576 1,578
As you can see, there are three phases: the first at ~2,900 ticks, the second at ~2,400, and the third at ~1,580.
What might be the reason for this?
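(A side note on units: Stopwatch ticks are hardware-dependent and are not the same as TimeSpan ticks, so raw tick counts are only comparable on the same machine. A minimal conversion sketch, assuming you want the phases in wall-clock time; the 2900 value is just an example reading from the first phase:

long ticks = 2900;                                             // example: first-phase reading
double microseconds = ticks * 1000000.0 / Stopwatch.Frequency; // Frequency = Stopwatch ticks per second
Console.WriteLine("{0:N0} ticks = {1:N1} us", ticks, microseconds);
)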
The performance-test class code follows:
// Requires: using System; using System.Diagnostics;
public static void RunTests(Func<long> myTest)
{
    const int numTrials = 100;
    Stopwatch sw = new Stopwatch();
    double[] sample = new double[numTrials];

    // Print the checksum so the test's result stays observable
    // and can be verified against other implementations.
    Console.WriteLine("Checksum is {0:N0}", myTest());

    // Single timed run for a rough per-test estimate.
    sw.Start();
    myTest();
    sw.Stop();
    Console.WriteLine("Estimated time per test is {0:N0} ticks\n", sw.ElapsedTicks);

    // Warm-up: 100 untimed runs, then force a collection so GC work
    // does not land inside the measured loop.
    for (int i = 0; i < numTrials; i++)
    {
        myTest();
    }
    GC.Collect();

    string testName = myTest.Method.Name;
    Console.WriteLine("----> Starting benchmark {0}\n", testName);

    // Measured runs: time each of the 100 iterations individually.
    for (int i = 0; i < numTrials; i++)
    {
        sw.Restart();
        myTest();
        sw.Stop();
        sample[i] = sw.ElapsedTicks;
    }

    double testResult = DataSetAnalysis.Report(sample); // external summary-statistics helper

    // Print the samples five per row.
    for (int j = 0; j < numTrials; j += 5)
        Console.WriteLine("{0,8:N0} {1,8:N0} {2,8:N0} {3,8:N0} {4,8:N0}",
            sample[j], sample[j + 1], sample[j + 2], sample[j + 3], sample[j + 4]);

    Console.WriteLine("\n----> End of benchmark");
}
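
For completeness, a minimal usage sketch; SumToN is a hypothetical test method, shown only to illustrate the expected Func<long> shape (return a checksum so the result stays observable):

// Hypothetical test method with an arbitrary workload.
static long SumToN()
{
    long sum = 0;
    for (int i = 0; i < 1000000; i++)
        sum += i;
    return sum;
}

static void Main()
{
    RunTests(SumToN);   // method group converts to Func<long>
}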