C# vs C - Big performance difference
Posted by John on Stack Overflow
Published on 2009-03-26T16:22:40Z
I'm finding massive performance differences between similar code in C and C#.
The C code is:
#include <stdio.h>
#include <time.h>
#include <math.h>
int main(void)
{
    int i;
    double root;
    clock_t start = clock();

    for (i = 0; i <= 100000000; i++) {
        root = sqrt(i);
    }

    printf("Time elapsed: %f\n", ((double)clock() - start) / CLOCKS_PER_SEC);
    return 0;
}
And the C# (console app) is:
using System;
using System.Collections.Generic;
using System.Text;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            DateTime startTime = DateTime.Now;
            double root;

            for (int i = 0; i <= 100000000; i++)
            {
                root = Math.Sqrt(i);
            }

            TimeSpan runTime = DateTime.Now - startTime;
            Console.WriteLine("Time elapsed: " + Convert.ToString(runTime.TotalMilliseconds / 1000));
        }
    }
}
With the above code, the C# version completes in 0.328125 seconds (release build) while the C version takes 11.14 seconds to run.
The C is being compiled to a Windows executable using MinGW.
I've always been under the assumption that C/C++ was faster than, or at least comparable to, C#/.NET. What exactly is causing the C to run over 30 times slower?
EDIT: It does appear that the C# optimizer was removing the sqrt calculation because root was never used. I changed the assignment to root += and printed out the total at the end. I've also compiled the C using cl.exe with the /O2 flag set for maximum speed.
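For reference, a minimal sketch of what the modified C# program might look like; the accumulation into root and the extra WriteLine are assumptions based on the description above, not the exact code that produced the timings below:

using System;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            DateTime startTime = DateTime.Now;
            double root = 0;

            // Accumulating (root +=) and printing the total afterwards gives the
            // loop an observable result, so the optimizer can no longer discard it.
            for (int i = 0; i <= 100000000; i++)
            {
                root += Math.Sqrt(i);
            }

            TimeSpan runTime = DateTime.Now - startTime;
            Console.WriteLine("Total: " + root);
            Console.WriteLine("Time elapsed: " + runTime.TotalMilliseconds / 1000);
        }
    }
}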
The results are now 3.75 seconds for the C and 2.61 seconds for the C#.
The C is still taking longer, but this is acceptable.