What is faster: Java or C# (or good old C)?
- by Rexsung
I'm currently deciding on a platform to build a scientific computational product on: either C#, Java, or plain C with Intel's compiler on Core 2 Quad CPUs. The workload is mostly integer arithmetic.
My benchmarks so far show Java and C about on par with each other, with .NET/C# trailing by about 5%. However, a number of my coworkers claim that .NET, with the right optimizations, will beat both once the JIT has had enough time to do its work.
I always assumed the JIT would have done its job within a few minutes of the app starting (probably a few seconds in my case, since it's mostly tight loops), so I'm not sure whether to believe them.
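For reference, here's a minimal sketch of the warm-up-aware timing I've been using, with a hypothetical integer kernel standing in for the real workload; the idea is to let the JIT compile the hot path before the timed run begins:

```java
// Minimal sketch of a warm-up-aware microbenchmark. The kernel below is a
// hypothetical stand-in for the real workload: tight-loop integer arithmetic.
public class JitWarmupBench {

    // Arbitrary integer work; the modulus just keeps values bounded.
    static long kernel(int n) {
        long acc = 0;
        for (int i = 1; i <= n; i++) {
            acc += (long) i * i % 9973;
        }
        return acc;
    }

    public static void main(String[] args) {
        long sink = 0; // accumulate results so the JIT can't dead-code-eliminate the loops

        // Warm-up phase: run long enough for the JIT to profile and compile kernel().
        for (int i = 0; i < 20_000; i++) {
            sink += kernel(10_000);
        }

        // Timed phase: measured only after the hot path should be compiled.
        long start = System.nanoTime();
        for (int i = 0; i < 1_000; i++) {
            sink += kernel(10_000);
        }
        long elapsed = System.nanoTime() - start;

        System.out.printf("%.3f ms total (sink=%d)%n", elapsed / 1e6, sink);
    }
}
```

If the claimed .NET advantage only shows up after extended JIT optimization, a warm-up phase like this should be enough to surface it in the timed run.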
Can anyone shed any light on the situation? Would .NET beat Java? (Or am I best just sticking with C at this point?)
The code is highly multithreaded and data sets are several terabytes in size.
Haskell/Erlang etc. are not options in this case, as there is a significant quantity of existing legacy C code that will be ported to the new system, and porting C to Java/C# is much simpler than porting it to Haskell or Erlang. (Unless, of course, these provide a significant speedup.)
Edit: We are considering moving to C# or Java because either may, in theory, be faster. Every percent we can shave off our processing time saves us tens of thousands of dollars per year. At this point we are just trying to evaluate whether C, Java, or C# would be fastest.