Iteration speed of int vs long
- by jqno
I have the following two programs:
int N = Integer.MAX_VALUE; // I vary N; this is the largest value I tried
long startTime = System.currentTimeMillis();
for (int i = 0; i < N; i++); // empty body: the loop does no work
long endTime = System.currentTimeMillis();
System.out.println("Elapsed time: " + (endTime - startTime) + " msecs");
and
int N = Integer.MAX_VALUE; // same N as above
long startTime = System.currentTimeMillis();
for (long i = 0; i < N; i++); // identical, except the loop variable is a long
long endTime = System.currentTimeMillis();
System.out.println("Elapsed time: " + (endTime - startTime) + " msecs");
Note: the only difference between the two is the type of the loop variable (int vs. long).
When I run these, the first program consistently prints between 0 and 16 msecs, regardless of the value of N. The second takes a lot longer: for N == Integer.MAX_VALUE it runs in about 1800 msecs on my machine, and the run time appears to be more or less linear in N.
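For what it's worth, here's a minimal sketch of how the linearity can be checked, timing the long loop for several values of N (the class name and the particular N values are arbitrary choices of mine):

// Times the empty long loop for a few values of N; if the cost per
// iteration is constant, the printed times should grow linearly with N.
public class LinearityCheck {
    public static void main(String[] args) {
        long[] ns = { 1L << 28, 1L << 29, 1L << 30, Integer.MAX_VALUE };
        for (long n : ns) {
            long startTime = System.currentTimeMillis();
            for (long i = 0; i < n; i++); // empty body, long loop variable
            long endTime = System.currentTimeMillis();
            System.out.println("N = " + n + ": " + (endTime - startTime) + " msecs");
        }
    }
}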
So why is this?
I suppose the JIT compiler optimizes the int loop to death, and for good reason: it obviously doesn't do anything. But why doesn't it do the same for the long loop?
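If that's what happens, it should be visible in HotSpot's diagnostic output. For example (the class name IntLoop is hypothetical), -XX:+PrintCompilation logs which methods the JIT compiles, and with the hsdis disassembler plugin installed one can even inspect the generated machine code, where an eliminated loop would simply be absent:

java -XX:+PrintCompilation IntLoop
java -XX:+UnlockDiagnosticVMOptions -XX:+PrintAssembly IntLoop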
A colleague thought we might be measuring the JIT compiler doing its work in the long loop, but since the run time seems to be linear in N, this probably isn't the case: compilation is a one-time cost, so it should show up as a roughly constant overhead, not one that grows with N.
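A direct way to test that idea would be to run the long loop several times within one JVM (a sketch; the class name is mine):

// Runs the empty long loop repeatedly in the same JVM. If the time were
// dominated by JIT compilation, only the first run should be slow; a cost
// that is linear in N should show up in every run.
public class WarmupCheck {
    public static void main(String[] args) {
        long N = Integer.MAX_VALUE;
        for (int run = 1; run <= 5; run++) {
            long startTime = System.currentTimeMillis();
            for (long i = 0; i < N; i++); // empty body
            long endTime = System.currentTimeMillis();
            System.out.println("Run " + run + ": " + (endTime - startTime) + " msecs");
        }
    }
}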