single for-loop runtime explanation problem

Posted by owwyess on Programmers
Published on 2014-08-18
I am analyzing the running times of different for-loops, and as I learn more, I keep running into a problem I still haven't been able to explain. I have this exercise, called "How many stars are printed":

for (int i = N; i > 1; i = i/2) System.out.println("*");

The answers to pick from are: A: ~log N, B: ~N, C: ~N log N, D: ~0.5 N^2.

So the answer should be A, and I agree with that, but on the other hand: say N = 500. What would log N be then? It would be about 2.7 (base 10). So if we run the exercise above with N = 500, it would most definitely print more than 2.7 stars, wouldn't it? How are the two related?
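To check the actual numbers, here is a quick counting sketch (just the exercise loop with the println replaced by a counter, plus the two logarithms for comparison; the class and variable names are my own):

public class StarCount {
    public static void main(String[] args) {
        int N = 500;
        int stars = 0;
        // the exercise loop, counting iterations instead of printing stars
        for (int i = N; i > 1; i = i / 2) stars++;
        System.out.println(stars);                      // 8
        System.out.println(Math.log10(N));              // ~2.70 (log base 10)
        System.out.println(Math.log(N) / Math.log(2));  // ~8.97 (log base 2)
    }
}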

After all, it makes sense to say that if the for-loop looked like this:

for (int i = 0; i < N; i++) System.out.println("*");

it would print N stars.
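Counting the linear version the same way (again, just a quick check of my own) gives exactly N:

int N = 500;
int stars = 0;
for (int i = 0; i < N; i++) stars++;  // one star printed per iteration
System.out.println(stars);            // 500, i.e. exactly N stars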

I hope to find an explanation for this here; maybe I'm interpreting all of this wrong and thinking about it the wrong way. Thanks in advance.

