Average case running time of linear search algorithm
Posted by Brahadeesh on Stack Overflow
Published on 2011-02-26T06:53:35Z
Hi all. I am trying to derive the average-case running time of the deterministic linear search algorithm. The algorithm searches for an element x in an unsorted array A, examining A[1], A[2], A[3], ..., A[n] in order. It stops when it finds x, or else proceeds until it reaches the end of the array. I looked on Wikipedia, and the answer given there is (n+1)/(k+1), where k is the number of times x is present in the array. I approached it a different way and am getting a different answer. Can anyone please give me the correct proof and also let me know what's wrong with my method?
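To make the setup concrete, here is a minimal sketch of the search I have in mind (Python; the function name and the convention of returning the comparison count are just my own illustration):

    def linear_search_comparisons(A, x):
        """Scan A from left to right and return the number of comparisons made:
        the position of the first occurrence of x, or len(A) if x is absent."""
        for i, element in enumerate(A, start=1):
            if element == x:
                return i          # found x after i comparisons
        return len(A)             # scanned the whole array without finding x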
E(T) = 1*P(1) + 2*P(2) + 3*P(3) + ... + n*P(n), where P(i) is the probability that
the algorithm runs for time i (i.e., compares i elements).

P(i) = (n-i)C(k-1) * (n-k)! / n!

Here, (n-i)C(k-1) means (n-i) choose (k-1). If the algorithm stops at the i-th step,
the first occurrence of x is at position i, so the remaining k-1 copies of x must lie
in the last n-i positions; hence the factor (n-i)C(k-1). (n-k)! is the number of ways
of arranging the remaining non-x elements, and n! is the total number of ways of
arranging the n elements of the array.
I am not getting (n+1)/(k+1) on simplifying.
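As a sanity check (not a proof), a quick Monte Carlo estimate over uniformly random arrangements comes out very close to (n+1)/(k+1). The parameter choices n=10, k=3 and the trial count below are arbitrary, assumed just for illustration:

    import random

    def linear_search_comparisons(A, x):
        # Comparisons made by the scan: position of the first x, or len(A) if absent.
        for i, element in enumerate(A, start=1):
            if element == x:
                return i
        return len(A)

    def average_comparisons(n, k, trials=200000):
        # k copies of x (represented by 1) and n-k other elements (represented by 0),
        # shuffled uniformly at random for each trial.
        total = 0
        for _ in range(trials):
            A = [1] * k + [0] * (n - k)
            random.shuffle(A)
            total += linear_search_comparisons(A, 1)
        return total / trials

    n, k = 10, 3
    print(average_comparisons(n, k))  # empirical average, close to 2.75
    print((n + 1) / (k + 1))          # Wikipedia's closed form: 2.75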