Big-O for calculating all routes from GPS data
Posted by HH on Stack Overflow, 2010-05-25
A non-critical GPS module uses lists because the data has to stay modifiable: new routes are added, new distances are calculated, and comparisons run continuously. At least, so I thought, but my team member wrote something I find very hard to get into.
His pseudocode

    a[][] <- create mapModuleNearbyDotList array      // CPU: O(n)
    for (j = 1 to n)                                  // O(n log m)
        for (i = 1 to n)
            for (k = 1 to n)
                if (dot is nearby)
                    adj[i][j] = min(adj[i][j], adj[i][k] + adj[k][j]);
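For reference, the update adj[i][j] = min(adj[i][j], adj[i][k] + adj[k][j]) is the relaxation step of the Floyd-Warshall all-pairs shortest-path algorithm, which is where the O(n^3) bound comes from. In the standard formulation the intermediate vertex k must be the outermost loop; with k innermost, as above, a single pass does not in general produce all shortest paths. A minimal compilable sketch in C++ (the 4-vertex graph and the INF sentinel are illustrative assumptions, not his data):

    #include <algorithm>
    #include <cstdio>

    const int INF = 1000000000;   // sentinel for "no direct route"
    const int N = 4;              // illustrative graph size

    int adj[N][N] = {
        {0,   5,   INF, 10 },
        {INF, 0,   3,   INF},
        {INF, INF, 0,   1  },
        {INF, INF, INF, 0  },
    };

    int main() {
        // Floyd-Warshall: try every vertex k as an intermediate stop
        for (int k = 0; k < N; ++k)
            for (int i = 0; i < N; ++i)
                for (int j = 0; j < N; ++j)
                    if (adj[i][k] < INF && adj[k][j] < INF)   // avoid INF + INF overflow
                        adj[i][j] = std::min(adj[i][j], adj[i][k] + adj[k][j]);
        std::printf("shortest 0 -> 3: %d\n", adj[0][3]);      // prints 9 (0 -> 1 -> 2 -> 3)
        return 0;
    }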
His ideas
- Transformation of the lists into tables (see the conversion sketch after this list).
- His worst-case time complexity is O(n^3), where n is the number of elements in his so-called table.
- Exception to the last point for a "finite" structure: O(m log n), where n is the number of vertices and m is the number of neighbouring vertices.
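On the first idea: the probable reason for copying the route list into a table is that the triple loop needs constant-time access to adj[i][k] and adj[k][j], which a linked list cannot give. The conversion itself is a one-off O(n^2 + m) pass. A sketch of that step, assuming a hypothetical Edge record for one route segment (the struct and function names are mine, not his):

    #include <algorithm>
    #include <vector>

    struct Edge { int from, to, dist; };   // hypothetical record for one route segment

    const int INF = 1000000000;            // sentinel for "no direct route"

    // Build an n-by-n distance table from a modifiable edge list: O(n^2 + m).
    std::vector<std::vector<int> > toTable(int n, const std::vector<Edge>& edges) {
        std::vector<std::vector<int> > adj(n, std::vector<int>(n, INF));
        for (int i = 0; i < n; ++i)
            adj[i][i] = 0;                            // a vertex is 0 away from itself
        for (size_t e = 0; e < edges.size(); ++e)     // keep the shortest parallel edge
            adj[edges[e].from][edges[e].to] =
                std::min(adj[edges[e].from][edges[e].to], edges[e].dist);
        return adj;
    }

Rebuilding the table after every single list modification would indeed be wasteful; the usual compromise is to batch the changes and rebuild (and re-run the O(n^3) pass) only once per batch.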
Questions about his ideas
- Why waste resources transforming the constantly-modified lists into a table? Is it really faster?
- The only point where I agree to some extent, but I cannot understand the identical upper limit n on every for-loop; perhaps he assumed the structure to be circular?
- Why does the code take O(m log n) time on a "finite" structure? The term "finite" may be wrong; perhaps "explicit"? (See the Dijkstra sketch after this list.)
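On the third question: O(m log n) is not the running time of the code shown above. It is, however, the textbook bound for one single-source run of Dijkstra's algorithm with a binary heap over an adjacency list, roughly O((n + m) log n), which collapses to O(m log n) on a connected graph; perhaps that is what he meant by the "finite" (explicit?) structure. A sketch, assuming non-negative distances (the function and its parameters are illustrative):

    #include <functional>
    #include <queue>
    #include <utility>
    #include <vector>

    const int INF = 1000000000;      // sentinel for "unreachable"
    typedef std::pair<int, int> P;   // heap entries: (distance, vertex)

    // Dijkstra with a binary heap: O((n + m) log n).
    // adj[u] lists (neighbour, distance) pairs for vertex u.
    std::vector<int> dijkstra(int n, int src,
                              const std::vector<std::vector<std::pair<int, int> > >& adj) {
        std::vector<int> dist(n, INF);
        std::priority_queue<P, std::vector<P>, std::greater<P> > pq;   // min-heap by distance
        dist[src] = 0;
        pq.push(P(0, src));
        while (!pq.empty()) {
            int d = pq.top().first, u = pq.top().second;
            pq.pop();
            if (d > dist[u]) continue;                  // stale heap entry, skip (lazy deletion)
            for (size_t i = 0; i < adj[u].size(); ++i) {
                int v = adj[u][i].first, w = adj[u][i].second;
                if (dist[u] + w < dist[v]) {            // found a shorter route to v
                    dist[v] = dist[u] + w;
                    pq.push(P(dist[v], v));
                }
            }
        }
        return dist;
    }

Running it once from every vertex gives all pairs in O(n * m log n), which beats the O(n^3) table pass when the route graph is sparse; that trade-off may be the "exception" he had in mind.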