Big-O for calculating all routes from GPS data

Posted by HH on Stack Overflow, 2010-05-25.

Filed under: gps | big-o

A non-critical GPS module uses lists because the data must stay modifiable: new routes are added, new distances are calculated, and comparisons run continuously. At least that was my reasoning, but my team member wrote something I find very hard to follow.

His pseudocode:

int k = 0;
a[][] <- create mapModuleNearbyDotList array   // CPU O(n)
for (j = 1 to n)                               // O(n log(m))
   for (i = 1 to n)
      for (k = 1 to n)
         if (dot is nearby)
            adj[i][j] = min(adj[i][j], adj[i][k] + adj[k][j]);
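
The relaxation line adj[i][j] = min(adj[i][j], adj[i][k] + adj[k][j]) is the update step of the Floyd-Warshall all-pairs shortest-path algorithm, which is O(n^3) in the number of vertices; that is presumably where his O(n^3) claim comes from. For reference, here is a minimal C sketch of the textbook form (the names N, INF, and dist are mine, not his); note that the standard algorithm iterates the intermediate vertex k in the outermost loop, unlike the pseudocode above:

#include <limits.h>

#define N   4                 /* number of vertices, illustrative only */
#define INF (INT_MAX / 2)     /* "no edge"; halved so INF + INF cannot overflow */

/* Standard Floyd-Warshall: dist[i][j] holds the direct edge weight
   (or INF) on entry and the shortest-path cost on return.
   Worst case O(N^3) regardless of how dense the graph is. */
void floyd_warshall(int dist[N][N])
{
    for (int k = 0; k < N; k++)          /* intermediate vertex: outermost */
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                if (dist[i][k] + dist[k][j] < dist[i][j])
                    dist[i][j] = dist[i][k] + dist[k][j];
}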

His ideas

  1. Transform the lists into tables.
  2. His worst-case time complexity is O(n^3), where n is the number of elements in his so-called table.
  3. As an exception to the previous point, on a finite structure: O(m log(n)), where n is the number of vertices and m is the number of neighbouring vertices.
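
As an aside, O(m log(n)) happens to be the textbook bound for single-source shortest paths via Dijkstra's algorithm with a binary heap, which may be what he means by the finite-structure case: the n extract-min operations and at most m decrease-key operations each cost O(log n), for O((n + m) log n) in total, and that collapses to O(m log n) on a connected graph since m >= n - 1. That is a guess at his intent; nothing in the pseudocode above actually runs in that time.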

Questions about his ideas

  1. Why waste resources transforming constantly-modified lists into a table? Is it really faster? (See the conversion sketch after this list.)
  2. This is the only point where I agree to some extent, but I cannot understand why each for-loop has the same upper limit n -- perhaps he assumed the structure is circular?
  3. Why would the code take O(m log(n)) time on a finite structure? The term "finite" may be wrong -- perhaps he means "explicit"?
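
To put a number on question 1: rebuilding an adjacency matrix from adjacency lists costs O(n^2) just to initialize the table plus O(m) to copy the edges, and that price is paid again on every rebuild if the lists keep changing. A rough C sketch of the conversion (the struct layout and function name are my own illustration, not his code):

#include <stdlib.h>

/* Illustrative adjacency-list node; the field names are assumptions. */
struct edge { int to; int w; struct edge *next; };

/* Build an n x n adjacency matrix from per-vertex edge lists.
   Cost: O(n^2) to initialize + O(m) to copy edges -- paid again on
   every rebuild if the underlying lists are constantly modified. */
int **list_to_matrix(struct edge **lists, int n, int inf)
{
    int **adj = malloc(n * sizeof *adj);
    for (int i = 0; i < n; i++) {
        adj[i] = malloc(n * sizeof **adj);
        for (int j = 0; j < n; j++)
            adj[i][j] = (i == j) ? 0 : inf;   /* no edge yet */
        for (struct edge *e = lists[i]; e; e = e->next)
            adj[i][e->to] = e->w;             /* copy each edge once */
    }
    return adj;
}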

© Stack Overflow or respective owner
