Intro
Suppose you have a list of key/value pairs:
(0,a) (1,b) (2,c)
You have a function that inserts a new value between two existing pairs, and you need to give it a key that preserves the order:
(0,a) (0.5,z) (1,b) (2,c)
Here the new key was chosen as the average of the keys of the bounding pairs.
The problem is that your list may see millions of inserts. If these inserts all land close to each other, you may end up with keys such as 2^(-1000000), which are not easily storable in any standard or specialized number class.
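To make the failure concrete, here is a minimal Python sketch (assuming the naive midpoint scheme and IEEE-754 doubles) of the worst case, where every insert lands in the same gap:

```python
lo, hi = 0.0, 1.0
inserts = 0
while (lo + hi) / 2 not in (lo, hi):  # can we still make a distinct key?
    hi = (lo + hi) / 2                # always insert just above lo
    inserts += 1
print(inserts)  # 1074: after that, new midpoint "keys" collide with old ones
```

Switching to an exact number class avoids the collision, but only by letting the keys themselves grow, as shown under "My attempts" below.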
The problem
How can you design a system for generating keys that:
Compares correctly (larger/smaller than) against every other key in the list.
Takes up only O(log n) memory (where n is the number of items in the list). A small check harness for both requirements is sketched below.
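To pin the requirements down, here is a hypothetical harness; `make_key` is a placeholder for whatever scheme is proposed (it must return a key strictly between its two arguments), and the `repr` length is only a rough stand-in for the memory bound:

```python
def check(make_key, lo, hi, inserts=1000):
    keys = [lo, hi]
    for _ in range(inserts):
        k = make_key(keys[0], keys[1])  # worst case: always hit the same gap
        assert keys[0] < k < keys[1]    # requirement 1, locally
        keys.insert(1, k)
    assert keys == sorted(keys)         # requirement 1, against all keys
    # Requirement 2, roughly: the largest key should need ~O(log n) space.
    print("largest key:", max(len(repr(k)) for k in keys), "chars")

check(lambda a, b: (a + b) / 2, 0.0, 1.0)  # the naive scheme passes at n=1000
```

With `fractions.Fraction` endpoints the same call passes the ordering asserts too, but the printed key size grows linearly with `inserts`, which is exactly what rules the naive scheme out.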
My attempts
First I tried different number classes, like fractions and even polynomials, but I could always find examples where the key size grew linearly with the number of inserts.
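As one data point, the fraction attempt in the same worst case as above (a sketch with Python's `fractions.Fraction` standing in for an exact number class):

```python
from fractions import Fraction

lo, hi = Fraction(0), Fraction(1)
for _ in range(100_000):
    hi = (lo + hi) / 2  # every insert lands in the same gap

# After n inserts the newest key is 2^(-n): one extra bit per insert,
# so a single key needs O(n) storage instead of O(log n).
print(hi.denominator.bit_length())  # 100001 bits, roughly 12 KB for one key
```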
Then I thought about storing pointers to a number of other keys and recording the less-than/greater-than relationships between them, but that would always require at least O(sqrt(n)) memory and time per comparison.
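To show what I mean, here is a hypothetical minimal version of that idea (all names made up): each key records edges to the pair it was inserted between, and comparison searches those edges. Memory per key is small, but a comparison degrades to a graph search, so some trade-off between stored relations and comparison time seems unavoidable:

```python
from collections import deque

class Key:
    """Hypothetical node: records order edges to the pair it was inserted between."""
    def __init__(self, lo=None, hi=None):
        self.above = []            # keys known to be greater than self
        if hi is not None:
            self.above.append(hi)  # self < hi
        if lo is not None:
            lo.above.append(self)  # lo < self

def less_than(a, b):
    """Follow 'greater than' edges from a; reaching b proves a < b."""
    seen, queue = set(), deque(a.above)
    while queue:
        k = queue.popleft()
        if k is b:
            return True
        if k not in seen:
            seen.add(k)
            queue.extend(k.above)
    return False

first = Key()
last = Key(first)        # establishes first < last
mid = Key(first, last)   # insert between them
assert less_than(first, mid) and less_than(mid, last)
```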
Extra info: Ideally the algorithm shouldn't break when pairs are deleted from the list.