Why don't computers store the decimal part of a number as a second whole number?
- by SomeKittens
Computers have trouble storing fractional numbers whose denominator is anything other than a power of 2. This is because the first digit after the binary point is worth 1/2, the second 1/4 (that is, 1/(2^1) and 1/(2^2)), and so on.
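Here's a quick Python sketch of the kind of error I mean:

```python
# 0.1 has no exact binary representation, so repeated
# addition drifts away from the true value.
total = 0.0
for _ in range(10):
    total += 0.1

print(total)         # 0.9999999999999999, not 1.0
print(total == 1.0)  # False
```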
Why deal with all sorts of rounding errors when the computer could have just stored the decimal part of the number as another whole number (which would therefore be exact)?
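What I have in mind is something like this rough Python sketch (the type and field names are just made up for illustration):

```python
from typing import NamedTuple

class SplitDecimal(NamedTuple):
    """Hypothetical number format: both parts stored as whole numbers."""
    whole: int   # digits before the decimal point
    frac: int    # digits after the decimal point, as an integer
    places: int  # how many decimal places frac occupies, so 0.5 != 0.05

def to_float(d: SplitDecimal) -> float:
    # Only for display; the stored form itself is exact.
    return d.whole + d.frac / 10 ** d.places

price = SplitDecimal(whole=19, frac=99, places=2)  # exactly 19.99
print(to_float(price))  # 19.99
```

Under this scheme 0.1 would just be stored as (0, 1, 1), with no rounding at all.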
The only thing I can think of is dealing with repeating decimals (in base 10), but there could have been an edge-case solution for that (like the special value we currently have for infinity).
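For example, long division already shows why 1/3 breaks the scheme: its base-10 digits never terminate, so no finite whole number can hold them:

```python
# Long division of 1 by 3 in base 10: the remainder never reaches 0,
# so the fractional digits repeat forever and can't fit in any integer.
remainder = 1
digits = []
for _ in range(12):
    remainder *= 10
    digits.append(remainder // 3)  # next decimal digit of 1/3
    remainder %= 3

print(digits)     # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
print(remainder)  # still 1, so the digits would go on forever
```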