Why is it an issue that it takes 2 digits to represent the number 10 in decimal?

Posted by Crizly on Programmers
Published on 2014-05-27T16:39:38Z

So we use hexadecimal, which has the advantage of representing the values 10 to 15 with the single digits A-F. But why is it an issue that it takes 2 digits to represent the number 10 in decimal?

I was reading up about hexadecimal and I came across these two lines:

Base 16 suggests the digits 0 to 15, but the problem we face is that it requires 2 digits to represent 10 to 15. Hexadecimal solves this problem by using the letters A to F.

My question is, why do we care how many digits it takes to represent 15? What is the importance of this number, and how does denoting it with a single character have any value?
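To make the quoted claim concrete, here is a small illustration (in Python, chosen only for demonstration since the question is not language-specific) of how each of the values 10 to 15 becomes a single hexadecimal digit, while decimal needs two digits for the same values:

```python
# Each value from 10 to 15 maps to one hexadecimal digit (A-F),
# whereas decimal needs two digits for the same values.
for value in range(10, 16):
    print(f"{value:2d} (decimal, 2 digits) = {value:X} (hex, 1 digit)")

# Because every value 0-15 is exactly one hex digit, one hex digit
# always corresponds to 4 bits, and two hex digits to one byte:
print(int("FF", 16))  # 255, the largest value a single byte can hold
```

The one-digit-per-value property is what gives hex its fixed-width alignment with binary: if "10" through "15" took two characters each, you could no longer read a bit pattern off digit by digit.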
