How precise is the internal clock of a modern PC?

Posted by mafutrct on Stack Overflow
Published on 2010-04-09T12:16:54Z

Filed under: datetime | time-precision

I know that 10 years ago, typical clock precision equaled a system tick, which was in the range of 10-30 ms. Over the past years, precision has been increased in multiple steps. Nowadays, there are ways to measure time intervals in actual nanoseconds. However, the usual frameworks still return time with a precision of only around 15 ms.
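
To make that ~15 ms figure concrete, here is a minimal C# sketch of my own (assuming .NET on Windows; the class name is made up for this example). It spins on DateTime.UtcNow to observe its update step and prints the resolution that Stopwatch reports:

    using System;
    using System.Diagnostics;

    class ClockStepDemo
    {
        static void Main()
        {
            // Spin until DateTime.UtcNow changes twice, so the gap between the
            // first and second observed updates is one full update step.
            DateTime t0 = DateTime.UtcNow;
            DateTime t1 = t0;
            while (t1 == t0) t1 = DateTime.UtcNow;   // first observed update
            DateTime t2 = t1;
            while (t2 == t1) t2 = DateTime.UtcNow;   // second observed update
            Console.WriteLine($"DateTime.UtcNow update step: {(t2 - t1).TotalMilliseconds} ms");

            // Stopwatch reports whether a high-resolution counter backs it,
            // and how many counter ticks elapse per second.
            Console.WriteLine($"Stopwatch.IsHighResolution: {Stopwatch.IsHighResolution}");
            Console.WriteLine($"Stopwatch tick length: {1e9 / Stopwatch.Frequency:F0} ns");
        }
    }

On a default Windows configuration I would expect the first line to print roughly 15-16 ms (the 64 Hz system tick), while a Stopwatch tick is typically 100 ns or finer.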

My question is: which steps increased the precision, how is it possible to measure in nanoseconds, and why do we still often get precision that is coarser than a microsecond (for instance in .NET)?
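
For comparison, this is the kind of nanosecond-scale interval measurement I mean, again as a rough C#/.NET sketch with made-up names. Stopwatch.GetTimestamp is backed by QueryPerformanceCounter on Windows, so one tick is typically well under a microsecond:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class IntervalDemo
    {
        static void Main()
        {
            // Time a short interval with the high-resolution counter and
            // convert raw counter ticks to nanoseconds.
            long start = Stopwatch.GetTimestamp();
            Thread.Sleep(1);                       // placeholder work to measure
            long end = Stopwatch.GetTimestamp();

            double ns = (end - start) * 1e9 / Stopwatch.Frequency;
            Console.WriteLine($"Elapsed: {ns:F0} ns");

            // DateTime itself counts in 100 ns ticks, but its value is only
            // refreshed at the (much coarser) system tick, hence the ~15 ms.
            Console.WriteLine($"DateTime tick unit: {TimeSpan.FromTicks(1).TotalMilliseconds} ms");
        }
    }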

(Disclaimer: It strikes me as odd that this has not been asked before, so I guess I missed the existing question when I searched. If so, please close this one and point me to it, thanks. I believe this belongs on SO and not on any other SOFU site. I understand the difference between precision and accuracy.)
