How precise is the internal clock of a modern PC?
- by mafutrct
I know that 10 years ago, typical clock precision was equal to a system tick, which was in the range of 10-30 ms. Over the past years, precision has been increased in multiple steps. Nowadays, there are ways to measure time intervals in actual nanoseconds. However, the usual frameworks still return time with a precision of only around 15 ms.
My question is: which steps increased the precision, how is it possible to measure in nanoseconds, and why do we still often get precision far coarser than a microsecond (for instance, around 15 ms in .NET)?
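To illustrate the gap I mean, here is a rough sketch (plain C#, using only the standard DateTime and System.Diagnostics.Stopwatch APIs) of how one might compare how often DateTime.Now actually advances against the resolution reported by the high-resolution counter:

    using System;
    using System.Diagnostics;

    class ClockResolutionDemo
    {
        static void Main()
        {
            // DateTime.Now typically only advances once per system tick (~15.6 ms on Windows).
            // Count how many distinct values it takes during a ~100 ms busy loop.
            DateTime start = DateTime.Now;
            DateTime last = start;
            int changes = 0;
            while ((DateTime.Now - start).TotalMilliseconds < 100)
            {
                DateTime now = DateTime.Now;
                if (now != last)
                {
                    changes++;
                    last = now;
                }
            }
            Console.WriteLine($"DateTime.Now changed {changes} times in ~100 ms");

            // Stopwatch wraps the high-resolution performance counter when one is available.
            Console.WriteLine($"Stopwatch.IsHighResolution: {Stopwatch.IsHighResolution}");
            Console.WriteLine($"Stopwatch.Frequency: {Stopwatch.Frequency} ticks/s");
            Console.WriteLine($"Resolution: {1e9 / Stopwatch.Frequency:F1} ns per tick");
        }
    }

The first number comes out tiny (a handful of changes per 100 ms), while Stopwatch.Frequency suggests a resolution well below a microsecond, which is exactly the discrepancy I am asking about.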
(Disclaimer: It strikes me as odd that this was not asked before, so I guess I missed that question when I searched. If so, please close this and point me to the existing question, thanks. I believe this belongs on SO and not on any other SOFU site. I understand the difference between precision and accuracy.)