How often do CPUs make calculation errors?

Posted by veryfoolish on Programmers
Published on 2011-01-06T03:15:49Z

Filed under: theory | cpu

In Dijkstra's Notes on Structured Programming he talks a lot about the provability of computer programs as abstract entities. As a corollary, he remarks that testing isn't enough. For example, he points out that it would be impossible to test a multiplication function f(x,y) = x*y across the entire ranges of x and y. My question concerns his miscellaneous remarks on "lousy hardware". I know the essay was written in the 1970s, when computer hardware was less reliable, but computers still aren't perfect, so they must make calculation mistakes sometimes. Does anybody know how often this happens, or whether there are any statistics on it?
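To make Dijkstra's point concrete, here is a back-of-the-envelope sketch (my own numbers, not from the essay) of how long exhaustively testing a multiplier would take, assuming 32-bit operands and a generous hypothetical throughput of 10^9 tests per second:

```python
# Why exhaustive testing of f(x, y) = x * y is hopeless, even for 32-bit inputs.
# The throughput figure below is an assumption for illustration.
pairs = (2**32) ** 2              # number of distinct (x, y) input pairs = 2^64
tests_per_second = 10**9          # hypothetical test rig speed
seconds_per_year = 60 * 60 * 24 * 365
years = pairs / tests_per_second / seconds_per_year
print(f"{pairs:.3e} pairs -> roughly {years:.0f} years at 1e9 tests/s")
```

That works out to centuries of testing for 32-bit multiplication alone; 64-bit operands make the exponent twice as large again, which is why Dijkstra argues for proof over testing.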

© Programmers or respective owner
