"Teach" a computer how to do addition?
Posted
by ffar
on Stack Overflow
Published on 2009-09-26T12:04:20Z
The problem is to teach a computer to do addition. As input, the computer has knowledge of the numbers: it "knows" that after 1 comes 2, after 2 comes 3, and so on. With that data the computer can easily get the next number. The computer is also given, as input, the axioms x+0=x and x+(y+1)=(x+1)+y, which let it do addition. For example, to add 5 and 3, the computer does the following: 5+3 = 5+(2+1) = (5+1)+2 = 6+2 = 6+(1+1) = (6+1)+1 = 7+1 = 8. But adding numbers this way takes too long. The problem is to develop a program that can improve this method of addition using the rules of mathematics and logic. The resulting addition must execute in O(log(N)) time, not O(N) time, where N is the magnitude of the numbers being added.
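As a concrete illustration, the successor-based procedure described above can be sketched in Python. This is a minimal sketch, not code from the question; the names `succ` and `add` are my own, and the successor operation is modeled with ordinary integer increment:

```python
def succ(n):
    """Successor: the only arithmetic step the computer 'knows'
    (after n comes n+1)."""
    return n + 1

def add(x, y):
    """Addition using only the two axioms from the question:
         x + 0     = x
         x + (y+1) = (x+1) + y
    Each loop iteration applies the second axiom once, so the
    whole computation takes O(y) successor steps."""
    while y != 0:
        x = succ(x)  # x + (y+1) = (x+1) + y
        y = y - 1
    return x

print(add(5, 3))  # follows the same chain as 5+3 = ... = 7+1 = 8
```

Tracing `add(5, 3)` reproduces exactly the chain in the question, which is why the running time is proportional to the magnitude of the second argument.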
Would such a program have any scientific value? Can any program do such things?
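For comparison, the O(log(N)) target is what ordinary positional (binary) addition already achieves: the number of digits of N is about log(N), and schoolbook addition does constant work per digit. A hedged sketch, assuming the improved program is allowed to operate on binary digits rather than successors (`add_binary` is an illustrative name, not from the question):

```python
def add_binary(x, y):
    """Schoolbook binary addition with a ripple carry.
    The loop runs once per bit, i.e. O(log N) iterations
    where N = max(x, y), versus O(N) for the successor method."""
    result = 0
    carry = 0
    shift = 0
    while x or y or carry:
        bx, by = x & 1, y & 1          # current bits of each operand
        s = bx ^ by ^ carry            # sum bit
        carry = (bx & by) | (bx & carry) | (by & carry)  # carry out
        result |= s << shift
        shift += 1
        x >>= 1
        y >>= 1
    return result

print(add_binary(5, 3))  # 8, in 4 bit-steps instead of 3 successor steps
```

Whether a program could *discover* this representation shift on its own, rather than have it built in, is exactly the kind of question studied in automated reasoning and term rewriting.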
© Stack Overflow or respective owner