Looking for a clear and concise web page explaining why the lower bits of random numbers are usually not random
- by Hamish Grubijan
I am putting together an internal "every developer should know" wiki page.
I saw many discussions regarding rand() % N, but not a single web page that explains it all.
For instance, I am curious whether this problem is specific to C and Linux, or whether it also applies to Windows, C++, Java, .NET, Python, and Perl.
Please help me get to the bottom of this. Also, just how non-random do the numbers get? Thank you!
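To make the question concrete, here is a toy sketch of the effect I mean. It is not the actual rand() of any particular libc; it is just a textbook linear congruential generator with a power-of-two modulus (the multiplier and increment are the classic ANSI C example values), which is the kind of generator where the low-order bits are weakest:

    /* Toy LCG with modulus 2^31 (NOT any real libc's rand()),
       showing how the lowest bit of the output behaves. */
    #include <stdio.h>

    static unsigned long seed = 1;

    static unsigned bad_rand(void)
    {
        /* Textbook ANSI C example constants; modulus is 2^31. */
        seed = (seed * 1103515245UL + 12345UL) & 0x7fffffffUL;
        return (unsigned)seed;
    }

    int main(void)
    {
        /* bad_rand() % 2: the lowest bit simply alternates 0,1,0,1,...
           because the LCG modulus is a power of two. */
        for (int i = 0; i < 16; i++)
            printf("%u", bad_rand() % 2);
        printf("\n");   /* prints 0101010101010101 */
        return 0;
    }

Is this alternating-low-bit behavior the whole story, or does the problem run deeper than that?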