What is the procedure for stopping robots and malicious scanners that slow down a site?
Posted by zsharp on Stack Overflow
Published on 2010-03-30T01:03:44Z
web-development
What should I do to prevent users from running scanners or auto-posting robots against my site that slow down the site's processing?
Is it sufficient to timestamp each post a user makes and enforce a posting delay? How long should the interval be?
What else can I do besides the above and CAPTCHAs on form posts?
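The timestamp-and-delay idea mentioned above can be sketched as a minimal in-memory rate limiter. This is only an illustration, not a production implementation: the class name, the 30-second interval, and the per-user dictionary are all assumptions, and a real site would persist state (e.g. in a cache or database) and likely key on IP address as well as user ID.

```python
import time


class PostRateLimiter:
    """Reject posts made sooner than min_interval seconds after a
    user's last accepted post (hypothetical sketch)."""

    def __init__(self, min_interval=30.0):
        self.min_interval = min_interval
        self.last_post = {}  # user_id -> timestamp of last accepted post

    def allow(self, user_id, now=None):
        """Return True and record the post if enough time has passed,
        otherwise return False without updating the timestamp."""
        now = time.time() if now is None else now
        last = self.last_post.get(user_id)
        if last is not None and now - last < self.min_interval:
            return False  # too soon since the last post: reject
        self.last_post[user_id] = now
        return True


limiter = PostRateLimiter(min_interval=30.0)
print(limiter.allow("alice", now=100.0))  # True: first post always allowed
print(limiter.allow("alice", now=110.0))  # False: only 10 s elapsed
print(limiter.allow("alice", now=140.0))  # True: 40 s elapsed
```

A fixed delay like this stops naive auto-posters but not slow, distributed scanners, which is why it is usually combined with other measures (CAPTCHAs, IP-based throttling at the web server, robots.txt for well-behaved crawlers).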
thanks
© Stack Overflow or respective owner