mod_evasive not working properly on Ubuntu 10.04

Posted by Joe Hopfgartner on Server Fault
Published on 2012-09-01T18:40:02Z Indexed on 2012/09/01 21:40 UTC


I have an Ubuntu 10.04 server where I installed mod_evasive using apt-get install libapache2-mod-evasive.

I have already tried several configurations; the result stays the same.

The blocking does work, but only randomly.

I tried low limits with long blocking periods as well as short ones.

The behaviour I expect is that I can request pages until either the page or the site limit is reached within the given interval. After that, I expect to be blocked until I have made no further request for the length of the blocking period.
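
To make the expected semantics concrete, here is a minimal Python sketch of that counting logic (per-URI hit counting within an interval, plus a blocking period that is refreshed by further requests). This is an illustration of the behaviour described above, not mod_evasive's actual implementation; the class and method names are hypothetical, and the site-wide counter is omitted for brevity.

```python
class EvasiveCounter:
    """Sketch of the expected per-page rate-limit semantics.

    Hypothetical illustration only -- NOT mod_evasive's real code.
    Mirrors DOSPageCount / DOSPageInterval / DOSBlockingPeriod.
    """

    def __init__(self, page_count=10, page_interval=10, blocking_period=120):
        self.page_count = page_count          # DOSPageCount
        self.page_interval = page_interval    # DOSPageInterval (seconds)
        self.blocking_period = blocking_period  # DOSBlockingPeriod (seconds)
        self.hits = {}      # (ip, uri) -> (window_start, count)
        self.blocked = {}   # ip -> time of the most recent blocked request

    def allow(self, ip, uri, now):
        """Return True if the request should get a 200, False for a 403."""
        # While blocked, every request is refused and refreshes the timer.
        if ip in self.blocked:
            if now - self.blocked[ip] < self.blocking_period:
                self.blocked[ip] = now
                return False
            del self.blocked[ip]

        # Count hits to this URI within the current interval.
        start, count = self.hits.get((ip, uri), (now, 0))
        if now - start >= self.page_interval:
            start, count = now, 0  # interval expired, start a new window
        count += 1
        self.hits[(ip, uri)] = (start, count)

        if count > self.page_count:
            self.blocked[ip] = now
            return False
        return True
```

Under these semantics, once a client is blocked it should see nothing but 403s until it stays quiet for the whole blocking period, which is exactly what the siege output below does not show.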

However, the actual behaviour is that I can request pages and, after a while, I get random 403 blocks; the percentage of blocked requests rises and falls, but they are very scattered.

This is output from siege, to give you an idea:

HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.11 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.08 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 403   0.09 secs:     242 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.09 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.10 secs:      75 bytes ==> /robots.txt
HTTP/1.1 200   0.08 secs:      75 bytes ==> /robots.txt

The exact limits in place during this test run were:

DOSHashTableSize 3097
DOSPageCount 10
DOSSiteCount 100
DOSPageInterval 10
DOSSiteInterval 10
DOSBlockingPeriod 120
DOSLogDir /var/log/mod_evasive
DOSEmailNotify ***@gmail.com
DOSWhitelist 127.0.0.1
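
For anyone who wants to reproduce this without siege, a short Python script along these lines fires sequential requests at the same URL and prints the status pattern, so the scattered 403s are easy to spot. The URL is an assumption; adjust it to your vhost. (Note that with DOSWhitelist 127.0.0.1 in place, you would need to run it from a non-whitelisted address to see any blocking.)

```python
#!/usr/bin/env python3
"""Fire sequential requests and print the status codes, siege-style.

The target URL below is a placeholder -- point it at your own vhost.
"""
import urllib.error
import urllib.request

URL = "http://localhost/robots.txt"  # assumption: adjust to your server


def probe(url, n=30):
    """Request `url` n times in a row and return the list of status codes."""
    statuses = []
    for _ in range(n):
        try:
            with urllib.request.urlopen(url) as resp:
                statuses.append(resp.status)
        except urllib.error.HTTPError as err:
            statuses.append(err.code)  # mod_evasive blocks with a 403
    return statuses


if __name__ == "__main__":
    print(" ".join(str(s) for s in probe(URL)))
```

With a correctly working mod_evasive one would expect a run of 200s, then an unbroken run of 403s once the limit is hit, rather than the interleaved pattern above.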

So I would expect to be blocked for at least 120 seconds after being blocked once.

Any ideas about this?

I also tried adding my configuration in different places (vhost, server config, directory context), and with or without the IfModule directive...

This doesn't change anything.

© Server Fault or respective owner
