How to protect SHTML pages from crawlers/spiders/scrapers?
- by Adam Lynch
I have A LOT of SHTML pages I want to protect from crawlers, spiders & scrapers.
I understand the limitations of SSIs. Feel free to suggest an implementation of the following in conjunction with any technology (or technologies) you wish:
The idea is that if you request too many pages too fast, you're added to a blacklist for 24 hrs and shown a…
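Just to make the idea concrete, here is a minimal sketch of what I mean, assuming an IP-based sliding window; the thresholds, class name, and method are hypothetical, and SSI can't do this on its own, so it would have to live in whatever server-side layer (CGI script, proxy, etc.) sits in front of the SHTML pages:

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds -- tune to your own traffic.
MAX_REQUESTS = 10                 # more than this many requests...
WINDOW_SECONDS = 10               # ...within this many seconds triggers the blacklist
BLACKLIST_SECONDS = 24 * 60 * 60  # 24 hr ban

class RateLimiter:
    def __init__(self):
        self._hits = defaultdict(deque)  # ip -> timestamps of recent requests
        self._banned_until = {}          # ip -> unix time the ban expires

    def is_blocked(self, ip):
        """Record one request from `ip` and return True if it should be blocked."""
        now = time.time()

        # Still serving an existing 24 hr ban?
        if self._banned_until.get(ip, 0) > now:
            return True

        # Record this hit and drop timestamps that fell out of the sliding window.
        hits = self._hits[ip]
        hits.append(now)
        while hits and hits[0] < now - WINDOW_SECONDS:
            hits.popleft()

        # Too many pages too fast -> add to the blacklist for 24 hrs.
        if len(hits) > MAX_REQUESTS:
            self._banned_until[ip] = now + BLACKLIST_SECONDS
            hits.clear()
            return True

        return False
```

The fronting layer would call something like `limiter.is_blocked(remote_ip)` on each request and serve a 403 or a static "you've been blocked" page when it returns True. This keeps state in memory only, so a real setup would presumably need to persist it somewhere shared, which is part of what I'm asking about.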