robots.txt file with more restrictive rules for certain user agents
- by Carson63000
Hi,
I'm a bit vague on the precise syntax of robots.txt, but what I'm trying to achieve is:
- Tell all user agents not to crawl certain pages
- Tell certain user agents not to crawl anything
(basically, some pages with enormous amounts of data should never be crawled; and some voracious but useless search engines, e.g. Cuil, should never crawl…
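Here's a sketch of what I think the file should look like (the paths are just placeholders, and I'm assuming Cuil's crawler identifies itself as "Twiceler" — corrections welcome):

```
# Rules for all crawlers: skip the huge data pages
# (paths below are made up for illustration)
User-agent: *
Disallow: /huge-data-page.html
Disallow: /big-reports/

# Rules for Cuil's crawler: block everything
User-agent: Twiceler
Disallow: /
```

One thing I'm unsure about: my understanding is that a crawler obeys only the most specific `User-agent` group that matches it and ignores the `*` group entirely, so the named group would need to repeat any rules from `*` that should still apply — but I'd appreciate confirmation that this is how the precedence actually works.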