Robots.txt help
Posted by Kyle R on Stack Overflow, 2011-01-13.
Tags: robots.txt
Google has just thrown up thousands of duplicate-content errors for the link tracker I am using.
I want Google and any other search engine to stop visiting the pages on my link tracker.
The pages I want to disallow are:
http://www.site.com/page1.html
http://www.site.com/page2.html
How should I write my robots.txt so that all robots skip these URLs when they crawl my site?
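A minimal robots.txt sketch, assuming the file is placed at the site root (http://www.site.com/robots.txt) and that the two pages above are the only ones to block:

    # Applies to every crawler that honours robots.txt
    User-agent: *
    # Block each tracker page by its path relative to the site root
    Disallow: /page1.html
    Disallow: /page2.html

Disallow matches path prefixes, so if the tracker pages shared a common directory (say, a hypothetical /tracker/), a single line like Disallow: /tracker/ would cover all of them. Keep in mind that robots.txt is advisory: well-behaved crawlers such as Googlebot honour it, but it is not access control.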