Robots.txt help
- by Kyle R
Google has just thrown up thousands of duplicate-content errors for the link tracker I am using.
I want Google and any other search engines to stop visiting the link tracker pages. The pages I want to disallow to these robots are:
http://www.site.com/page1.html
http://www.site.com/page2.html
How would I write my robots.txt so that all robots skip these pages when crawling my site?
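
A minimal robots.txt sketch for this case, assuming the file is served from the site root (http://www.site.com/robots.txt) and the two pages live at the paths shown above, might look like:

```
User-agent: *
Disallow: /page1.html
Disallow: /page2.html
```

`User-agent: *` applies the rules to all compliant crawlers, and each `Disallow` line is a URL-path prefix match. Note that robots.txt only stops well-behaved crawlers from fetching the pages; if the URLs are linked from elsewhere, Google can still list them without content. To keep them out of the index entirely, a `<meta name="robots" content="noindex">` tag on each page is the usual alternative.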