How to create a robots.txt for a domain that contains international websites in subfolders?
- by aaandre
Hi, I am working on a site that has the following structure:
site.com/us - US version
site.com/uk - UK version
site.com/jp - Japanese version
etc.
I would like to create a robots.txt that points each country-specific search engine to its localized sitemap page and has it exclude everything else from its local listings.
So, google.com (US) will index ONLY site.com/us and take site.com/us/sitemap.html into consideration,
google.co.uk will index only site.com/uk and take site.com/uk/sitemap.html into consideration,
and the same goes for the rest of the search engines, including Yahoo, Bing, etc.
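For concreteness, the closest I can get on my own is a single robots.txt at the domain root with one Sitemap line per locale, roughly like the sketch below (the /sitemap.xml paths are placeholders; as far as I know the Sitemap directive takes absolute URLs and expects XML or plain-text sitemaps rather than HTML pages like my sitemap.html):

# robots.txt at site.com/robots.txt (a sketch, with placeholder sitemap paths)
User-agent: *
Allow: /

# One Sitemap line per locale; URLs must be absolute.
Sitemap: https://site.com/us/sitemap.xml
Sitemap: https://site.com/uk/sitemap.xml
Sitemap: https://site.com/jp/sitemap.xml

This at least advertises each locale's sitemap, but robots.txt applies to the whole host rather than to a particular country's search engine, so I don't see how to make google.com and google.co.uk behave differently from here, which is exactly where I'm stuck.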
Any idea on how to achieve this?
Thank you!