URL blocked in robots.txt but still showing up on Google search [closed]
- by Ahmad Alfy
Possible Duplicate:
Why do Google search results include pages disallowed in robots.txt?
In my robots.txt I am disallowing a lot of URLs. Google Webmaster Tools says there are more than 750 blocked URLs. The problem is that these URLs are still showing up in Google search results.
For example I have the following rule:
Disallow: /entity/child-health/
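Simplified, the relevant part of the file is structured like this (the catch-all User-agent line is just for illustration, and the other Disallow rules are omitted):
User-agent: *
Disallow: /entity/child-health/
# ... plus the other rules that account for the 750+ blocked URLs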
But when I search for some-keyword + child health, the following URL shows up:
http://www.sitename.com/entity/child-health/
Am I doing anything wrong? Is it possible for a URL to be blocked by robots.txt and still show up in search results?