How to resolve "Google can't find your site's robots.txt" error?
- by Manivasagam
I recently noticed a "Google can't find your site's robots.txt" message under crawl errors. When I tried Fetch as Google, I got the result "SUCCESS", but when I looked at the crawl errors again, it still showed "Google can't find your site's robots.txt".
What can I do to resolve this issue? Before it arose, my site was indexed within a few minutes, but now it takes much longer to show up in Google's search results.
When I access http://mydomain.com/robots.txt, it shows the following:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Blocked URLs shows 0, and there are no other errors.
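For what it's worth, here is a minimal sketch of how I double-checked from my end that the file is reachable and served with an HTTP 200 status (mydomain.com is a placeholder for my actual domain, and the Googlebot user-agent string is only there to mimic how a crawler would request it):

import urllib.request

# Placeholder URL -- substitute the real domain here.
ROBOTS_URL = "http://mydomain.com/robots.txt"

# Request the file the way a crawler would, identifying as Googlebot.
req = urllib.request.Request(
    ROBOTS_URL,
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print("Status:", resp.status)  # 200 means the file is reachable
    print("Content-Type:", resp.headers.get("Content-Type"))  # should be text/plain
    print(resp.read().decode("utf-8"))  # the rules a crawler actually sees

Running this, I get a 200 status and the rules shown above, so the file itself seems to be served correctly.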
Is there anything else I need to change, or what else could be causing this?
Any help would be appreciated.