Fix 403 errors in Google Webmaster Tools
- by Justin
Hi Team,
I have a domain that has "fallen off a cliff" for searches in Google. Searches that used to put it in positions 1-4 no longer show it on page 1. The same searches in Bing still show it in its typical positions (top 5 results).
In reviewing Google Webmaster Tools, I am seeing two problems:
1. The Sitemap is reporting two errors:
- General HTTP error: HTTP 403 error (Forbidden)
- URLs not accessible
However, the URL Google reports as "not accessible" is accessible. I can click the link Google provides and it works fine.
2. There are 6,000 crawl errors of type 403. Again, most of the pages reported as 403 are accessible in my browser (I tried several browsers). About half are dated January, the other half November.
Some things I have already checked:
- There are no IP-specific firewall rules on ports 80 or 443 that could be blocking Googlebot.
- Using the User Agent Switcher add-on for Firefox, I confirmed that the pages load when the user agent is set to Googlebot (see the sketch after this list for a scripted version of the same check).
- Spot-checking by hand, I can confirm that most of the pages reported as 403 are accessible.
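For anyone who wants to reproduce the check, here is roughly the script I used. It's a minimal sketch: the sitemap URL is a stand-in for my real one, and it only tests from my own IP, so it can't rule out a block that targets Google's crawler IP ranges specifically. It pulls every <loc> out of the sitemap and requests each URL twice, once with a generic user agent and once with Googlebot's, flagging anything that isn't a plain 200:

    import urllib.request
    import urllib.error
    import xml.etree.ElementTree as ET

    # Stand-in sitemap URL; substitute the real one.
    SITEMAP_URL = "https://thedomain.com/sitemap.xml"
    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    def fetch_status(url, user_agent):
        # Return the HTTP status code seen when requesting url with user_agent.
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                return resp.status
        except urllib.error.HTTPError as err:
            return err.code  # a 403 lands here instead of aborting the script

    # Fetch the sitemap and collect every <loc> entry.
    req = urllib.request.Request(SITEMAP_URL,
                                 headers={"User-Agent": GOOGLEBOT_UA})
    with urllib.request.urlopen(req, timeout=30) as resp:
        tree = ET.fromstring(resp.read())
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

    # Request each URL with a browser-ish UA and with Googlebot's UA;
    # print anything that doesn't come back 200 both ways.
    for url in urls:
        browser = fetch_status(url, "Mozilla/5.0")
        bot = fetch_status(url, GOOGLEBOT_UA)
        if browser != 200 or bot != 200:
            print(f"{url}: browser={browser} googlebot={bot}")

Running this against my sitemap, nearly everything comes back 200 for both user agents, which matches what I see in the browser.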
A search for just "site:thedomain.com" does confirm there are over 9,000 pages in the index, but most searches don't return the site.
I believe the 403 issues are the cause of the fall in search rankings, but I can't find any information online about how to address this.
Any ideas?
jpe