How to interpret number of URL errors in Google webmaster tools
Posted by user359650 on Pro Webmasters
Published on 2012-03-19T11:27:24Z
Indexed on 2012/03/19 18:15 UTC
Google recently made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html
One thing I could not figure out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually). Here is what we're getting from the Crawl errors:
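As an aside, one quick way to enumerate pages that were migrated without a redirect rule is a simple set difference between the old site's paths and the paths already covered by a redirect. This is purely an illustrative sketch; the URL lists below are hypothetical placeholders, not our actual site structure:

```python
# Hypothetical example: find old URLs that still lack a 301 redirect rule.
old_urls = {"/about", "/pricing", "/blog/post-1"}  # pages that existed on the old site
redirected = {"/about"}                            # paths already covered by a redirect rule

missing = sorted(old_urls - redirected)            # set difference, sorted for stable output
print(missing)  # → ['/blog/post-1', '/pricing']
```

Each path in `missing` is a candidate for a 404 (and hence a crawl error) until a redirect is added for it.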
What I don't know is whether the number of errors is cumulative over time (i.e. if Google bots crawl your website on 2 different days and find 1 separate issue on each day, whether they will report 1 error for each day, or 1 for the 1st day and 2 for the 2nd).
Based on the Crawl stats, we can see that the number of requests made by Google bots doesn't increase:
Therefore I believe the number of errors reported is cumulative: an error detected on one day is carried over and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually Mark as fixed the error). After all, if Google doesn't make more requests to a website, there is no way it can check new pages and re-check old pages at the same time.
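To make the two readings concrete, here is a purely illustrative counting model (my assumption about the cumulative interpretation, not Google's documented behaviour). Under the cumulative reading, an unfixed error keeps contributing to every subsequent day's total:

```python
def reported_errors(daily_new, daily_fixed):
    """Cumulative model: each day's report = previous total + new errors - fixed errors."""
    total, reports = 0, []
    for new, fixed in zip(daily_new, daily_fixed):
        total += new - fixed
        reports.append(total)
    return reports

# Google finds 1 new issue on each of 2 days and nothing gets fixed:
print(reported_errors([1, 1], [0, 0]))  # → [1, 2]  (cumulative reading)
# Under the per-day reading, the report would instead be [1, 1].
```

If Webmaster Tools used a per-day count instead, the graph would track only each day's newly discovered errors, so a flat crawl rate with a rising error count is what points me toward the cumulative interpretation.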
Q: Am I interpreting the number of errors correctly?
© Pro Webmasters or respective owner