Thousands of 404 errors in Google Webmaster Tools
- by atticae
Because of an old bug in our ASP.Net application, introduced by my predecessor and undiscovered for a long time, thousands of incorrect URLs were created dynamically. Normal users did not notice it, but Google followed these links and crawled its way through the incorrect URLs, generating more and more broken links.
To make it clearer, consider the URL
example.com/folder
should create the link
example.com/folder/subfolder
but was creating
example.com/subfolder
instead. Because of bad URL rewriting, these URLs were accepted, and by default the index page was served for any unknown URL, generating more and more links like this:
example.com/subfolder/subfolder/....
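A common way this kind of bug arises is relative-link resolution: without a trailing slash, `folder` is treated as a file, so a relative link replaces it instead of nesting under it. A minimal sketch using Python's `urljoin`, assuming (hypothetically) the app emitted relative hrefs:

```python
from urllib.parse import urljoin

# Without a trailing slash, "folder" is treated as the final file
# segment, so a relative link replaces it rather than nesting under it.
print(urljoin("http://example.com/folder", "subfolder"))
# → http://example.com/subfolder

# With the trailing slash, the relative link nests as intended.
print(urljoin("http://example.com/folder/", "subfolder"))
# → http://example.com/folder/subfolder
```

Since the index page answered every unknown URL, each such wrong page contained the same relative links, which resolved one level deeper on every crawl pass.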
The bug itself is fixed now, but I have thousands of 404 errors listed in Google Webmaster Tools, some first crawled 1 or 2 years ago, and more keep appearing.
Unfortunately, the links do not follow a common pattern that I could disallow for crawling in robots.txt.
Is there anything I can do to stop Google from retrying those very old links, and to remove the already-listed 404s from Webmaster Tools?