How should I handle pages that move to a new url, with regard to search engines?
- by Anders Juul
Hi all,
I have done some refactoring on an ASP.NET MVC application already deployed to a live web site. Part of the refactoring moved functionality to a new controller, causing some urls to change. Shortly after, the various search engine robots started hammering the old urls.
What is the right way to handle this in general?
Ignore it? In time the SEs should work out that they get nothing but 404 from the old urls.
Block the old urls with robots.txt?
Continue to catch the old urls and redirect to the new ones? Users navigating the site would never hit the redirect, as the urls have been updated throughout the new version of the site. It feels like garbage code - unless it could be handled by some fancy routing?
Other?
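To make the third option concrete, here is a minimal sketch of what catching an old url and issuing a permanent (301) redirect could look like. It assumes MVC 3 or later (where `RedirectToActionPermanent` exists), and the controller and action names (`LegacyRedirectController`, a report moved from `Home` to `Reports`) are made up for illustration:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

// Hypothetical controller that exists only to forward old urls.
public class LegacyRedirectController : Controller
{
    // Catches e.g. /Home/Report/5 and issues a 301 to the new action,
    // so search engines transfer the old url's ranking to the new one.
    public ActionResult Report(int id)
    {
        return RedirectToActionPermanent("Show", "Reports", new { id });
    }
}

// In RegisterRoutes (Global.asax.cs), map the old url pattern to it,
// before the default route so it takes precedence:
//
// routes.MapRoute(
//     "LegacyReport",
//     "Home/Report/{id}",
//     new { controller = "LegacyRedirect", action = "Report" });
```

The 301 status matters here: it tells the crawlers the move is permanent, so they update their index instead of retrying the old urls indefinitely, which a plain 302 would not achieve.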
As always, all comments welcome...
Thanks,
Anders, Denmark